A heuristic is a method for solving a problem

A heuristic technique, often called simply a heuristic, is any approach to problem solving, learning, or discovery that employs a practical method not guaranteed to be optimal or perfect, but sufficient for the immediate goals. Where finding an optimal solution is impossible or impractical, heuristic methods can be used to speed up the process of finding a satisfactory solution. Heuristics can be mental shortcuts that ease the cognitive load of making a decision. Examples of this method include using a rule of thumb, an educated guess, an intuitive judgment, a guesstimate, stereotyping, profiling, or common sense.

Heuristics are strategies derived from previous experiences with similar problems. These strategies rely on using readily accessible, though loosely applicable, information to control problem solving in human beings, machines, and abstract issues. The most fundamental heuristic is trial and error, which can be used in everything from matching nuts and bolts to finding the values of variables in algebra problems.
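As a minimal sketch (not from the original text), the trial-and-error heuristic for finding the value of a variable can be written in a few lines of Python; the function name, candidate range, and example equation are all illustrative assumptions:

```python
# A minimal sketch of the trial-and-error heuristic: try candidate
# values of x until one satisfies the equation 3x + 4 = 19.
def solve_by_trial_and_error(equation_holds, candidates):
    for x in candidates:
        if equation_holds(x):   # test the guess
            return x            # a satisfactory solution was found
    return None                 # no candidate worked

# Find x such that 3x + 4 = 19 by trying the integers 0..99.
solution = solve_by_trial_and_error(lambda x: 3 * x + 4 == 19, range(100))
print(solution)  # 5
```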

Here are a few other commonly used heuristics, from George Pólya's 1945 book, How to Solve It:

· If you are having difficulty understanding a problem, try drawing a picture.

· If you can't find a solution, try assuming that you have a solution and seeing what you can derive from that.

· If the problem is abstract, try examining a concrete example.

· Try solving a more general problem first.

In psychology, heuristics are simple, efficient rules, learned or hard-coded by evolutionary processes, that have been proposed to explain how people make decisions, come to judgments, and solve problems, typically when facing complex problems or incomplete information. Researchers test whether people use these rules with various methods. The rules work well under most circumstances, but in certain cases they lead to systematic errors or cognitive biases.

The study of heuristics in human decision-making was developed in the 1970s and 80s by the Israeli psychologists Amos Tversky and Daniel Kahneman, although the concept was originally introduced by Nobel laureate Herbert A. Simon. Simon's original, primary object of research was problem solving, and he showed that we operate within what he called bounded rationality. He coined the term "satisficing", which denotes the situation where people seek solutions or accept choices or judgments that are "good enough" for their purposes, but could be optimized.

Rudolf Groner analyzed the history of heuristics from its roots in ancient Greece up to contemporary work in cognitive psychology and artificial intelligence. Gerd Gigerenzer focused on the "fast and frugal" properties of heuristics, i.e., using heuristics in a way that is principally accurate and thus eliminating most cognitive bias. Heuristics – like the recognition heuristic or the take-the-best heuristic – are viewed as special tools that tackle specific tasks under conditions of uncertainty and are organized in an "adaptive toolbox". From one particular batch of research, Gigerenzer and Wolfgang Gaissmaier found that both individuals and organizations rely on heuristics in an adaptive way. They also found that ignoring part of the information, rather than weighing all the options, can actually lead to more accurate decisions.

Heuristics, through greater refinement and research, have begun to be applied to other theories, or to be explained by them. For example, the cognitive-experiential self-theory (CEST) is also an adaptive view of heuristic processing. CEST distinguishes two systems that process information. At some times, roughly speaking, individuals consider issues rationally, systematically, logically, deliberately, effortfully, and verbally. On other occasions, individuals consider issues intuitively, effortlessly, globally, and emotionally. From this perspective, heuristics are part of a larger experiential processing system that is often adaptive, but vulnerable to error in situations that require logical analysis.

In 2002, Daniel Kahneman and Shane Frederick proposed that cognitive heuristics work by a process called attribute substitution, which happens without conscious awareness. According to this theory, when somebody makes a judgment that is computationally complex, a more easily calculated "heuristic attribute" is substituted. In effect, a cognitively difficult problem is dealt with by answering a rather simpler problem, without awareness of this happening. This theory explains cases where judgments fail to show regression toward the mean. Heuristics can also reduce the complexity of clinical judgments in healthcare.

· Anchoring and adjustment – Describes the common human tendency to rely too heavily on the first piece of information offered when making decisions. For example, in a study done with children, the children were told to estimate the number of jellybeans in a jar. Groups of children were given either a high or low "base" number. Children estimated the number of jellybeans to be closer to the anchor number that they were given.

· Availability heuristic – A mental shortcut that occurs when people make judgments about the probability of events by the ease with which examples come to mind. For example, in a 1973 Tversky & Kahneman experiment, the majority of participants reported that there were more words in the English language that start with the letter K than words for which K is the third letter. There are actually twice as many words in the English language that have K as the third letter as those that start with K, but words that start with K are much easier to recall and bring to mind.

· Representativeness heuristic – A mental shortcut used when making judgments about the probability of an event under uncertainty; that is, judging a situation based on how similar the prospects are to the prototypes the person holds in his or her mind. For example, in a 1982 Tversky and Kahneman experiment, participants were given a description of a woman named Linda. Based on the description, it was likely that Linda was a feminist. Eighty to ninety percent of participants, choosing from two options, chose that it was more likely for Linda to be a feminist and a bank teller than a bank teller only. The likelihood of two events occurring together cannot be greater than that of either of the two events individually. For this reason, the representativeness heuristic is exemplary of the conjunction fallacy (a short formal statement of this rule follows the list below).

· Naïve diversification – When asked to make several choices at once, people tend to diversify more than when making the same type of decision sequentially.

· Escalation of commitment – Describes the phenomenon where people justify increased investment in a decision, based on the cumulative prior investment, despite new evidence suggesting that the cost, starting today, of continuing the decision outweighs the expected benefit. This is related to the sunk cost fallacy.

· Familiarity heuristic – A mental shortcut applied to various situations in which individuals assume that the circumstances underlying the past behavior still hold true for the present situation and that the past behavior thus can be correctly applied to the new situation. It is especially prevalent when the individual experiences a high cognitive load.
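The rule behind the conjunction fallacy mentioned in the representativeness item above is a standard probability fact, stated here for illustration: because the event "feminist and bank teller" is contained in the event "bank teller", its probability can never exceed that of either component event:

$$P(A \cap B) \le \min\bigl(P(A),\, P(B)\bigr)$$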

Heuristics have also been found to be used in the manipulation and creation of cognitive maps. Cognitive maps are internal representations of our physical environment, particularly associated with spatial relationships. These internal representations serve as a memory guide in our external environment. It was found that when people were questioned about map images, distances, and the like, they commonly distorted the images. These distortions took the shape of a regularization of images.

There are several ways that humans form and use cognitive maps, and visual intake is a key part of mapping. The first is by using landmarks, whereby a person uses a mental image to estimate a relationship, usually distance, between two objects. The second is route-road knowledge, which is generally developed after a person has performed a task and is relaying the information of that task to another person. The third is survey, whereby a person estimates a distance based on a mental image that, to them, might appear like an actual map. This image is generally created when a person's brain begins making image corrections. These are presented in five ways:

1. Right-angle bias: a person straightens out an image, like mapping an intersection, and begins to give everything 90-degree angles, when in reality it may not be that way.

2. Symmetry heuristic: people tend to think of shapes, or buildings, as being more symmetrical than they really are.

3. Rotation heuristic: a person takes a naturally distorted image and straightens it out for their mental image.

4. Alignment heuristic: similar to the previous, where people align objects mentally to make them straighter than they really are.

5. Relative-position heuristic: people do not accurately distance landmarks in their mental image based on how well they remember that particular item.

Another method of creating cognitive maps is by means of auditory intake based on verbal descriptions. Using a mapping based on one person's visual intake, another person can create a mental image, such as directions to a certain location.


"Heuristic device" is used when an entity X exists to enable under- standing of, or knowledge concerning, some other entity Y. A good ex- ample is a model that, as it is never identical with what it models, is a heuristic device to enable understanding of what it models. Stories, met- aphors, etc., can also be termed heuristic in that sense. A classic exam- ple is the notion of utopia as described in Plato's best-known work, The Republic. This means that the "ideal city" as depicted in The Republic is not given as something to be pursued, or to present an orientation-point for development; rather, it shows how things would have to be connect- ed, and how one thing would lead to another (often with highly prob- lematic results), if one would opt for certain principles and carry them through rigorously.

"Heuristic" is also often used as a noun to describe a rule-of-thumb, procedure, or method. Philosophers of science have emphasized the im- portance of heuristics in creative thought and constructing scientific theories. In legal theory, especially in the theory of law and economics, heuristics are used in the law when case-by-case analysiswould be im- practical, insofar as "practicality" is defined by the interests of a govern- ing body.

The present securities regulation regime largely assumes that all investors act as perfectly rational persons. In truth, actual investors face cognitive limitations from biases, heuristics, and framing effects.

For instance, in all states in the United States the legal drinking age for unsupervised persons is 21 years, because it is argued that people need to be mature enough to make decisions involving the risks of alcohol consumption. However, assuming people mature at different rates, the specific age of 21 would be too late for some and too early for others. In this case, the somewhat arbitrary deadline is used because it is impossible or impractical to tell whether an individual is sufficiently mature for society to trust them with that kind of responsibility. Some proposed changes, however, have included the completion of an alcohol education course rather than the attainment of 21 years of age as the criterion for legal alcohol possession. This would put youth alcohol policy more on a case-by-case basis and less on a heuristic one, since the completion of such a course would presumably be voluntary and not uniform across the population.

The same reasoning applies to patent law. Patents are justified on the grounds that inventors must be protected so they have an incentive to invent. It is therefore argued that it is in society's best interest that inventors receive a temporary government-granted monopoly on their idea, so that they can recoup investment costs and make economic profit for a limited period. In the United States, the length of this temporary monopoly is 20 years from the date the patent application was filed, though the monopoly does not actually begin until the application has matured into a patent. However, like the drinking-age problem above, the specific length of time would need to be different for every product to be efficient. A 20-year term is used because it is difficult to tell what the number should be for any individual patent. More recently, some, including University of North Dakota law professor Eric E. Johnson, have argued that patents in different kinds of industries – such as software patents – should be protected for different lengths of time.

Stereotyping is a type of heuristic that all people use to form opinions or make judgments about things they have never seen or experienced. They work as a mental shortcut to assess everything from the social status of a person based on their actions to assumptions that a plant that is tall, has a trunk, and has leaves is a tree, even though the person making the evaluation has never seen that particular type of tree before.

Stereotypes, as first described by journalist Walter Lippmann in his book Public Opinion, are the pictures we have in our heads that are built around experiences as well as what we are told about the world.

The concept of heuristics has not only followers in the academic world but also critics and controversies. Many scholars are reluctant to accept the idea of heuristics and biases; its proponents contend that this is only because those scholars do not want to research the subject. Heuristic may also refer to:

· Heuristic (computer science), a technique to produce a solution in a reasonable time frame that is good enough for solving the problem at hand

· Heuristic, an experience-based method reducing use of calculations

· Heuristic algorithm, a computer program for making a determination

· Heuristics in judgment and decision-making discovered by research in psychology and behavioral economics

· Heuristic argument, a non-rigorous argument that relies on an analogy or intuition

· Heuristic function, a ranking method used with search algorithms (a minimal sketch follows this list)
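The following sketch illustrates the last item: a heuristic function used to rank options in a search algorithm. It assumes a simple 4-connected grid world (the grid, function names, and sizes are all illustrative); the Manhattan distance never overestimates the true path cost, which makes it an admissible heuristic for A*:

```python
# A minimal A* search on a grid, using Manhattan distance as the
# heuristic function that ranks which cell to expand next.
import heapq

def manhattan(a, b):
    return abs(a[0] - b[0]) + abs(a[1] - b[1])

def a_star(start, goal, passable):
    """passable(cell) -> bool; returns shortest path length or None."""
    frontier = [(manhattan(start, goal), 0, start)]  # (f, g, cell)
    best_g = {start: 0}
    while frontier:
        _, g, cell = heapq.heappop(frontier)
        if cell == goal:
            return g
        x, y = cell
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if passable(nxt) and g + 1 < best_g.get(nxt, float("inf")):
                best_g[nxt] = g + 1
                # f = g + h: cost so far plus the heuristic estimate
                heapq.heappush(frontier, (g + 1 + manhattan(nxt, goal), g + 1, nxt))
    return None

# 5x5 open grid: the shortest path from (0, 0) to (4, 4) has length 8.
print(a_star((0, 0), (4, 4), lambda c: 0 <= c[0] < 5 and 0 <= c[1] < 5))
```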


TRIZ

TRIZ was developed by the Soviet inventor and science-fiction author Genrich Altshuller and his colleagues, beginning in 1946. In English the name is typically rendered as "the theory of inventive problem solving", and it occasionally goes by the English acronym TIPS.

Following Altshuller's insight, the theory developed on a foundation of extensive research covering hundreds of thousands of inventions across many different fields to produce a theory which defines generalisable patterns in the nature of inventive solutions and the distinguishing characteristics of the problems that these inventions have overcome.

An important part of the theory has been devoted to revealing patterns of evolution, and one of the objectives pursued by leading practitioners of TRIZ has been the development of an algorithmic approach to the invention of new systems and to the refinement of existing ones. TRIZ includes a practical methodology, tool sets, a knowledge base, and model-based technology for generating innovative solutions for problem solving. It is intended for application in problem formulation, system analysis, failure analysis, and patterns of system evolution. There is a general similarity of purposes and methods with the field of pattern language, a cross-discipline practice for explicitly describing and sharing holistic patterns of design.

The research has produced three primary findings:

1. problems and solutions are repeated across industries and sciences

2. patterns of technical evolution are also repeated across industries and sciences

3. the innovations used scientific effects outside the field in which they were developed

TRIZ practitioners apply all these findings in order to create and to improve products, services, and systems. TRIZ in its classical form was developed by the Soviet inventor and science-fiction writer Genrich Altshuller and his associates. He started developing TRIZ in 1946. His work on what later resulted in TRIZ was interrupted in 1950 by his arrest and sentencing to 25 years in the Vorkuta Gulag labor camps. The arrest was partially triggered by letters which he and Raphael Shapiro sent to Stalin, ministers and newspapers about certain decisions made by the Soviet government, which they believed were erroneous. Altshuller and Shapiro were freed during the Khrushchev Thaw following Stalin's death in 1953 and returned to Baku.


The first paper on TRIZ, titled "On the psychology of inventive creation", was published in 1956 in "Issues in Psychology". Altshuller had reviewed about 40,000 patent abstracts in order to find out in what way innovation had taken place, and he developed the concept of technical contradictions, the concept of the ideality of a system, the contradiction matrix, and the 40 principles of invention. In the years that followed he developed the concepts of physical contradictions, SuField analysis (structural substance-field analysis), standard solutions, several laws of technical systems evolution, and numerous other theoretical and practical approaches.

Altshuller also observed clever and creative people at work: he uncovered patterns in their thinking, and developed thinking tools and techniques to model this "talented thinking". These tools include Smart Little People and Thinking in Time and Scale (or the Screens of Talented Thought).

In 1971 Altshuller convinced The Inventors Society to establish in Baku the first TRIZ teaching facility, called the Azerbaijan Public Institute for Inventive Creation, and the first TRIZ research lab, called The Public Lab for Inventive Creation. Altshuller was appointed the head of the lab by the society. The lab incubated the TRIZ movement, and in the years that followed other TRIZ teaching institutes were established in all major cities of the USSR. From 1986 Altshuller switched his attention away from technical TRIZ and started investigating the development of individual creativity. He also developed a version of TRIZ for children, which was trialled in various schools. In 1989 the TRIZ Association was formed, with Altshuller chosen as President.

Following the end of the Cold War, the waves of emigrants from the former Soviet Union brought TRIZ to other countries and drew attention to it overseas. In 1995 the Altshuller Institute for TRIZ Studies was established in Boston, USA. TRIZ presents a systematic approach for understanding and defining challenging problems: difficult problems require an inventive solution, and TRIZ provides a range of strategies and tools for finding these inventive solutions. One of the earliest findings of the massive research on which the theory is based is that the vast majority of problems that require inventive solutions typically reflect a need to overcome a dilemma or a trade-off between two contradictory elements. The central purpose of TRIZ-based analysis is to systematically apply the strategies and tools to find superior solutions that overcome the need for a compromise or trade-off between the two elements.


By the early 1970s two decades of research covering hundreds of thousands of patents had confirmed Altshuller's initial insight about the patterns of inventive solutions, and one of the first analytical tools was published in the form of 40 inventive principles, which could account for virtually all of those patents that presented truly inventive solutions. The combination of all of these concepts together – the analysis of the contradiction, the pursuit of an ideal solution, and the search for one or more of the principles which will overcome the contradiction – forms the key elements of a process designed to help the inventor engage with purposefulness and focus.

One of the tools which evolved as an extension of the 40 principles was a contradiction matrix, in which the contradictory elements of a problem were categorized according to a list of 39 factors which could impact on each other. The combination of each pairing of these 39 elements is set out in a matrix: each of the 39 elements is represented down the rows (as the element to improve) and across the columns (as the negatively affected element). Based upon the research and analysis of patents, wherever precedent solutions have been found that resolve a conflict between two of the elements, the relevant cell in the matrix typically contains a subset of three or four principles that have been applied most frequently in inventive solutions which resolve contradictions between those two elements.
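A minimal sketch of how such a matrix lookup could be represented in code follows. The feature names come from the Dolgashev example discussed later in the text ("accuracy of measurement" vs. "complexity of control"); the cell contents are reduced to the one principle the text itself cites for that cell, Copying, which is number 26 in the standard list of 40 principles (the full matrix would hold 39 × 39 cells, each with several principle numbers):

```python
# A sketch of a contradiction-matrix lookup: cells are keyed by
# (improving feature, worsening feature) and hold the numbers of the
# inventive principles most often used to resolve that contradiction.
# Only one cell and one principle are shown here, for illustration.
PRINCIPLES = {26: "Copying"}  # the full table has 40 entries

MATRIX = {
    ("accuracy of measurement", "complexity of control"): [26],
}

def suggest_principles(improving, worsening):
    """Return the principle names recorded for one matrix cell, if any."""
    return [PRINCIPLES[n] for n in MATRIX.get((improving, worsening), [])]

print(suggest_principles("accuracy of measurement", "complexity of control"))
# ['Copying']
```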

The main objective of the contradiction matrix was to simplify the process of selecting the most appropriate principle to resolve a specific contradiction. It was the core of all modifications of ARIZ until 1973. But in 1973, after introducing the concept of physical contradictions and creating SuField analysis, Altshuller realized that the contradiction matrix was a comparatively inefficient tool and stopped working on it. Beginning with ARIZ-71c, the contradiction matrix ceased to be the core of ARIZ, and Altshuller no longer believed it should be pursued as a tool for solving inventive problems.

Physical contradictions and separation principles, as well as SuField analysis and other tools, became the core. Despite this, the 40 principles of invention have remained the most popular tool taught in introductory seminars, and they have consistently attracted the most attention amongst the tens of thousands of individuals who visit TRIZ-focused web sites in a typical month. Therefore, many of those who learn TRIZ or have attended seminars are taught, quite wrongly, that TRIZ primarily consists of the 40 principles and the contradiction matrix; in truth, ARIZ is the core methodology of TRIZ.

ARIZ is an algorithmic approach to finding inventive solutions by identifying and resolving contradictions. This includes the "system of inventive standard solutions", which Altshuller used to replace the 40 principles and contradiction matrix; it consists of SuField modeling and the 76 inventive standards. A number of TRIZ-based computer programs have been developed whose purpose is to provide assistance to engineers and inventors in finding inventive solutions for technological problems. Some of these programs are also designed to apply another TRIZ methodology whose purpose is to reveal and forecast emergency situations and to anticipate circumstances which could result in undesirable outcomes.

One of the important branches of TRIZ is focused on analysing and predicting trends of evolution in the characteristics that existing solutions are likely to develop in successive generations of a system.

· Ideal final result - the ultimate idealistic solution of a problem, when the desired result is achieved by itself. Note that the Ideal Final Result is also an ARIZ term for the formulation of the inventive problem in the form of a Technical Contradiction and a Physical Contradiction;

· Administrative contradiction - contradiction between the needs and abilities;

· Technical contradiction - an inverse dependence between parameters/characteristics of a machine or technology;

· Physical contradiction - opposite/contradictory physical requirements to an object;

· Separation principle - a method of resolving physical contradictions by separating contradictory requirements;

· Vepol or Su-field - a minimal technical system consisting of two material objects and a "field". "Field" is the source of energy whereas one of the substances is "transmission" and the other one is the "tool";

· Fepol or Ferfiel - a sort of Vepol where "substances" are ferromagnetic objects;

· Level of invention;

· Standard solution - a standard inventive solution of a higher level;

· Laws of technical systems evolution;


· Algorithm of inventive problem solving (ARIZ), which combines various specialized methods of TRIZ into one universal tool;

· Talented Thinking or Thinking in Time and Scale.

Altshuller has shown that at the heart of some inventive problems lie contradictions (one of the basic TRIZ concepts) between two or more elements, such as, "If we want more acceleration, we need a larger engine; but that will increase the cost of the car," that is, more of something desirable also brings more of something less desirable, or less of something else also desirable. These are called technical contradictions by Altshuller. He also defined so-called physical or inherent contradictions: more of one thing and less of the same thing may both be desired in the same system. For instance, a higher temperature may be needed to melt a compound more rapidly, but a lower temperature may be needed to achieve a homogeneous mixture.

An inventive situation which challenges us to be inventive might involve several such contradictions. Conventional solutions typically "trade" one contradictory parameter for another; no special inventiveness is needed for that. Rather, the inventor would develop a creative approach for resolving the contradiction, such as inventing an engine that produces more acceleration without increasing the cost of the engine.

Altshuller screened patents in order to find out what kind of contradictions were resolved or dissolved by the invention and the way this had been achieved. From this he developed a set of 40 inventive principles and later a matrix of contradictions. Rows of the matrix indicate the 39 system features that one typically wants to improve, such as speed, weight, accuracy of measurement and so on. Columns refer to typical undesired results. Each matrix cell points to principles that have been most frequently used in patents in order to resolve the contradiction.

For instance, Dolgashev mentions the following contradiction: increasing the accuracy of measurement of machined balls while avoiding the use of expensive microscopes and elaborate control equipment. The matrix cell in row "accuracy of measurement" and column "complexity of control" points to several principles, among them the Copying Principle, which states, "Use a simple and inexpensive optical copy with a suitable scale instead of an object that is complex, expensive, fragile or inconvenient to operate." From this general invention principle, the following idea might solve the problem: taking a high-resolution image of the machined ball, a screen with a grid might provide the required measurement. As mentioned above, Altshuller abandoned this method of defining and solving "technical" contradictions in the mid-1980s and instead used SuField modeling, the 76 inventive standards, and a number of other tools included in the algorithm for solving inventive problems, ARIZ.

Altshuller also studied the way technical systems have been developed and improved over time. From this, he discovered several trends that help engineers predict the most likely improvements that can be made to a given product. The most important of these laws involves the ideality of a system.

One more technique that is frequently used by inventors involves the analysis of substances, fields and other resources that are currently not being used and that can be found within the system or nearby. TRIZ uses non-standard definitions for substances and fields. Altshuller developed methods to analyze resources; several of his invention principles involve the use of different substances and fields that help resolve contradictions and increase the ideality of a technical system. For instance, videotext systems used television signals to transfer data, by taking advantage of the small time segments between TV frames in the signals.

SuField analysis produces a structural model of the initial technological system, exposes its characteristics, and, with the help of special laws, transforms the model of the problem. Through this transformation the structure of the solution that eliminates the shortcomings of the initial problem is revealed. SuField analysis is a special language of formulas with which it is possible to easily describe any technological system in terms of a specific model. A model produced in this manner is transformed according to special laws and regularities, thereby revealing the structural solution of the problem.

Various TRIZ software packages are based on this algorithm.

Some new interactive applications, starting with an updated matrix of contradictions and adding semantic analysis, subcategories of inventive principles, and lists of scientific effects, are further attempts to simplify the problem formulation phase and the transition from a generic problem to a whole set of specific solutions.

Although TRIZ was developed from the analysis of technical systems, it has been used widely as a method for understanding and solving complex management problems. Examples include finding additional cost savings for the legal department of a local government body: the inventive solution was to generate additional revenue instead. The results of the TRIZ work are expected to generate £1.7m in profit in the first 5 years.

Case studies on the use of TRIZ are difficult to acquire, as many companies believe TRIZ gives them a competitive advantage and are reluctant to publicise their adoption of the method. However, some examples are available: Samsung is the most famous success story, and has invested heavily in embedding TRIZ use throughout the company, right up to and including the CEO. Rolls-Royce, BAE Systems and GE are all documented users of TRIZ; Mars has documented how applying TRIZ led to a new patent for chocolate packaging. TRIZ has also been used successfully by Leafield Engineering, Smart Stabilizer Systems and Buro Happold to solve problems and generate new patents.

Various promoters of TRIZ reported that the car companies Rolls-Royce, Ford, and Daimler-Chrysler, Johnson & Johnson, the aeronautics companies Boeing and NASA, and the technology companies Hewlett Packard, Motorola, General Electric, Xerox, IBM, LG, Samsung, Intel, Procter and Gamble, Expedia and Kodak have used TRIZ methods in some projects.

The European TRIZ Association (ETRIA) is an association based in Germany, founded in 2000. ETRIA considers itself an open community to unite efforts, suggest opportunities for global standardization, conduct further research and development, and provide mechanisms for the exchange of information and knowledge on TRIZ and TRIZ-based innovation technologies. ETRIA is developing a web-based collaborative environment targeted at the creation of links between any and all institutions concerned with conceptual questions pertaining to the creation, organization, and efficient processing of innovation knowledge and innovation technologies.

TRIZ is considered a cross-disciplinary, generic methodology, but it has not previously been presented in terms of logic or any other formal knowledge representation. Most of the concepts introduced in TRIZ are fuzzy, and most of the techniques are still heuristic and only partially formalized. For further development and conceptual re-organization of the TRIZ knowledge base, ETRIA involves and collaborates with TRIZ experts and professionals from the domains of logic, organization science, informatics and linguistics. The Association holds conferences with associated publications.

ETRIA has the following goals:


· Research and development of innovation knowledge by integrating conceptual approaches to classification developed by the artificial intelligence and knowledge management communities;

· International observation, analysis, evaluation and reporting of progress in these directions;

· Promotion and exchange of information and experience between scientists and practitioners in TRIZ, universities and other educational organizations;

· Development of TRIZ through contributions from dedicated experts and specialists in particular areas of expertise.

Several related methodologies have grown out of TRIZ:

1. SIT (systematic inventive thinking)

2. USIT (unified structured inventive thinking)

3. Trizics (a methodology for the systematic application of TRIZ)

Tribology

Tribology is the science and engineering of interacting surfaces in relative motion. It includes the study and application of the principles of friction, lubrication and wear. Tribology is a branch of mechanical engineering and materials science.

The word tribology derives from the Greek root τριβ- of the verb τρίβω, tribo ("I rub" in classic Greek), and the suffix -logy from -λογία, -logia ("study of", "knowledge of"). It was coined by the British physicist David Tabor and by Peter Jost in 1964, a lubrication expert who noticed the problems with increasing friction on machines and started the new discipline of tribology. The tribological interactions of a solid surface's exposed face with interfacing materials and environment may result in loss of material from the surface. The process leading to loss of material is known as wear. Major types of wear include abrasion, friction, erosion, and corrosion. Wear can be minimized by modifying the surface properties of solids by one or more "surface engineering" processes or by the use of lubricants.

Estimated direct and consequential annual loss to industries in the USA due to wear is approximately 1-2% of GDP. Engineered surfaces extend the working life of both original and recycled and resurfaced equipment, thus saving large sums of money and leading to conservation of material, energy and the environment. Methodologies to minimize wear include systematic approaches to diagnose the wear and to prescribe appropriate solutions. Important methods include:


· Point-like contact theory, established by Heinrich Hertz in the 1880s.

· Fluid lubrication dynamics, established by Arnold Johannes Sommerfeld in the 1900s.

· Terotechnology, where multidisciplinary engineering and management techniques are used to protect equipment and machinery from degradation

· Horst Czichos's systems approach, where the appropriate material is selected by checking properties against tribological requirements under the operating environment

· Asset Management by Material Prognosis - a concept similar to terotechnology which has been introduced by the US Military for upkeep of equipment in good health and start-ready condition for 24 hours. Good health monitoring systems combined with appropriate remedies at maintenance and repair stages have led to improved performance, reliability and extended life cycle of the assets, such as advanced military hardware and civil aircraft.

In recent years, micro- and nanotribology have been gaining ground. Frictional interactions in microscopically small components are becoming increasingly important for the development of new products in electronics, life sciences, chemistry, sensors and, by extension, for all modern technology. Lubrication regimes are commonly distinguished on the basis of the "Stribeck curve": these curves clearly show the minimum value of friction as the demarcation between full fluid-film lubrication and some solid asperity interactions.

Stribeck and others systematically studied the variation of friction between two liquid-lubricated surfaces as a function of a dimensionless lubrication parameter ηN/P, where η is the dynamic viscosity, N the sliding speed and P the load projected onto the geometrical surface.
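A short sketch of how the parameter ηN/P might be computed and used to classify the lubrication regime. The unit choices (N as a rotational speed in 1/s, P as a projected contact pressure in Pa, making the ratio dimensionless) and the regime thresholds below are illustrative assumptions, not values from the text:

```python
# Compute the dimensionless lubrication (Stribeck/Hersey) parameter
# eta * N / P and classify the regime it falls into. The threshold
# values here are placeholders; real boundaries depend on the bearing.
def lubrication_parameter(eta_pa_s, n_per_s, p_pa):
    return eta_pa_s * n_per_s / p_pa

def regime(param, boundary_mixed=1e-8, boundary_full_film=1e-6):
    # Low values: boundary lubrication (solid asperity contact dominates).
    # The friction minimum lies near the transition to a full fluid film.
    if param < boundary_mixed:
        return "boundary"
    if param < boundary_full_film:
        return "mixed"
    return "full fluid film (hydrodynamic)"

p = lubrication_parameter(eta_pa_s=0.05, n_per_s=20.0, p_pa=5e6)
print(p, regime(p))  # approx. 2e-07 -> "mixed"
```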

Duncan Dowson surveyed the history of tribology in his book History of Tribology. This comprehensive book covers developments from prehistory, through early civilizations, and finally the key developments up to the end of the twentieth century.

Historically, Leonardo da Vinci was the first to enunciate two laws of friction. Guillaume Amontons rediscovered the classic rules, but unlike da Vinci, made his findings public at the Academie Royale des Sciences for verification. They were further developed by Charles-Augustin de Coulomb. Charles Hatchett carried out the first reliable test on frictional wear, using a simple reciprocating machine to evaluate wear on gold coins. He found that compared to self-mated coins, coins with grits between them wore at a faster rate. Michael J. Neale was a leader in the field of tribology in the mid-to-late 1900s: for nearly 40 years he specialised in solving problems in machinery design by applying his knowledge of tribology. Neale was respected as an educator with a gift for integrating theoretical work with his own practical experience to produce easy-to-understand design guides. The Tribology Handbook, which he first edited in 1973 and updated in 1995, is used around the world and forms the basis of numerous training courses for engineering designers.

The "Stribeck curve" or "Stribeck – Hersey curve" (named after Richard Stribeck, who heavily documented and established examples of it, and Mayo D. Hersey), which is used to categorize the friction proper- ties between two surfaces, was developed in the first half of the 20th century. The research of Professor Richard Stribeck was performed in Berlin at the Royal Prussian Technical Testing Institute. Similar work was previously performed around 1885 by Prof. Adolf Martens at the same Institute and in the mid-1870s by Dr. Robert H. Thurston at the Stevens Institute of Technology in the U.S. Prof. Dr. Thurston was therefore close to establishing the ―Stribeck curve‖, but he presented no

―Stribeck‖-like graphs, as he evidently did not fully believe in the rele- vance of this dependency. Since that time the ―Stribeck-curve‖ has been a classic teaching element in tribology classes.

The graphs of friction force reported by Stribeck stem from a carefully conducted, wide-ranging series of experiments on journal bearings. Stribeck systematically studied the variation of friction between two liquid-lubricated surfaces. His results were presented on 5 December 1901 during a public session of the railway society and published on 6 September 1902. They clearly showed the minimum value of friction as the demarcation between full fluid-film lubrication and some solid asperity interactions. Stribeck studied different bearing materials and aspect ratios D/L from 1:1 to 1:2. The maximum sliding speed was 4 m/s and the geometrical contact pressure was limited to 5 MPa. These operating conditions were related to railway wagon journal bearings.

The reason why the form of the friction curve for liquid-lubricated surfaces was later attributed to Stribeck, although both Thurston and Martens achieved their results considerably earlier, may be because Stribeck published in the most important technical journal in Germany at that time, Zeitschrift des Vereins Deutscher Ingenieure. Martens published his results "only" in the official journal of the Royal Prussian Technical Testing Institute, which has now become BAM. The VDI journal, as one of the most important journals for engineers, provided wide access to these data, and later colleagues rationalized the results into the three classical friction regimes. Thurston, however, did not have the experimental means to record a continuous graph of the coefficient of friction but only measured the friction at discrete points; this may be the reason why the minimum in the coefficient of friction was not discovered by him. Instead, Thurston's data did not indicate such a pronounced minimum of friction for a liquid-lubricated journal bearing as was demonstrated by the graphs of Martens and Stribeck.

The term tribology became widely used following The Jost Report in 1966. The report said that friction, wear and corrosion were costing the UK huge sums of money every year. As a result, the UK set up several national centres for tribology. Since then the term has diffused into the international engineering field, with many specialists now identifying as tribologists. There are now numerous national and international societies, such as the Society for Tribologists and Lubrication Engineers in the USA, the Institution of Mechanical Engineers Tribology Group in the UK, or the German Society for Tribology and MYTRIBOS.

Most technical universities have researchers working on tribology, often as part of mechanical engineering departments. The limitations in tribological interactions are, however, no longer mainly determined by mechanical designs, but by material limitations. So the discipline of tribology now counts at least as many materials engineers, physicists and chemists as it does mechanical engineers.

Since the 1990s, new areas of tribology have emerged, including nanotribology, biotribology, and green tribology. These interdisciplinary areas study, respectively, friction, wear and lubrication at the nanoscale, in biomedical applications, and in the ecological aspects of friction, lubrication and wear.

Recently, intensive studies of superlubricity have been sparked by high demand for energy savings. The development of new materials, such as graphene, has initiated fundamentally new approaches in the lubrication field. Moreover, industrial processes such as heat treatment also change the wear rate.

The study of tribology is commonly applied in bearing design but extends into almost all other aspects of modern technology, even to such unlikely areas as hair conditioners and cosmetics such as lipstick, powders and lipgloss. Any product where one material slides or rubs over another is affected by complex tribological interactions, whether lubricated, like hip implants and other artificial prostheses, or unlubricated, as in high-temperature sliding wear, in which conventional lubricants cannot be used but in which the formation of compacted oxide layer glazes has been observed to protect against wear.

Tribology plays an important role in manufacturing. In metal-forming operations, friction increases tool wear and the power required to work a piece. This results in increased costs due to more frequent tool replacement, loss of tolerance as tool dimensions shift, and greater forces required to shape a piece. The use of lubricants which minimize direct surface contact reduces tool wear and power requirements.

Biotechnology

Biotechnology is the use of living systems and organisms to develop or make products, or any technological application that uses biological systems, living organisms, or derivatives thereof, to make or modify products or processes for specific use. Depending on the tools and applications, it often overlaps with the fields of bioengineering, biomedical engineering, biomanufacturing, molecular engineering, etc. For thousands of years, humankind has used biotechnology in agriculture, food production, and medicine. The term is largely believed to have been coined in 1919 by the Hungarian engineer Károly Ereky. In the late 20th and early 21st centuries, biotechnology has expanded to include new and diverse sciences such as genomics, recombinant gene techniques, applied immunology, and the development of pharmaceutical therapies and diagnostic tests.

The wide concept of "biotech" or "biotechnology" encompasses a wide range of procedures for modifying living organisms according to human purposes, going back to the domestication of animals, the cultivation of plants, and "improvements" to these through breeding programs that employ artificial selection and hybridization. Modern usage also includes genetic engineering as well as cell and tissue culture technologies. The American Chemical Society defines biotechnology as the application of biological organisms, systems, or processes by various industries to learning about the science of life and the improvement of the value of materials and organisms such as pharmaceuticals, crops, and livestock. As per the European Federation of Biotechnology, biotechnology is the integration of natural science and organisms, cells, parts thereof, and molecular analogues for products and services. Biotechnology also draws on the pure biological sciences. In many instances, it is also dependent on knowledge and methods from outside the sphere of biology, including:

· bioinformatics, a new branch of computer science

· bioprocess engineering

· biorobotics

· chemical engineering

Conversely, modern biological sciences are intimately entwined with and heavily dependent on the methods developed through biotechnology and what is commonly thought of as the life sciences industry. Biotechnology is research and development in the laboratory, using bioinformatics, for exploration, extraction, exploitation and production from any living organisms and any source of biomass, by means of biochemical engineering, where high value-added products can be planned, forecast, formulated, developed, manufactured, and marketed for the purpose of sustainable operations and gaining durable patent rights.

By contrast, bioengineering is generally thought of as a related field that more heavily emphasizes higher systems approaches for interfacing with and utilizing living things. Bioengineering is the application of the principles of engineering and natural sciences to tissues, cells and molecules. This can be considered as the use of knowledge from working with and manipulating biology to achieve a result that can improve functions in plants and animals. Relatedly, biomedical engineering is an overlapping field that often draws upon and applies biotechnology, especially in certain sub-fields of biomedical and/or chemical engineering such as tissue engineering, biopharmaceutical engineering, and genetic engineering.

Although not normally what first comes to mind, many forms of human-derived agriculture clearly fit the broad definition of "utilizing a biotechnological system to make products". Indeed, the cultivation of plants may be viewed as the earliest biotechnological enterprise.

Agriculture has been theorized to have become the dominant way of producing food since the Neolithic Revolution. Through early biotechnology, the earliest farmers selected and bred the best-suited crops, having the highest yields, to produce enough food to support a growing population. As crops and fields became increasingly large and difficult to maintain, it was discovered that specific organisms and their by-products could effectively fertilize, restore nitrogen, and control pests.


Throughout the history of agriculture, farmers have inadvertently altered the genetics of their crops through introducing them to new environments and breeding them with other plants – one of the first forms of biotechnology.

These processes were also included in the early fermentation of beer. Fermentation was introduced in early Mesopotamia, Egypt, China and India, and it still uses the same basic biological methods. In brewing, malted grains convert starch from grains into sugar, and specific yeasts are then added to produce beer. In this process, carbohydrates in the grains are broken down into alcohols such as ethanol. Later, other cultures developed the process of lactic acid fermentation, which allowed the fermentation and preservation of other forms of food, such as soy sauce. Fermentation was also used in this time period to produce leavened bread. Although the process of fermentation was not fully understood until Louis Pasteur's work in 1857, it is still the first use of biotechnology to convert a food source into another form.

Before Charles Darwin's work and life, animal and plant scientists had already used selective breeding. Darwin added to that body of work with his scientific observations about the ability of science to change species. These accounts contributed to Darwin's theory of natural selection.

For thousands of years, humans have used selective breeding to improve production of crops and livestock to use them for food. In selective breeding, organisms with desirable characteristics are mated to produce offspring with the same characteristics. For example, this technique was used with corn to produce the largest and sweetest crops.

In the early twentieth century scientists gained a greater understanding of microbiology and explored ways of manufacturing specific products. In 1917, Chaim Weizmann first used a pure microbiological culture in an industrial process, that of manufacturing corn starch using Clostridium acetobutylicum, to produce acetone, which the United Kingdom desperately needed to manufacture explosives during World War I. Biotechnology has also led to the development of antibiotics. In 1928, Alexander Fleming discovered the mold Penicillium. His work led to the purification of the antibiotic compound formed by the mold by Howard Florey, Ernst Boris Chain and Norman Heatley – to form what we today know as penicillin. In 1940, penicillin became available for medicinal use to treat bacterial infections in humans.


The field of modern biotechnology is generally thought of as having been born in 1971 when Paul Berg's experiments in gene splicing had early success. Herbert W. Boyer and Stanley N. Cohen significantly advanced the new technology in 1972 by transferring genetic material into a bacterium, such that the imported material would be reproduced. The commercial viability of a biotechnology industry was significantly expanded on June 16, 1980, when the United States Supreme Court ruled that a genetically modified microorganism could be patented in the case of Diamond v. Chakrabarty. Indian-born Ananda Chakrabarty, working for General Electric, had modified a bacterium capable of breaking down crude oil, which he proposed to use in treating oil spills.

Chakrabarty's work did not involve gene manipulation but rather the transfer of entire organelles between strains of the Pseudomonas bacterium. Revenue in the industry was expected to grow by 12.9% in 2008. Another factor influencing the biotechnology sector's success is improved intellectual property rights legislation – and enforcement – worldwide, as well as strengthened demand for medical and pharmaceutical products to cope with an ageing, and ailing, U.S. population.

Rising demand for biofuels is expected to be good news for the biotechnology sector, with the Department of Energy estimating that ethanol usage could reduce U.S. petroleum-derived fuel consumption by up to 30% by 2030. The biotechnology sector has allowed the U.S. farming industry to rapidly increase its supply of corn and soybeans – the main inputs into biofuels – by developing genetically modified seeds which are resistant to pests and drought. By boosting farm productivity, biotechnology plays a crucial role in ensuring that biofuel production targets are met.

Biotechnology has applications in four major industrial areas: health care, crop production and agriculture, non-food uses of crops and other products, and environmental uses.

For example, one application of biotechnology is the directed use of organisms for the manufacture of organic products. Another example is the use of naturally present bacteria by the mining industry in bioleaching. Biotechnology is also used to recycle, treat waste, and clean up sites contaminated by industrial activities, and also to produce biological weapons.

A series of derived terms have been coined to identify several branches of biotechnology; for example:


· Bioinformatics is an interdisciplinary field which addresses biological problems using computational techniques, and makes the rapid organization as well as analysis of biological data possible (a toy illustration follows this list). The field may also be referred to as computational biology, and can be defined as conceptualizing biology in terms of molecules and then applying informatics techniques to understand and organize the information associated with these molecules, on a large scale. Bioinformatics plays a key role in various areas, such as functional genomics, structural genomics, and proteomics, and forms a key component in the biotechnology and pharmaceutical sector.

· Blue biotechnology is a term that has been used to describe the marine and aquatic applications of biotechnology, but its use is relatively rare.

· Green biotechnology is biotechnology applied to agricultural processes. An example would be the selection and domestication of plants via micropropagation. Another example is the designing of transgenic plants to grow under specific environments in the presence of chemicals. One hope is that green biotechnology might produce more environmentally friendly solutions than traditional industrial agriculture. An example of this is the engineering of a plant to express a pesticide, thereby ending the need for external application of pesticides; Bt corn is such a case. Whether or not green biotechnology products such as this are ultimately more environmentally friendly is a topic of considerable debate.

· Red biotechnology is applied to medical processes. Some examples are the designing of organisms to produce antibiotics, and the engineering of genetic cures through genetic manipulation.

· White biotechnology, also known as industrial biotechnology, is biotechnology applied to industrial processes. An example is the designing of an organism to produce a useful chemical. Another example is the use of enzymes as industrial catalysts to either produce valuable chemicals or destroy hazardous/polluting chemicals. White biotechnology tends to consume fewer resources than traditional processes used to produce industrial goods.
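As a toy illustration of the bioinformatics item above (not an example from the text), computing the GC content of a DNA sequence is one of the simplest computational analyses of biological data; the function name and sample sequence are illustrative:

```python
# A toy bioinformatics computation: the GC content of a DNA sequence,
# i.e. the fraction of bases that are guanine (G) or cytosine (C).
def gc_content(sequence: str) -> float:
    sequence = sequence.upper()
    if not sequence:
        return 0.0
    return sum(base in "GC" for base in sequence) / len(sequence)

print(gc_content("ATGCGCGA"))  # 0.625
```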

The investment and economic output of all of these types of applied biotechnologies is termed the bioeconomy. In medicine, modern biotechnology finds applications in areas such as pharmaceutical drug discovery and production, pharmacogenomics, and genetic testing.


Pharmacogenomics is the technology that analyses how genetic makeup affects an individual's response to drugs. It deals with the influence of genetic variation on drug response in patients by correlating gene expression or single-nucleotide polymorphisms with a drug's efficacy or toxicity. By doing so, pharmacogenomics aims to develop rational means to optimize drug therapy, with respect to the patients' genotype, to ensure maximum efficacy with minimal adverse effects. Such approaches promise the advent of personalized medicine, in which drugs and drug combinations are optimized for each individual's unique genetic makeup.

Biotechnology has contributed to the discovery and manufacturing of traditional small-molecule pharmaceutical drugs as well as drugs that are the product of biotechnology – biopharmaceutics. Modern biotechnology can be used to manufacture existing medicines relatively easily and cheaply. The first genetically engineered products were medicines designed to treat human diseases. To cite one example, in 1978 Genentech developed synthetic humanized insulin by joining its gene with a plasmid vector inserted into the bacterium Escherichia coli. Insulin, widely used for the treatment of diabetes, was previously extracted from the pancreas of abattoir animals. The resulting genetically engineered bacterium enabled the production of vast quantities of synthetic human insulin at relatively low cost. Biotechnology has also enabled emerging therapeutics like gene therapy. The application of biotechnology to basic science has also dramatically improved our understanding of biology, and as our scientific knowledge of normal and disease biology has increased, our ability to develop new medicines to treat previously untreatable diseases has increased as well.

Genetic testing allows the genetic diagnosis of vulnerabilities to inherited diseases, and can also be used to determine a child's parentage or, in general, a person's ancestry. In addition to studying chromosomes to the level of individual genes, genetic testing in a broader sense includes biochemical tests for the possible presence of genetic diseases, or mutant forms of genes associated with increased risk of developing genetic disorders. Genetic testing identifies changes in chromosomes, genes, or proteins. Most of the time, testing is used to find changes that are associated with inherited disorders. The results of a genetic test can confirm or rule out a suspected genetic condition or help determine a person's chance of developing or passing on a genetic disorder. As of 2011, several hundred genetic tests were in use. Since genetic testing may open up ethical or psychological problems, it is often accompanied by genetic counseling.

Agriculture. Genetically modified crops are plants used in agriculture, the DNA of which has been modified with genetic engineering techniques. In most cases the aim is to introduce a new trait to the plant which does not occur naturally in the species.

Examples in food crops include resistance to certain pests, diseases, stressful environmental conditions, resistance to chemical treatments, reduction of spoilage, or improving the nutrient profile of the crop. Examples in non-food crops include production of pharmaceutical agents, biofuels, and other industrially useful goods, as well as bioremediation. Farmers have widely adopted GM technology. Between 1996 and 2011, the total surface area of land cultivated with GM crops increased by a factor of 94, from 17,000 square kilometers to 1,600,000 km2. 10% of the world's crop lands were planted with GM crops in 2010. As of 2011, 11 different transgenic crops were grown commercially on 395 million acres in 29 countries, such as the USA, Brazil, Argentina, India, Canada, China, Paraguay, Pakistan, South Africa, Uruguay, Bolivia, Australia, the Philippines, Myanmar, Burkina Faso, Mexico and Spain.

Genetically modified foods are foods produced from organisms that have had specific changes introduced into their DNA with the methods of genetic engineering. These techniques have allowed for the introduction of new crop traits as well as a far greater control over a food's genetic structure than previously afforded by methods such as selective breeding and mutation breeding. Commercial sale of genetically modified foods began in 1994, when Calgene first marketed its Flavr Savr delayed-ripening tomato. To date, most genetic modification of foods has primarily focused on cash crops in high demand by farmers, such as soybean, corn, canola, and cotton seed oil. These have been engineered for resistance to pathogens and herbicides and for better nutrient profiles. GM livestock have also been experimentally developed, although as of November 2013 none were on the market.

There is a scientific consensus that currently available food derived from GM crops poses no greater risk to human health than conventional food, but that each GM food needs to be tested on a case-by-case basis before introduction. Nonetheless, members of the public are much less likely than scientists to perceive GM foods as safe. The legal and regulatory status of GM foods varies by country, with some nations banning or restricting them, and others permitting them with widely differing degrees of regulation.

GM crops also provide a number of ecological benefits, if not used in excess. However, opponents have objected to GM crops per se on several grounds, including environmental concerns, whether food produced from GM crops is safe, whether GM crops are needed to address the world's food needs, and economic concerns raised by the fact that these organisms are subject to intellectual property law.

Industrial biotechnology is the application of biotechnology for industrial purposes, including industrial fermentation. It includes the practice of using cells such as micro-organisms, or components of cells like enzymes, to generate industrially useful products in sectors such as chemicals, food and feed, detergents, paper and pulp, textiles and biofuels. In doing so, biotechnology uses renewable raw materials and may contribute to lowering greenhouse gas emissions and moving away from a petrochemical-based economy.

The environment can be affected by biotechnologies, both positively and adversely. Vallero and others have argued that the difference between beneficial biotechnology and the adverse effects stemming from biotechnological enterprises can be seen as applications and implications, respectively. Cleaning up environmental wastes is an example of an application of environmental biotechnology, whereas loss of biodiversity or loss of containment of a harmful microbe are examples of environmental implications of biotechnology.

The regulation of genetic engineering concerns the approaches taken by governments to assess and manage the risks associated with the use of genetic engineering technology and the development and release of genetically modified organisms, including genetically modified crops and genetically modified fish. There are differences in the regulation of GMOs between countries, with some of the most marked differences occurring between the USA and Europe. Regulation varies in a given country depending on the intended use of the products of the genetic engineering. For example, a crop not intended for food use is generally not reviewed by authorities responsible for food safety. The European Union differentiates between approval for cultivation within the EU and approval for import and processing. While only a few GMOs have been approved for cultivation in the EU, a number of GMOs have been approved for import and processing. The cultivation of GMOs has triggered a debate about the coexistence of GM and non-GM crops. Depending on the coexistence regulations, incentives for cultivation of GM crops differ.

In 1988, after prompting from the United States Congress, the National Institute of General Medical Sciences instituted a funding mechanism for biotechnology training. Universities nationwide compete for these funds to establish Biotechnology Training Programs. Each successful application is generally funded for five years and then must be competitively renewed. Graduate students in turn compete for acceptance into a BTP; if accepted, stipend, tuition and health insurance support is provided for two or three years during the course of their Ph.D. thesis work. Nineteen institutions offer NIGMS-supported BTPs. Biotechnology training is also offered at the undergraduate level and in community colleges.

 








