Stochastic (from the Greek στόχος, stóchos, meaning "aim" or "guess") means random. A stochastic process is one whose behavior is non-deterministic, in that a system's subsequent state is determined both by the process's predictable actions and by a random element. However, according to M. Kac and E. Nelson, any kind of time development (be it deterministic or essentially probabilistic) which is analyzable in terms of probability deserves the name of stochastic process.
The use of the term stochastic to mean "based on the theory of probability" has been traced back to Ladislaus Bortkiewicz, who had in mind the sense of "making conjectures" that the Greek term has borne since the ancient philosophers, echoed in the title Ars Conjectandi ("The Art of Conjecturing") that Bernoulli gave to his work on probability theory.
In artificial intelligence, stochastic programs work by using probabilistic methods to solve problems, as in simulated annealing, stochastic neural networks, stochastic optimization, and genetic algorithms. A problem itself may be stochastic as well, as in planning under uncertainty. A deterministic environment is much simpler for an agent to deal with.
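Simulated annealing, mentioned above, is a representative stochastic method: it escapes local optima by occasionally accepting worse solutions with a probability that shrinks as a "temperature" parameter cools. The following is a minimal sketch; the objective function, Gaussian proposal step, and geometric cooling schedule are illustrative choices, not prescribed by any particular source.

```python
import math
import random

def simulated_annealing(f, x0, n_steps=10000, t0=1.0, cooling=0.999):
    """Minimize f by a random walk that sometimes accepts worse moves."""
    random.seed(0)                                # fixed seed for reproducibility
    x, fx = x0, f(x0)
    best_x, best_fx = x, fx
    t = t0
    for _ in range(n_steps):
        candidate = x + random.gauss(0, 1)        # random neighbor proposal
        fc = f(candidate)
        # Always accept improvements; accept worse moves with prob exp(-delta/t),
        # which approaches zero as the temperature cools.
        if fc < fx or random.random() < math.exp(-(fc - fx) / t):
            x, fx = candidate, fc
            if fx < best_fx:
                best_x, best_fx = x, fx
        t *= cooling                              # geometric cooling schedule
    return best_x, best_fx

# Minimize (x - 3)^2 starting far from the optimum; the stochastic
# search should settle near x = 3.
x, fx = simulated_annealing(lambda v: (v - 3) ** 2, x0=-10.0)
```

Because acceptance of uphill moves decays with temperature, the search behaves like random exploration early on and like greedy descent late, which is what lets it handle rugged objective landscapes.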
An example of a stochastic process in the natural world is pressure in a gas as modeled by the Wiener process. Even though (classically speaking) each molecule is moving in a deterministic path, the motion of a collection of them is computationally and practically unpredictable. A large enough set of molecules will exhibit stochastic characteristics, such as filling the container, exerting equal pressure, diffusing along concentration gradients, etc. These are emergent properties of the system.
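The Wiener process can be approximated numerically as a cumulative sum of small independent Gaussian increments. The sketch below (step count, step size, and seeds are arbitrary choices for illustration) shows the point made above: any single path is unpredictable, yet an ensemble of paths exhibits lawful statistics, with the variance of W(T) approaching T.

```python
import random

def wiener_path(n_steps, dt, seed):
    """Approximate a Wiener process as a cumulative sum of independent
    Gaussian increments, each with mean 0 and variance dt."""
    rng = random.Random(seed)
    w = 0.0
    path = [w]
    for _ in range(n_steps):
        w += rng.gauss(0.0, dt ** 0.5)   # increment std dev is sqrt(dt)
        path.append(w)
    return path

# One path is unpredictable, but across many independent paths the
# endpoint W(T) has mean ~0 and variance ~T (here T = 100 * 0.01 = 1).
endpoints = [wiener_path(n_steps=100, dt=0.01, seed=s)[-1] for s in range(2000)]
mean = sum(endpoints) / len(endpoints)
var = sum((w - mean) ** 2 for w in endpoints) / len(endpoints)
```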
The name "Monte Carlo" for the stochastic Monte Carlo method was popularized by physics researchers Stanislaw Ulam, Enrico Fermi, John von Neumann, and Nicholas Metropolis, among others. The name is a reference to the Monte Carlo Casino in Monaco, where Ulam's uncle would borrow money to gamble. The use of randomness and the repetitive nature of the process are analogous to the activities conducted at a casino.
Random methods of computation and experimentation (generally considered forms of stochastic simulation) can arguably be traced back to the earliest pioneers of probability theory (see, e.g., Buffon's needle and the work on small samples by William Sealy Gosset), but are more specifically traced to the pre-electronic computing era. The usual distinction drawn for Monte Carlo simulation is that it systematically "inverts" the typical mode of simulation, treating deterministic problems by first finding a probabilistic analog (see simulated annealing). Previous methods of simulation and statistical sampling generally did the opposite: using simulation to test a previously understood deterministic problem. Though examples of an "inverted" approach do exist historically, they were not considered a general method until the popularity of the Monte Carlo method spread.
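A classic illustration of this "inversion" is estimating the deterministic constant π by a probabilistic analog: sample uniform points in the unit square and count the fraction that fall inside the quarter circle, which approaches π/4. This is a minimal sketch; the sample count and seed are arbitrary.

```python
import random

def estimate_pi(n_samples=1_000_000, seed=1):
    """Monte Carlo estimate of pi: the fraction of uniform points in the
    unit square that land inside the quarter circle approaches pi/4."""
    rng = random.Random(seed)
    inside = sum(1 for _ in range(n_samples)
                 if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
    return 4.0 * inside / n_samples

pi_hat = estimate_pi()
```

The error of such an estimate shrinks like 1/sqrt(n), so each extra decimal digit of accuracy costs roughly a hundredfold more samples; Monte Carlo trades precision per sample for applicability to problems with no tractable deterministic solution.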
Perhaps the most famous early use was by Enrico Fermi in the 1930s, when he used a random method to calculate the properties of the newly discovered neutron. Monte Carlo methods were central to the simulations required for the Manhattan Project, though they were severely limited by the computational tools of the time. Therefore, it was only after electronic computers were first built (from 1945 on) that Monte Carlo methods began to be studied in depth. In the 1950s they were used at Los Alamos for early work relating to the development of the hydrogen bomb, and became popularized in the fields of physics, physical chemistry, and operations research. The Rand Corporation and the U.S. Air Force were two of the major organizations responsible for funding and disseminating information on Monte Carlo methods during this time, and the methods began to find wide application in many different fields.
Uses of Monte Carlo methods require large amounts of random numbers, and it was their use that spurred the development of pseudorandom number generators, which were far quicker to use than the tables of random numbers which had been previously used for statistical sampling.
In biological systems, introducing stochastic 'noise' has been found to help improve the signal strength of the internal feedback loops for balance and other vestibular communication. It has been found to help diabetic and stroke patients with balance control.
The stochastic effect, or "chance effect," is one classification of radiation effects, referring to the random, statistical nature of the damage. In contrast to the deterministic effect, its severity is independent of dose; only the probability of an effect increases with dose. Cancer is a stochastic effect.
Simonton (2003, Psych Bulletin) argues that creativity in science (of scientists) is a constrained stochastic behaviour such that new theories in all sciences are, at least in part, the product of a stochastic process.
The results of a stochastic process (in the statistical sense) can be known only after the process has been computed.
Stochastic processes can be used in music to compose a fixed piece or can be produced in performance. Stochastic music was pioneered by Iannis Xenakis, who used probability, game theory, group theory, set theory, and Boolean algebra, and frequently used computers to produce his scores. Earlier, John Cage and others had composed aleatoric or indeterminate music, which is created by chance processes but does not have the strict mathematical basis (Cage's Music of Changes, for example, uses a system of charts based on the I-Ching).
When color reproductions are made, the image is separated into its component colors by taking multiple photographs filtered for each color. One resultant film or plate represents each of the cyan, magenta, yellow, and black data. Color printing is a binary system, where ink is either present or not present, so all color separations to be printed must be translated into dots at some stage of the workflow. Traditional amplitude-modulated line screens suffered from moiré but were used until stochastic screening became available. A stochastic (or frequency-modulated) dot pattern creates a sharper image.
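The frequency-modulated idea can be sketched in its simplest form: instead of varying dot size on a fixed grid (amplitude modulation), keep the dots uniform and place them at random, so that the *density* of dots encodes the tone. The random-threshold dither below is only the crudest such scheme; production FM screens use blue-noise masks or error diffusion, which this sketch does not implement.

```python
import random

def stochastic_screen(gray, width, height, seed=7):
    """Convert a uniform gray level (0.0 = no ink, 1.0 = full ink) into a
    binary dot pattern by comparing it to a random threshold per position.
    A pure random threshold is the simplest illustration of FM screening;
    real screens use blue-noise masks or error diffusion."""
    rng = random.Random(seed)
    return [[1 if rng.random() < gray else 0 for _ in range(width)]
            for _ in range(height)]

dots = stochastic_screen(gray=0.3, width=100, height=100)
# Average dot coverage approximates the requested gray level.
ink_fraction = sum(map(sum, dots)) / (100 * 100)
```

Because the dot pattern has no regular period, it cannot beat against the periodic patterns of other separations, which is why stochastic screens avoid moiré.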
Non-deterministic approaches in language studies are largely inspired by the work of Ferdinand de Saussure. In usage-based linguistic theories, for example, where it is argued that competence, or langue, is based on performance, or parole, in the sense that linguistic knowledge is based on frequency of experience, grammar is often said to be probabilistic and variable rather than fixed and absolute. This is because one's competence changes in accordance with one's experience of linguistic units; in this way, the frequency of usage events determines one's knowledge of the language in question.
Stochastic social science theory is similar to systems theory in that events are interactions of systems, although with a marked emphasis on unconscious processes. The event creates its own conditions of possibility, rendering it unpredictable if only because of the number of variables involved. Stochastic social science theory can be seen as an elaboration of a kind of 'third axis' on which to situate human behavior alongside the traditional 'nature vs. nurture' opposition. See Julia Kristeva on her usage of the 'semiotic', Luce Irigaray on reverse Heideggerian epistemology, and Pierre Bourdieu on polythetic space for examples of stochastic social science theory.
Manufacturing processes are assumed to be stochastic processes. This assumption is largely valid for both continuous and batch manufacturing processes. Testing and monitoring of the process is recorded using a process control chart, which plots a given process control parameter over time. Typically a dozen or more parameters are tracked simultaneously. Statistical models are used to set limit lines that determine when corrective action must be taken to bring the process back to its intended operational window.
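The limit-line idea can be sketched with the common three-sigma rule for an individuals chart: estimate the mean and standard deviation from in-control baseline data, then flag any measurement falling outside mean ± 3 sigma. The baseline data here is simulated, and real control-chart practice involves additional run rules not shown.

```python
import random
import statistics

def control_limits(samples, n_sigma=3):
    """Center line and upper/lower control limits for an individuals chart,
    using the conventional mean +/- 3 sigma rule."""
    mean = statistics.mean(samples)
    sd = statistics.stdev(samples)
    return mean - n_sigma * sd, mean, mean + n_sigma * sd

def out_of_control(value, lcl, ucl):
    """Flag a measurement that falls outside the control limits."""
    return value < lcl or value > ucl

# Simulated in-control baseline: a parameter centered at 10.0 with
# random variation of about 0.1 (illustrative values).
rng = random.Random(3)
baseline = [10.0 + rng.gauss(0, 0.1) for _ in range(200)]
lcl, center, ucl = control_limits(baseline)
```

A reading of 10.02 would pass unflagged as ordinary stochastic variation, while 11.0 (roughly ten standard deviations out) would trigger corrective action.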
Financial markets use stochastic models to represent the seemingly random behaviour of assets such as stocks, commodities, and interest rates. Quantitative analysts then use these models to value options on stock prices, bond prices, and interest rates (see Markov models). Stochastic modeling is also at the heart of the insurance industry.
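A standard stochastic model for asset prices is geometric Brownian motion (the assumption underlying Black-Scholes option pricing), in which the logarithm of the price follows a Wiener process with drift. The sketch below simulates one path using the exact discretization of dS = μS dt + σS dW; the drift, volatility, and step count are illustrative values.

```python
import math
import random

def gbm_path(s0=100.0, mu=0.05, sigma=0.2, t=1.0, n_steps=252, seed=11):
    """Simulate one geometric Brownian motion path: prices stay positive,
    and log-returns are independent Gaussian increments."""
    rng = random.Random(seed)
    dt = t / n_steps
    prices = [s0]
    for _ in range(n_steps):
        z = rng.gauss(0.0, 1.0)
        # Exact one-step solution of dS = mu*S dt + sigma*S dW
        prices.append(prices[-1] * math.exp((mu - 0.5 * sigma ** 2) * dt
                                            + sigma * math.sqrt(dt) * z))
    return prices

path = gbm_path()   # one year of simulated daily prices
```

Pricing an option then amounts to averaging the discounted payoff over many such simulated paths, a direct application of the Monte Carlo methods discussed earlier.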