Article
Analysis of variance

A statistical technique that partitions the total variation in experimental data into components assignable to specific sources. Analysis of variance is applicable to data for which (1) effects of sources are additive, (2) uncontrolled or unexplained experimental variations (which are grouped as experimental errors) are independent of other sources of variation, (3) variance of experimental errors is homogeneous, and (4) experimental errors follow a normal distribution. When data depart from these assumptions, one must exercise extreme care in interpreting the results of an analysis of variance. Statistical tests indicate the contribution of the components to the observed variation. See also: Experiment; Statistics
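
As an illustration, a minimal one-way analysis of variance can be run as follows (a sketch assuming Python with SciPy; the three treatment groups and their measurements are invented for illustration):

# One-way ANOVA sketch: partitions the observed variation into
# between-group and within-group components and reports the F test.
from scipy import stats

# Hypothetical measurements from three treatment groups (illustrative data).
group_a = [20.1, 21.3, 19.8, 20.7]
group_b = [22.4, 23.1, 21.9, 22.8]
group_c = [19.5, 20.0, 19.2, 19.9]

f_stat, p_value = stats.f_oneway(group_a, group_b, group_c)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")

A small p-value indicates that the between-group component of variation is large relative to the experimental error, subject to the assumptions listed above.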

Article
Bayesian statistics

An approach to statistics in which estimates are based on a synthesis of a prior distribution and current sample data. Bayesian statistics is not a branch of statistics in the way that, say, nonparametric statistics is. It is, in fact, a self-contained paradigm providing tools and techniques for all statistical problems. In the classical, "frequentist" viewpoint of statistical theory, a statistical procedure is judged by averaging its performance over all possible data. However, the Bayesian approach gives prime importance to how a given procedure performs for the actual data observed in a given situation. Further, in contrast to the classical procedures, the Bayesian procedures formally utilize information available from sources other than the statistical investigation. Such information, available through expert judgment, past experience, or prior belief, is described by a probability distribution on the set of all possible values of the unknown parameter of the statistical model at hand. This probability distribution is called the prior distribution. The crux of the Bayesian approach is the synthesis of the prior distribution and the current sample data into a posterior probability distribution from which all decisions and inferences are made. This synthesis is achieved by using a theorem proved by English statistician Thomas Bayes in the eighteenth century. See also: Probability distribution; Probability; Statistics
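
A minimal sketch of this synthesis, using the standard conjugate Beta prior for a binomial success probability (the prior parameters, the data, and the use of SciPy are illustrative assumptions, not taken from the article):

# Beta-Binomial example: a Beta prior on an unknown success probability
# is combined with observed sample data to give the posterior distribution.
from scipy import stats

prior_alpha, prior_beta = 2.0, 2.0   # prior belief (illustrative values)
successes, failures = 7, 3           # current sample data (illustrative)

# Conjugacy: the posterior is Beta(alpha + successes, beta + failures).
posterior = stats.beta(prior_alpha + successes, prior_beta + failures)
print("posterior mean:", posterior.mean())
print("95% credible interval:", posterior.interval(0.95))

All subsequent inferences, such as point estimates or credible intervals, are read off this posterior distribution.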

Article
Binomial theorem

One of the most important algebraic identities, with many applications in a variety of fields. The binomial theorem, which Isaac Newton generalized to arbitrary exponents, expresses the result of multiplying a binomial by itself any number of times, as given in Eq. (1),

\[
  (a + b)^n \;=\; \sum_{k=0}^{n} \binom{n}{k}\, a^{\,n-k}\, b^{\,k} \qquad (1)
\]
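
The identity can be checked numerically for small cases (a Python sketch with arbitrary illustrative values):

# Expand (a + b)**n term by term using binomial coefficients and compare
# with direct evaluation.
from math import comb

a, b, n = 3, 5, 4
expansion = sum(comb(n, k) * a**(n - k) * b**k for k in range(n + 1))
assert expansion == (a + b)**n
print(expansion)  # 4096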

Article
Biometrics

The application of mathematical and statistical methods to describe and analyze data concerning the variation of biological characteristics obtained from either observation or experiment. The concept of a population and the samples derived from it are discussed in this article.

Article
Boltzmann statistics

To describe a system consisting of a large number of particles in a physically useful manner, recourse must be had to so-called statistical procedures. If the mechanical laws operating in the system are those of classical mechanics, and if the system is sufficiently dilute, the resulting statistical treatment is referred to as Boltzmann or classical statistics. (Dilute in this instance means that the total volume available is much larger than the proper volume of the particles.) A gas is a typical example: The molecules interacting according to the laws of classical mechanics are the constituents of the system, and the pressure, temperature, and other parameters are the overall entities which determine the macroscopic behavior of the gas. In a case of this kind it is neither possible nor desirable to solve the complicated equations of motion of the molecules; one is not interested in the position and velocity of every molecule at any time. The purpose of the statistical description is to extract from the mechanical description just those features relevant for the determination of the macroscopic properties and to omit others.
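
For reference, the occupation formula that results from this classical treatment, quoted here rather than derived (N is the total number of particles, E_i the energy of state i, k Boltzmann's constant, and T the temperature), is

\[
  \bar{n}_i \;=\; \frac{N\, e^{-E_i/kT}}{\sum_j e^{-E_j/kT}} .
\]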

Article
Bose-Einstein statistics

The statistical description of quantum-mechanical systems in which there is no restriction on the way in which particles can be distributed over the individual energy levels. This description applies when the system has a symmetric wave function. This in turn has to be the case when the particles described are of integer spin.
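
For reference, the standard Bose-Einstein result for the mean number of particles in a single-particle state of energy E_i (not derived in the passage above; μ is the chemical potential, k Boltzmann's constant, and T the temperature) is

\[
  \bar{n}_i \;=\; \frac{1}{e^{(E_i - \mu)/kT} - 1} .
\]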

Article
Combinatorial theory

The branch of mathematics which studies arrangements of elements (usually a finite number) into sets under certain prescribed constraints. Problems combinatorialists attempt to solve include the enumeration problem (how many such arrangements are there?), the structure problem (what are the properties of these arrangements and how efficiently can associated calculations be made?), and, when the constraints become more subtle, the existence problem (is there such an arrangement?).
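
A small enumeration example (a Python sketch; the particular counting problem, 3-element subsets of a 5-element set, is chosen only for illustration):

# Solve a tiny enumeration problem two ways: by listing the arrangements
# directly and by the closed-form binomial coefficient.
from itertools import combinations
from math import comb

elements = range(5)
listed = list(combinations(elements, 3))
assert len(listed) == comb(5, 3) == 10
print(len(listed), "arrangements")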

Article
Estimation theory

A branch of probability and statistics concerned with deriving information about properties of random variables, stochastic processes, and systems based on observed samples. Some of the important applications of estimation theory are found in control and communication systems, where it is used to estimate the unknown states and parameters of the system. For example, the position and velocity of a satellite are estimated from ground radar observations of its range, elevation, and azimuth. These observations are contaminated with random noise due to atmospheric propagation and radar circuitry. The statistical properties of the random noise are assumed known except for some parameters, which can be estimated from the data. Generally, the random noise is assumed to have a Gaussian distribution, and its mean and covariance may be known or unknown. It is also assumed to be “white,” that is, uncorrelated from one time instant to the next. The integral of white noise is a Wiener process or Brownian motion process, which plays a fundamental role in the theory of stochastic processes. See also: Distribution (probability); Electrical noise; Stochastic process
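
A minimal sketch of the idea (assuming Python with NumPy; the parameter value and noise level are invented): an unknown constant is observed through additive white Gaussian noise and estimated by the sample mean.

# Estimate an unknown constant from observations corrupted by white
# Gaussian noise; the sample mean is the estimator.
import numpy as np

rng = np.random.default_rng(0)
true_value = 4.2                          # hypothetical unknown parameter
noise = rng.normal(0.0, 1.0, size=500)    # white Gaussian noise
observations = true_value + noise

estimate = observations.mean()
std_error = observations.std(ddof=1) / np.sqrt(observations.size)
print(f"estimate = {estimate:.3f} +/- {std_error:.3f}")

More elaborate estimators for problems such as the satellite tracking example above combine a dynamical model of the system with the statistics of the observation noise in the same spirit.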

Article
Factor analysis

A method of quantitative multivariate analysis with the goal of representing the interrelationships among a set of continuously measured variables (usually represented by their intercorrelations) by a number of underlying, linearly independent reference variables called factors. Although the term factor analysis has come to represent a family of analysis methods, the two most commonly used approaches are the full component model, in which the entire variance of the variables (represented by unities inserted in the principal diagonal of the correlation matrix) is analyzed, and the common factor model, in which the proportion of the variance that is accounted for by the common factors (represented by communality estimates inserted in the principal diagonal) is analyzed.
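
A hedged sketch using scikit-learn's FactorAnalysis on synthetic data (the number of observed variables, the number of factors, and the library choice are illustrative assumptions, not from the article):

# Two latent factors generate six observed, continuously measured variables;
# FactorAnalysis then estimates the factor loadings from the observations.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(1)
latent = rng.normal(size=(300, 2))            # underlying common factors
loadings = rng.normal(size=(2, 6))            # true loading matrix
observed = latent @ loadings + 0.3 * rng.normal(size=(300, 6))

fa = FactorAnalysis(n_components=2)
fa.fit(observed)
print(fa.components_.round(2))                # estimated factor loadings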

Article
Fermi-Dirac statistics

The statistical description of particles or systems of particles that satisfy the Pauli exclusion principle. This description was first given by E. Fermi, who applied the Pauli exclusion principle to the translational energy levels of a system of electrons. It was later shown by P. A. M. Dirac that this form of statistics is also obtained when the total wave function of the system is antisymmetrical. See also: Exclusion principle; Nonrelativistic quantum theory
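
For reference, the standard Fermi-Dirac result for the mean occupation of a single-particle state of energy E_i (not derived in the passage above; μ is the chemical potential, k Boltzmann's constant, and T the temperature) is

\[
  \bar{n}_i \;=\; \frac{1}{e^{(E_i - \mu)/kT} + 1} ,
\]

which never exceeds 1, reflecting the exclusion principle.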