Backfitting
An iterative method for fitting additive models, in which each
term is fitted to the residuals given the rest. It is a version
of the Gauss-Seidel method of numerical linear algebra.
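As an illustrative sketch (the data, the polynomial smoother, and all
names here are invented for the example), backfitting cycles through the
terms, refitting each one to the partial residuals left by the others:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x1 = rng.uniform(-1, 1, n)
x2 = rng.uniform(-1, 1, n)
y = np.sin(3 * x1) + x2**2 + rng.normal(0, 0.1, n)   # additive truth

def smooth(x, r, deg=5):
    """A stand-in smoother: a low-degree polynomial fit to residuals r."""
    return np.polyval(np.polyfit(x, r, deg), x)

alpha = y.mean()
f1 = np.zeros(n)
f2 = np.zeros(n)
for _ in range(20):                       # Gauss-Seidel style sweeps
    f1 = smooth(x1, y - alpha - f2)       # fit term 1 to partial residuals
    f1 -= f1.mean()                       # centre for identifiability
    f2 = smooth(x2, y - alpha - f1)       # fit term 2 to partial residuals
    f2 -= f2.mean()

resid = y - alpha - f1 - f2               # close to the noise variance 0.01
```

Each sweep improves one term while holding the others fixed, exactly as
Gauss-Seidel updates one coordinate block at a time.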








Back-propagation
is the method used to calculate the gradient vector of a fitting
criterion for a feedforward neural network with respect to the
parameters (weights). The name is also used for a steepest-descent
algorithm with the gradient vector computed in this way.
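A minimal sketch of the gradient calculation, assuming a one-hidden-layer
tanh network with a squared-error fitting criterion (all sizes and names
are invented for the example); the hand-derived gradient is checked
against a central finite difference:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(10, 3))             # 10 examples, 3 inputs
t = rng.normal(size=(10, 1))             # targets
W1 = 0.5 * rng.normal(size=(3, 4))       # input-to-hidden weights
W2 = 0.5 * rng.normal(size=(4, 1))       # hidden-to-output weights

def loss_and_grads(W1, W2):
    h = np.tanh(X @ W1)                  # forward pass
    e = h @ W2 - t
    L = 0.5 * np.sum(e**2)               # fitting criterion
    # backward pass: apply the chain rule layer by layer
    gW2 = h.T @ e
    gh = e @ W2.T
    gW1 = X.T @ (gh * (1 - h**2))        # tanh'(z) = 1 - tanh(z)^2
    return L, gW1, gW2

L, gW1, gW2 = loss_and_grads(W1, W2)

# central finite-difference check on one weight
eps = 1e-6
Wp = W1.copy(); Wp[0, 0] += eps
Wm = W1.copy(); Wm[0, 0] -= eps
num = (loss_and_grads(Wp, W2)[0] - loss_and_grads(Wm, W2)[0]) / (2 * eps)
```

The backward pass reuses the quantities computed in the forward pass,
which is what makes the gradient cheap to evaluate.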








Bartlett Test of Sphericity
Statistical test for the overall significance of all correlations
within a correlation matrix.
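A sketch of one common chi-square approximation to the statistic,
-[(n-1) - (2p+5)/6] ln|R| with p(p-1)/2 degrees of freedom (SciPy is
assumed available; the function name and data are invented for the
example):

```python
import numpy as np
from scipy.stats import chi2

def bartlett_sphericity(X):
    """Test H0: the population correlation matrix is the identity.
    X is an n-by-p data matrix; returns (statistic, p-value)."""
    n, p = X.shape
    R = np.corrcoef(X, rowvar=False)
    stat = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(R))
    df = p * (p - 1) / 2
    return stat, chi2.sf(stat, df)

rng = np.random.default_rng(2)
X = rng.normal(size=(100, 4))            # independent columns: H0 true
stat, pval = bartlett_sphericity(X)

Xc = X.copy()
Xc[:, 1] = Xc[:, 0] + 0.1 * rng.normal(size=100)   # strongly correlated
stat_c, pval_c = bartlett_sphericity(Xc)           # tiny p-value
```

A small p-value indicates that at least some correlations are nonzero,
so the matrix departs significantly from the identity.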








Bayes Formula
An elementary formula of probability. If A_1, ..., A_k are disjoint
events with P(A_i) > 0 whose union contains the event B, then

  P(A_i | B) = P(B | A_i) P(A_i) / [ sum_j P(B | A_j) P(A_j) ]
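A worked numeric instance (the diagnostic-test numbers are invented for
the example): the events are 'diseased' and 'not diseased', and B is a
positive test result.

```python
# Prevalence 1%, sensitivity 95%, false-positive rate 5%.
p_d = 0.01
p_pos_given_d = 0.95
p_pos_given_not_d = 0.05

# Denominator: total probability of a positive test.
p_pos = p_pos_given_d * p_d + p_pos_given_not_d * (1 - p_d)

# Bayes formula: posterior probability of disease given a positive test.
p_d_given_pos = p_pos_given_d * p_d / p_pos
print(round(p_d_given_pos, 3))           # 0.161
```

Despite the accurate test, the posterior is only about 16%, because the
prior P(diseased) is so small.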








Bayes Rule
is a rule which attains the Bayes risk, and so is the 'gold
standard', the best possible rule for that problem.
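As a concrete sketch (the two-Gaussian setup is invented for the
example, and SciPy is assumed available): with two equally likely
classes whose densities are known, the Bayes rule predicts the class
with the larger posterior, and its error rate is the Bayes risk.

```python
from scipy.stats import norm

# Class 0 ~ N(-1, 1), class 1 ~ N(+1, 1), equal priors.
# The posterior for class 1 exceeds 1/2 exactly when x > 0,
# so the Bayes rule is: predict class 1 iff x > 0.
def bayes_rule(x):
    post1 = norm.pdf(x, 1, 1) / (norm.pdf(x, -1, 1) + norm.pdf(x, 1, 1))
    return int(post1 > 0.5)

# Its error rate -- the Bayes risk -- is Phi(-1) by symmetry.
bayes_risk = norm.cdf(-1.0)
print(round(float(bayes_risk), 3))       # 0.159
```

No other rule can have a smaller error rate on this problem, which is
why the Bayes rule serves as the gold standard.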








Bias
has two meanings. (a) The bias of an estimator is the difference
between its mean and the true value. (b) For a neural network,
parameters which are constants (rather than multiplying signals)
are often called biases. 
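Sense (a) can be seen by simulation (the numbers here are invented for
the example): the maximum-likelihood variance estimator divides by n
rather than n - 1, so its mean falls short of the true value.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 5                                    # small samples from N(0, 2^2)
true_var = 4.0

# np.var divides by n by default: the ML estimator of the variance.
ests = [np.var(rng.normal(0, 2, n)) for _ in range(100000)]

# E[estimator] = (n-1)/n * sigma^2 = 3.2, so the bias is -0.8.
print(round(float(np.mean(ests)), 1))    # 3.2
```

Dividing by n - 1 instead removes the bias, at the cost of a slightly
larger variance.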








BIC
has two similar meanings. Akaike (1977, 1978) introduced 'information
criterion B'. Schwarz (1978) introduced what has become known as
the 'Bayesian information criterion'. Although most references mean
Schwarz's BIC, to avoid confusion it is also known as SBC ('Schwarz
Bayes Criterion'). Both penalize the deviance by log n times the
number p of free parameters for n examples, but Akaike's has O(p)
terms not depending on n.
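A sketch of Schwarz's criterion for comparing regression fits,
assuming the common Gaussian convention that the deviance is taken as
n log(RSS / n) (the data and function name are invented for the
example):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100
x = rng.normal(size=n)
y = 2 * x + rng.normal(size=n)           # the true model is linear

def bic(y, fitted, p):
    """deviance + p * log(n), with deviance = n * log(RSS / n)."""
    n = len(y)
    rss = np.sum((y - fitted) ** 2)
    return n * np.log(rss / n) + p * np.log(n)

# Compare polynomial fits of degree 1 and degree 5.
f1 = np.polyval(np.polyfit(x, y, 1), x)
f5 = np.polyval(np.polyfit(x, y, 5), x)
# The degree-5 fit lowers the deviance slightly, but the log n penalty
# on its extra parameters makes the simpler (true) model preferred.
```

The log n penalty grows with the sample size, so BIC favours smaller
models than AIC does for large n.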








Bootstrap
(Efron, 1979) An idea for statistical inference, using training
sets created by resampling with replacement from the original
training set, so examples may occur more than once. 
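A minimal sketch of the resampling idea (the data and interval choice
are invented for the example): each bootstrap training set is drawn
with replacement from the original one, and the spread of the
resampled statistic is used for inference.

```python
import numpy as np

rng = np.random.default_rng(5)
data = rng.exponential(scale=2.0, size=50)   # original training set

B = 2000
boot_means = np.empty(B)
for b in range(B):
    # resample with replacement: examples may repeat or be left out
    sample = rng.choice(data, size=len(data), replace=True)
    boot_means[b] = sample.mean()

# percentile interval for the mean from the bootstrap distribution
lo, hi = np.percentile(boot_means, [2.5, 97.5])
```

On average about 63% of the original examples appear in each bootstrap
sample, the rest being duplicates of examples already drawn.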








Box's M
Statistical test for the equality of the covariance matrices of
the independent variables across the groups of the dependent
variable. If the statistical significance is greater than the
critical level (e.g., 0.01), the equality of the covariance
matrices is supported. If the test shows statistical significance,
the groups are deemed different and the assumption is violated.
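A sketch of the standard large-sample chi-square approximation to the
statistic (SciPy is assumed available; the function name and data are
invented for the example):

```python
import numpy as np
from scipy.stats import chi2

def box_m(groups):
    """Box's M test for equal covariance matrices across groups.
    groups: list of n_i-by-p data arrays; returns (statistic, p-value).
    Uses the usual chi-square approximation, valid for large samples."""
    k = len(groups)
    p = groups[0].shape[1]
    ns = np.array([g.shape[0] for g in groups])
    N = ns.sum()
    covs = [np.cov(g, rowvar=False) for g in groups]
    S_pooled = sum((n - 1) * S for n, S in zip(ns, covs)) / (N - k)
    M = (N - k) * np.log(np.linalg.det(S_pooled)) \
        - sum((n - 1) * np.log(np.linalg.det(S)) for n, S in zip(ns, covs))
    c = (sum(1 / (n - 1) for n in ns) - 1 / (N - k)) \
        * (2 * p**2 + 3 * p - 1) / (6 * (p + 1) * (k - 1))
    stat = M * (1 - c)                    # scaled statistic
    df = p * (p + 1) * (k - 1) / 2
    return stat, chi2.sf(stat, df)

rng = np.random.default_rng(6)
g1 = rng.normal(size=(60, 3))            # both groups share covariance I
g2 = rng.normal(size=(60, 3))
stat, pval = box_m([g1, g2])             # large p: equality supported
stat_u, pval_u = box_m([g1, 3 * g2])     # unequal covariances: tiny p
```

The test is known to be sensitive to non-normality, so a significant
result should be interpreted with some care.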









Branch-and-Bound
A technique in combinatorial optimization for ruling out solutions
without evaluating them.
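A minimal sketch on a toy 0/1 knapsack problem (the instance and all
names are invented for the example): branches whose optimistic bound
cannot beat the best solution found so far are pruned unexplored.

```python
# Items sorted by value/weight so the fractional bound is valid.
values = [60, 100, 120]
weights = [10, 20, 30]
capacity = 50

best = 0

def bound(i, value, room):
    """Optimistic bound: fill the remaining room fractionally."""
    for j in range(i, len(values)):
        if weights[j] <= room:
            room -= weights[j]
            value += values[j]
        else:
            value += values[j] * room / weights[j]
            break
    return value

def search(i, value, room):
    global best
    if value > best:
        best = value
    if i == len(values) or bound(i, value, room) <= best:
        return                            # prune: branch cannot improve
    if weights[i] <= room:
        search(i + 1, value + values[i], room - weights[i])  # take item i
    search(i + 1, value, room)                               # skip item i

search(0, 0, capacity)
print(best)                               # 220: items 2 and 3
```

The bound never underestimates the best completion of a partial
solution, so pruning discards only branches that provably cannot win.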





