A different problem, closely related to interpolation, is the approximation of a complicated function by a simpler one (also called regression). The main difference between regression and interpolation is that polynomial regression gives a single polynomial that models the entire data set.
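A minimal sketch of this distinction, using hypothetical sample data and NumPy's `polyfit`: one low-degree polynomial is fitted to model the whole data set at once, rather than a curve forced through every point.

```python
import numpy as np

# Hypothetical sample data: a noisy, roughly quadratic trend.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.1, 2.9, 9.2, 18.8, 33.1, 51.0])

# A single degree-2 polynomial models the entire data set (regression),
# instead of passing exactly through each point (interpolation).
coeffs = np.polyfit(x, y, deg=2)   # highest-degree coefficient first
model = np.poly1d(coeffs)

print(model(2.5))  # prediction from the single fitted polynomial
```

The same fitted polynomial is then used for every prediction, which is what makes regression a global model of the data.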
22. By retaining E, P, and Z as separate variables, results from polynomial regression equations can be translated into three-dimensional surfaces, whose properties can be formally tested using procedures set forth by Edwards and Parry (1993; see also Edwards, 2002).
23. Analyse-it provides a range of standard parametric and non-parametric procedures, including descriptive statistics, ANOVA, Mann-Whitney, Wilcoxon, chi-square, correlation, linear regression, logistic regression, polynomial regression and advanced model fitting, principal component analysis, and factor analysis.
24. An example is polynomial regression, which uses a linear predictor function to fit a polynomial relationship (up to a given degree) between two sets of data points (i.e. a single real-valued explanatory variable and a related real-valued dependent variable), by adding multiple explanatory variables corresponding to successive powers of the existing explanatory variable.
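The point about the linear predictor function can be sketched concretely (hypothetical data): the design matrix gains a column for each power of x, and because the model is linear *in the coefficients*, ordinary least squares solves it directly.

```python
import numpy as np

# Hypothetical data: one real-valued explanatory variable x and
# a related real-valued dependent variable y (here exactly x**2 + 1).
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.0, 2.0, 5.0, 10.0, 17.0])

degree = 2
# Add explanatory variables for each power of x: columns 1, x, x**2.
X = np.vander(x, degree + 1, increasing=True)

# The predictor is linear in the coefficients, so ordinary least
# squares fits the polynomial relationship directly.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta)  # approximately [1, 0, 1]
```

Swapping in higher powers of x changes nothing about the fitting machinery; it is still linear regression on an expanded set of explanatory variables.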
25. In general, for a set of n points (x1, y1), (x2, y2), (x3, y3), ..., (xn, yn), you can generate a polynomial of degree n-1 that fits all the points exactly (rather than just capturing general trends while avoiding an exact fit, as polynomial regression does), but the result is fairly meaningless, since you are arbitrarily choosing coefficients solely to force the fit.
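A short sketch of this contrast, with hypothetical points: a degree n-1 polynomial reproduces all n points exactly, while a low-degree regression fit only follows the general trend.

```python
import numpy as np

# Hypothetical n = 4 points.
pts_x = np.array([0.0, 1.0, 2.0, 3.0])
pts_y = np.array([1.0, 3.0, 2.0, 5.0])

n = len(pts_x)
# A degree n-1 polynomial passes through all n points exactly...
exact = np.poly1d(np.polyfit(pts_x, pts_y, deg=n - 1))
# ...while a low-degree fit only captures the general trend.
trend = np.poly1d(np.polyfit(pts_x, pts_y, deg=1))

print(exact(pts_x))  # reproduces pts_y (up to rounding)
print(trend(pts_x))  # close to the trend, but not exact
```

The exact fit has zero residual by construction, which is precisely why it says nothing about the underlying trend: its coefficients are dictated entirely by the chosen points.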