Definition

In electronics, a mathematical-statistical method for predicting a time-variable signal in the presence of disturbances. The method exploits the fact that certain characteristic parameters of the process vary slowly with time, so that a best estimate of the signal can be obtained as a function of time.
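
The entry does not name a particular estimator, so the following is only a minimal illustrative sketch: it assumes a simple exponentially weighted recursive update as one possible way to track a slowly varying signal through noisy observations. The function name `smooth`, the smoothing factor `alpha`, and the synthetic test signal are hypothetical choices for illustration, not part of the definition above.

```python
# Minimal sketch (illustrative assumption): recursive exponential smoothing
# as one simple instance of estimating a slowly varying signal from noisy
# observations. The names and parameter values are not from the original entry.

import math
import random


def smooth(observations, alpha=0.1):
    """Return running estimates of the underlying signal.

    Each new estimate blends the previous estimate with the latest noisy
    observation; a small alpha trusts the slowly varying model more than
    any single disturbed measurement.
    """
    estimate = observations[0]
    estimates = [estimate]
    for z in observations[1:]:
        estimate = (1 - alpha) * estimate + alpha * z
        estimates.append(estimate)
    return estimates


if __name__ == "__main__":
    random.seed(0)
    # Slowly varying true signal plus random disturbances.
    true_signal = [math.sin(0.01 * t) for t in range(500)]
    observed = [s + random.gauss(0.0, 0.3) for s in true_signal]
    estimated = smooth(observed, alpha=0.05)

    # Compare mean squared error of raw observations vs. the running estimate.
    err_raw = sum((o - s) ** 2 for o, s in zip(observed, true_signal)) / len(true_signal)
    err_est = sum((e - s) ** 2 for e, s in zip(estimated, true_signal)) / len(true_signal)
    print(f"mean squared error, raw observations: {err_raw:.4f}")
    print(f"mean squared error, smoothed estimate: {err_est:.4f}")
```

In this sketch, the slow variation of the signal is what justifies weighting the previous estimate heavily relative to each new, disturbed measurement; more elaborate methods refine the same idea by modelling the signal and disturbance statistics explicitly.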