LocallyWeightedFunction is a generalization of the k-nearest neighbor
concept, also known as "Instance-Based Learning", "Memory-Based Learning",
"Nonparametric Regression", "Case-Based Regression", or
"Kernel-Based Regression". This approach has essentially no up-front
learning time, but creates a local function approximation in response to
each evaluate() call. The local function approximation is created by
weighting each sample in the original dataset by a Kernel value computed
between that sample's input and the query input.
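The kernel-weighting idea above can be sketched as follows. This is a minimal illustration, not the library's actual API: the class and method names are hypothetical, a Gaussian kernel is assumed, and the "local function approximation" is the simplest possible one, a kernel-weighted average of the stored outputs (the Nadaraya-Watson form).

```java
// Hypothetical sketch of locally weighted evaluation: no up-front learning;
// each evaluate() call weights every stored sample by a kernel centered on
// the query and fits the simplest local model, a weighted average.
public class LocalWeightingSketch {

    // Gaussian kernel: weight decays with distance from the query.
    static double gaussianKernel(double dist, double bandwidth) {
        double u = dist / bandwidth;
        return Math.exp(-0.5 * u * u);
    }

    // The local approximation is built lazily, per query, from the raw data.
    static double evaluate(double[] xs, double[] ys, double query, double bandwidth) {
        double num = 0.0, den = 0.0;
        for (int i = 0; i < xs.length; i++) {
            double w = gaussianKernel(Math.abs(xs[i] - query), bandwidth);
            num += w * ys[i];
            den += w;
        }
        return num / den;
    }

    public static void main(String[] args) {
        double[] xs = {0.0, 1.0, 2.0, 3.0};
        double[] ys = {0.0, 1.0, 4.0, 9.0};   // samples from y = x^2
        // With a narrow bandwidth, the stored sample at x = 2 dominates,
        // so the result is close to 4.0.
        System.out.println(evaluate(xs, ys, 2.0, 0.25));
    }
}
```

Note that every call to evaluate() touches the whole dataset, which is exactly why evaluation cost grows with the dataset rather than with any fitted model.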
KernelWeightedRobustRegression (KWRR) is different from
LocallyWeightedFunction (LWL) in that KWRR creates a global function
approximator that holds for all inputs. Thus, up-front learning time for
KWRR is relatively high, but evaluation time is relatively low. On the
other hand, LWL creates a local function approximator in response to each
evaluation and never creates a global function approximator. As such, LWL
has (almost) no up-front learning time, but each evaluation is relatively
expensive. The cost of an LWL function evaluation depends strongly on the
type of learner given to the algorithm. If you use fast or closed-form
learners, then you may not notice the evaluation time. But if you use some
brain-dead iterative technique, like Gradient Descent, then use LWL at your
own risk.
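The tradeoff described above can be sketched as two usage patterns. All names here are hypothetical, not the library's API; a closed-form least-squares line fit stands in for the global learner, and a kernel-weighted mean stands in for the per-query local learner.

```java
import java.util.function.DoubleUnaryOperator;

// Hypothetical sketch of the two cost profiles: a KWRR-style learner pays
// its fitting cost once up front, while an LWL-style learner pays per query.
public class TradeoffSketch {

    // KWRR-style: expensive learn() once, then evaluate() is an O(1) lookup
    // into the fitted global model (here, a line fit by least squares).
    static DoubleUnaryOperator learnGlobal(double[] xs, double[] ys) {
        int n = xs.length;
        double sx = 0, sy = 0, sxx = 0, sxy = 0;
        for (int i = 0; i < n; i++) {
            sx += xs[i]; sy += ys[i];
            sxx += xs[i] * xs[i]; sxy += xs[i] * ys[i];
        }
        double slope = (n * sxy - sx * sy) / (n * sxx - sx * sx);
        double intercept = (sy - slope * sx) / n;
        return x -> intercept + slope * x;   // evaluation cost is constant
    }

    // LWL-style: nothing is learned up front; every query re-weights the
    // whole dataset with a kernel and fits a local model (a weighted mean),
    // so each evaluation is O(n) in the dataset size.
    static double evaluateLocal(double[] xs, double[] ys, double q, double h) {
        double num = 0, den = 0;
        for (int i = 0; i < xs.length; i++) {
            double u = (xs[i] - q) / h;
            double w = Math.exp(-0.5 * u * u);
            num += w * ys[i];
            den += w;
        }
        return num / den;
    }

    public static void main(String[] args) {
        double[] xs = {0, 1, 2, 3};
        double[] ys = {1, 3, 5, 7};                   // samples from y = 2x + 1
        DoubleUnaryOperator f = learnGlobal(xs, ys);  // pay the fit cost once
        System.out.println(f.applyAsDouble(5.0));     // cheap global lookup
        System.out.println(evaluateLocal(xs, ys, 1.5, 0.5)); // refits per query
    }
}
```

A slow iterative learner would sit inside the per-query path of the LWL-style method, which is why the evaluation cost depends so strongly on the learner you supply.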
KWRR is more appropriate when you know the general structure of your data
but it is riddled with outliers. LWL is more appropriate when you don't
know or understand the general trend of your data AND you can afford
somewhat costly evaluation times.