@RegressionGAM/predict
Predict response values for new data points using a generalized additive model (GAM) regression object.
`yFit = predict (obj, Xfit)` returns a vector of predicted responses, `yFit`, for the predictor data in matrix `Xfit`, based on the generalized additive model in `obj`. `Xfit` must have the same number of features/variables as the training data stored in the `RegressionGAM` class object `obj`.
`[yFit, ySD, yInt] = predict (obj, Xfit)` also returns the standard deviations, `ySD`, and prediction intervals, `yInt`, of the response variable `yFit`, evaluated at each observation in the predictor data `Xfit`.
`yFit = predict (…, Name, Value)` returns the aforementioned results with additional options specified by the Name-Value pair arguments listed below.
| Name | Value |
|---|---|
| `"alpha"` | Significance level of the prediction intervals `yInt`, specified as a scalar in the range [0,1]. The default value is 0.05, which corresponds to 95% prediction intervals. |
| `"includeinteractions"` | A boolean flag to include interactions when predicting new values from `Xfit`. By default, `"includeinteractions"` is `true` when the GAM model in `obj` contains the `obj.Formula` or `obj.Interactions` fields; otherwise, it is set to `false`. Setting it to `true` when no interactions are present in the trained model results in an error. Setting it to `false` for a model that includes interactions makes the predictions on the base model without any interaction terms. This way you can make predictions from the same GAM model without having to retrain it. |
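For instance, the two options can be combined in a single call. The following sketch assumes `obj` is a `RegressionGAM` object trained with interaction terms and `Xfit` is compatible predictor data (both names are placeholders, not variables defined in this page):

```matlab
## Request 90% prediction intervals instead of the default 95%
[yFit, ySD, yInt] = predict (obj, Xfit, "alpha", 0.1);

## Predict from the base model only, ignoring the trained interaction terms
yFit_base = predict (obj, Xfit, "includeinteractions", false);
```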
See also: fitrgam, @RegressionGAM/RegressionGAM
Source Code: @RegressionGAM/predict
```matlab
## Declare two different functions
f1 = @(x) cos (3 * x);
f2 = @(x) x .^ 3;

## Generate 80 samples for f1 and f2
x = [-4*pi:0.1*pi:4*pi-0.1*pi]';
X1 = f1 (x);
X2 = f2 (x);

## Create a synthetic response by adding noise
rand ("seed", 3);
Ytrue = X1 + X2;
Y = Ytrue + Ytrue .* 0.2 .* rand (80,1);

## Assemble predictor data
X = [X1, X2];

## Train the GAM and test on the same data
a = fitrgam (X, Y, "order", [5, 5]);
[ypred, ySDsd, yInt] = predict (a, X);

## Plot the results
figure
[sortedY, indY] = sort (Ytrue);
plot (sortedY, "r-");
xlim ([0, 80]);
hold on
plot (ypred(indY), "g+")
plot (yInt(indY,1), "k:")
plot (yInt(indY,2), "k:")
xlabel ("Predictor samples");
ylabel ("Response");
title ("actual vs predicted values for function f1(x) = cos (3x)");
legend ({"Theoretical Response", "Predicted Response", "Prediction Intervals"});

## Use 30% Holdout partitioning for training and testing data
C = cvpartition (80, "HoldOut", 0.3);
[ypred, ySDsd, yInt] = predict (a, X(test(C),:));

## Plot the results
figure
[sortedY, indY] = sort (Ytrue(test(C)));
plot (sortedY, "r-");
xlim ([0, sum(test(C))]);
hold on
plot (ypred(indY), "g+")
plot (yInt(indY,1), "k:")
plot (yInt(indY,2), "k:")
xlabel ("Predictor samples");
ylabel ("Response");
title ("actual vs predicted values for function f1(x) = cos (3x)");
legend ({"Theoretical Response", "Predicted Response", "Prediction Intervals"});
```