In the small-sample-size case, Linear Discriminant Analysis (LDA) may suffer from a rank-deficiency issue. Applied mathematics uses Tikhonov regularization - also known as \(\ell_2\) regularization or shrinkage - to stabilize an ill-conditioned linear operator. Regularized Linear Discriminant Analysis (RLDA) adopts this idea to stabilize the eigendecomposition in the LDA formulation.
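As an illustrative sketch (the notation below is not taken from this documentation): writing \(S_B\) and \(S_W\) for the between- and within-class scatter matrices, RLDA replaces the possibly singular \(S_W\) in the LDA criterion with its Tikhonov-regularized counterpart,
\[
\max_{w}\ \frac{w^\top S_B\, w}{w^\top \left(S_W + \alpha I_p\right) w}
\quad\Longleftrightarrow\quad
\left(S_W + \alpha I_p\right)^{-1} S_B\, w = \lambda\, w ,
\]
so that \(S_W + \alpha I_p\) is positive definite for any \(\alpha > 0\), even when the number of observations is smaller than \(p\).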
do.rlda(X, label, ndim = 2, alpha = 1)
an \((n\times p)\) matrix or data frame whose rows are observations and columns represent independent variables.
a length-\(n\) vector of data class labels.
an integer-valued target dimension.
Tikhonov regularization parameter; its effect on rank deficiency is sketched below.
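The following minimal sketch (not part of the documentation) illustrates why the regularizer helps: with fewer observations than variables, a scatter matrix is rank deficient, while its Tikhonov-regularized counterpart has full rank.
## simulate n = 10 observations of p = 20 variables
set.seed(1)
X     <- matrix(rnorm(10 * 20), nrow = 10, ncol = 20)
S     <- cov(X) * (nrow(X) - 1)       # (n-1)-scaled scatter matrix
alpha <- 1
qr(S)$rank                            # at most n - 1 = 9, so S is singular
qr(S + alpha * diag(ncol(X)))$rank    # full rank p = 20 for any alpha > 0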
a named list containing
an \((n\times ndim)\) matrix whose rows are embedded observations.
a list containing information for out-of-sample prediction.
a \((p\times ndim)\) matrix whose columns are basis for projection; a short usage sketch follows.
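A minimal usage sketch (not from the documentation) of the returned pieces; it assumes the package providing do.rlda is attached and that the embedding and the basis are exposed as $Y and $projection, respectively.
## fit RLDA on the full iris data
data(iris)
X     <- as.matrix(iris[, 1:4])
label <- as.factor(iris[, 5])
out   <- do.rlda(X, label, ndim = 2, alpha = 1)
dim(out$Y)             # 150 x 2 embedded observations
B <- out$projection    # assumed field name: 4 x 2 basis for the projection
## new observations, preprocessed consistently with X as recorded in the
## out-of-sample information, can then be projected linearly onto the basis
Ynew <- as.matrix(iris[1:5, 1:4]) %*% B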
Friedman JH (1989). “Regularized Discriminant Analysis.” Journal of the American Statistical Association, 84(405), 165.
if (FALSE) {
## use iris data
data(iris)
set.seed(100)
subid = sample(1:150, 50)
X = as.matrix(iris[subid,1:4])
label = as.factor(iris[subid,5])
## try different regularization parameters
out1 <- do.rlda(X, label, alpha=0.001)
out2 <- do.rlda(X, label, alpha=0.01)
out3 <- do.rlda(X, label, alpha=100)
## visualize
opar <- par(no.readonly=TRUE)
par(mfrow=c(1,3))
plot(out1$Y, pch=19, col=label, main="RLDA::alpha=0.001")
plot(out2$Y, pch=19, col=label, main="RLDA::alpha=0.01")
plot(out3$Y, pch=19, col=label, main="RLDA::alpha=100")
par(opar)
}