Binary Gaussian Classification: LDA vs QDA
You are modeling a 2D binary Gaussian classifier with features (x, y):
- Class 0: mean μ0 = (0, 0), covariance Σ0 = [[1, 0], [0, 1]].
- Class 1: mean μ1 = (2, 0), covariance Σ1 = [[4, 0], [0, 1]].
- Priors: π0 = 0.7, π1 = 0.3.
Use natural logarithms throughout; additive constants common to both classes may be dropped from the discriminants.
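As a worked check on the setup, the Gaussian discriminant g_k(x) = −½ ln|Σ_k| − ½ (x − μ_k)ᵀ Σ_k⁻¹ (x − μ_k) + ln π_k can be evaluated numerically. The sketch below (NumPy) does this for both the QDA case and an LDA-style case with a single shared covariance; using the average of Σ0 and Σ1 as that shared covariance is an assumption on my part, since the problem only says to treat the covariances as equal.

```python
import numpy as np

# Problem parameters
mu = [np.array([0.0, 0.0]), np.array([2.0, 0.0])]
Sigma = [np.eye(2), np.diag([4.0, 1.0])]
log_pi = [np.log(0.7), np.log(0.3)]

def qda_score(x, k):
    """g_k(x) = -0.5*ln|Sigma_k| - 0.5*(x-mu_k)^T Sigma_k^{-1} (x-mu_k) + ln pi_k."""
    d = x - mu[k]
    return (-0.5 * np.log(np.linalg.det(Sigma[k]))
            - 0.5 * d @ np.linalg.inv(Sigma[k]) @ d
            + log_pi[k])

# LDA-style score: one shared covariance (here the average of the two --
# an assumed choice). With a shared Sigma, the -0.5*ln|Sigma| term and the
# quadratic term in x are common to both classes, so the rule is linear in x.
Sigma_shared = 0.5 * (Sigma[0] + Sigma[1])

def lda_score(x, k):
    d = x - mu[k]
    return -0.5 * d @ np.linalg.inv(Sigma_shared) @ d + log_pi[k]

x = np.array([1.0, 1.0])
print("QDA scores:", [round(qda_score(x, k), 3) for k in (0, 1)])
print("LDA scores:", [round(lda_score(x, k), 3) for k in (0, 1)])
# At (1, 1) both rules favor class 0: the point sits near mu0 and
# the prior pi0 = 0.7 also pulls toward class 0.
```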
Tasks
- Under the (incorrect) assumption Σ0 = Σ1, write the LDA discriminant and decision rule, and give the resulting linear decision boundary as an explicit equation.
- Using the true covariances, write the QDA discriminant and derive the quadratic decision boundary as an explicit scalar equation in x and y.
- Classify the point (x, y) = (1, 1) under both LDA and QDA by numerically evaluating the discriminants.
- With n = 60 observations per class, discuss when QDA is preferable and when LDA is safer. Propose a regularized QDA that shrinks the class-specific covariances toward a common Σ with a tunable shrinkage parameter, and explain how to select that parameter via cross-validation.
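One possible shape for the regularized QDA in the last task, sketched rather than prescribed: shrink each estimated class covariance toward the pooled estimate, Σ_k(α) = (1 − α)·Σ̂_k + α·Σ̂_pooled with α ∈ [0, 1], and pick α on a grid by k-fold cross-validated accuracy. The grid, fold count, and synthetic data below are illustrative assumptions, not part of the problem.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data matching the problem setup, n = 60 per class.
n = 60
X = np.vstack([
    rng.multivariate_normal([0, 0], [[1, 0], [0, 1]], n),
    rng.multivariate_normal([2, 0], [[4, 0], [0, 1]], n),
])
y = np.array([0] * n + [1] * n)
log_pi = np.log([0.7, 0.3])  # fixed priors from the problem statement

def fit_rqda(X, y, alpha):
    """Estimate means and shrunk covariances Sigma_k(alpha)."""
    mus, covs, counts = [], [], []
    for k in (0, 1):
        Xk = X[y == k]
        mus.append(Xk.mean(axis=0))
        covs.append(np.cov(Xk, rowvar=False))
        counts.append(len(Xk))
    pooled = sum(nk * C for nk, C in zip(counts, covs)) / sum(counts)
    covs = [(1 - alpha) * C + alpha * pooled for C in covs]
    return mus, covs

def predict(X, mus, covs):
    scores = []
    for k in (0, 1):
        d = X - mus[k]
        inv = np.linalg.inv(covs[k])
        maha = np.einsum("ij,jk,ik->i", d, inv, d)  # row-wise Mahalanobis
        scores.append(-0.5 * np.log(np.linalg.det(covs[k]))
                      - 0.5 * maha + log_pi[k])
    return np.argmax(scores, axis=0)

perm = rng.permutation(len(X))  # one fixed shuffle shared by all alphas

def cv_accuracy(alpha, folds=5):
    accs = []
    for test_idx in np.array_split(perm, folds):
        train_idx = np.setdiff1d(perm, test_idx)
        mus, covs = fit_rqda(X[train_idx], y[train_idx], alpha)
        accs.append(np.mean(predict(X[test_idx], mus, covs) == y[test_idx]))
    return float(np.mean(accs))

grid = [0.0, 0.25, 0.5, 0.75, 1.0]
best_alpha = max(grid, key=cv_accuracy)
print("selected alpha:", best_alpha)
```

α = 0 recovers plain QDA and α = 1 recovers LDA, so the cross-validated choice interpolates between the two exactly along the bias-variance trade-off the task asks about.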