Abstract
Functional Principal Component Analysis (FPCA) is a cornerstone technique for dimension
reduction and exploratory analysis of functional data. While classical FPCA produces orthonormal
principal component functions and uncorrelated scores, it can be sensitive to noise, yield overly
complex components, and overlook the need for interpretability in modern, high-dimensional
settings. This thesis develops a unified framework for FPCA that integrates smoothness and
sparsity penalties, extending naturally from univariate curves to multivariate and two-way
functional data.
Starting from the classical low-rank matrix approximation perspective, we incorporate roughness
penalties on principal component functions to enforce smoothness and suppress spurious
high-frequency variation. Sparsity penalties, including lasso and SCAD, are then applied to
highlight the most informative regions of the domain and to set negligible loadings to zero,
improving interpretability and reducing effective dimensionality. The framework is extended to
multivariate FPCA via a penalized singular value decomposition (SVD) formulation, employing
block-diagonal roughness matrices and joint ℓ1-type penalties across multiple functional variables
to extract coherent modes of joint variation.
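As a concrete sketch of the rank-one building block (notation ours, not necessarily the thesis's: X is the n × p matrix of discretized curves, u the score vector, v the loading vector), the penalized criterion takes the form
\[
\min_{\mathbf{u},\,\mathbf{v}} \;\|\mathbf{X}-\mathbf{u}\mathbf{v}^{\top}\|_F^2 \;+\; \lambda_s\,\mathbf{v}^{\top}\boldsymbol{\Omega}\,\mathbf{v} \;+\; \sum_{j=1}^{p} P_{\lambda_1}\!\big(|v_j|\big),
\]
where \(\boldsymbol{\Omega}\) is a roughness matrix (e.g., a squared-second-difference operator), \(\lambda_s\) controls smoothness, and \(P_{\lambda_1}\) is a lasso or SCAD penalty controlling sparsity. In the multivariate case, \(\mathbf{v}\) stacks the loadings of the M functional variables, \(\boldsymbol{\Omega}=\mathrm{blockdiag}(\boldsymbol{\Omega}_1,\dots,\boldsymbol{\Omega}_M)\), and the \(\ell_1\)-type penalty is applied jointly across blocks; later components are obtained by deflating X.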
To address two-way functional data, where both the row and column dimensions exhibit smooth
structure, we propose a novel two-way penalized SVD that imposes smoothness and sparsity
simultaneously on the left (score) and right (loading) singular vectors. Efficient parameter tuning
strategies are developed using conditional generalized cross-validation (GCV) and K-fold
cross-validation (CV), with and without the one-standard-error rule, enabling robust selection of
multiple smoothing and sparsity parameters in high-dimensional applications. We also adapt the
definition of variance explained to account for non-orthogonality introduced by regularization,
ensuring accurate assessment of component importance.
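In the same illustrative notation, a plausible form of the two-way criterion (our sketch, assuming the row and column penalties enter additively as in the one-way case) is
\[
\min_{\mathbf{u},\,\mathbf{v}} \;\|\mathbf{X}-\mathbf{u}\mathbf{v}^{\top}\|_F^2
+\lambda_u\,\mathbf{u}^{\top}\boldsymbol{\Omega}_u\,\mathbf{u}
+\lambda_v\,\mathbf{v}^{\top}\boldsymbol{\Omega}_v\,\mathbf{v}
+\sum_i P_{\gamma_u}\!\big(|u_i|\big)
+\sum_j P_{\gamma_v}\!\big(|v_j|\big),
\]
so that four tuning parameters \((\lambda_u,\lambda_v,\gamma_u,\gamma_v)\) must be chosen jointly, for instance by conditional GCV that updates one parameter while holding the others fixed, or by K-fold CV with or without the one-standard-error rule.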
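For the adjusted variance explained, one standard device for non-orthogonal loadings (stated here as an illustrative assumption, in the spirit of the adjusted total variance used in sparse PCA, not necessarily the thesis's exact definition) projects X onto the span of the first k loading vectors \(\mathbf{V}_k=[\mathbf{v}_1,\dots,\mathbf{v}_k]\):
\[
\mathrm{CAV}(k) \;=\; \frac{\operatorname{tr}\!\big(\mathbf{X}\mathbf{V}_k(\mathbf{V}_k^{\top}\mathbf{V}_k)^{-1}\mathbf{V}_k^{\top}\mathbf{X}^{\top}\big)}{\|\mathbf{X}\|_F^2},
\]
with the share attributed to component k given by \(\mathrm{CAV}(k)-\mathrm{CAV}(k-1)\); this reduces to the usual proportion of variance when the loadings are orthonormal.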
The proposed met