In many real-world applications, including data mining, bioinformatics, econometrics, and machine learning, analyzing high-dimensional data with strongly correlated explanatory variables poses challenges such as multicollinearity, unstable coefficient estimates, and reduced generalization of classical regression models. Under such conditions, regularization and feature selection methods play a crucial role in controlling model complexity, reducing estimator variance, and improving predictive accuracy and interpretability. Regularized linear models such as Ridge regression, the LASSO, and the Elastic Net have therefore received considerable attention as effective tools for high-dimensional data. This thesis examines classical and regularized linear models from theoretical, statistical, and computational perspectives and compares their performance under high dimensionality and correlated predictors. It then proposes an extension of the Elastic Net regression framework aimed at simultaneously improving feature selection, coefficient stability, and predictive accuracy. The proposed model is evaluated on several simulated and real high-dimensional datasets; the results show competitive performance in most cases and a clear advantage over classical methods and the standard Elastic Net in many scenarios.
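To make the contrast concrete, the following is a minimal illustrative sketch (not the thesis's proposed model) of the standard estimators discussed above, in plain NumPy: a closed-form Ridge estimate and a naive coordinate-descent Elastic Net, applied to simulated data with p > n and correlated predictors. All parameter values (sample size, regularization strengths, the mixing parameter alpha) are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate n = 50 samples with p = 100 predictors (high-dimensional: p > n).
# Columns share a common latent factor z, inducing strong correlation.
n, p = 50, 100
z = rng.standard_normal((n, 1))
X = 0.7 * z + 0.3 * rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:5] = 2.0                      # sparse true signal: only 5 active features
y = X @ beta_true + 0.5 * rng.standard_normal(n)

def ridge(X, y, lam):
    """Closed-form Ridge estimate: (X'X + lam * I)^{-1} X'y."""
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

def elastic_net(X, y, lam, alpha, n_iter=200):
    """Naive coordinate descent for the Elastic Net objective
    (1/2n)||y - Xb||^2 + lam * (alpha * ||b||_1 + (1 - alpha)/2 * ||b||^2)."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual excluding feature j.
            r_j = y - X @ b + X[:, j] * b[j]
            rho = X[:, j] @ r_j / n
            # Soft-thresholding (L1 part) plus extra shrinkage (L2 part).
            b[j] = (np.sign(rho) * max(abs(rho) - lam * alpha, 0.0)
                    / (col_sq[j] + lam * (1.0 - alpha)))
    return b

b_ridge = ridge(X, y, lam=1.0)
b_enet = elastic_net(X, y, lam=0.5, alpha=0.5)

# Ridge shrinks coefficients but keeps all p of them nonzero;
# the L1 component of the Elastic Net sets many coefficients exactly to zero.
print("ridge nonzeros:", int(np.sum(np.abs(b_ridge) > 1e-8)))
print("enet  nonzeros:", int(np.sum(np.abs(b_enet) > 1e-8)))
```

The sketch shows the behavioral difference the abstract refers to: Ridge stabilizes estimates under multicollinearity without performing feature selection, while the Elastic Net combines shrinkage with sparsity, which is the property the proposed extension builds on.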