The NFDA-Nonsmooth Feasible Directions Algorithm applied to construction of Pareto Fronts of Ridge and Lasso Regressions

Authors

W. P. Freire

DOI:

https://doi.org/10.5540/tcam.2024.025.e01767

Keywords:

Ridge Regression. Lasso Regression. Multiobjective Optimization. Pareto Front.

Abstract

Ridge and Lasso regressions are regularized forms of linear regression, a widely used machine learning tool for data analysis. Based on multiobjective optimization theory, we recast Ridge and Lasso regression as bi-objective optimization problems. The Pareto fronts of the resulting problems provide a range of regression models from which the best one can be selected. We employ the NFDA (Nonsmooth Feasible Directions Algorithm), devised for solving nonsmooth convex optimization problems, to construct the Pareto fronts of Ridge and Lasso regarded as bi-objective problems.
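As an illustration of the bi-objective viewpoint, each front can be traced approximately by sweeping the regularization parameter of the penalized formulations, which act as weighted-sum scalarizations of the two objectives (squared residual and penalty norm). The Python sketch below does this with scikit-learn's Ridge and Lasso solvers; it is not the NFDA procedure of the paper, and the synthetic data and the alpha grid are illustrative assumptions.

import numpy as np
from sklearn.linear_model import Ridge, Lasso

# Synthetic data (illustrative, not from the paper).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))
beta_true = np.array([3.0, -2.0, 0.0, 0.0, 1.5, 0.0, 0.0, 0.0, 0.0, 0.5])
y = X @ beta_true + 0.5 * rng.normal(size=100)

alphas = np.logspace(-3, 3, 50)  # regularization weights to sweep

ridge_front, lasso_front = [], []
for alpha in alphas:
    b_r = Ridge(alpha=alpha, fit_intercept=False).fit(X, y).coef_
    b_l = Lasso(alpha=alpha, fit_intercept=False, max_iter=10000).fit(X, y).coef_
    # Record the two objectives: squared residual and penalty norm.
    ridge_front.append((np.sum((y - X @ b_r) ** 2), np.sum(b_r ** 2)))
    lasso_front.append((np.sum((y - X @ b_l) ** 2), np.sum(np.abs(b_l))))

# Each (residual, penalty) pair approximates a point on the corresponding
# Pareto front; plotting one objective against the other shows the trade-off
# from which a regression model can be selected.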

References

T. Hastie, R. Tibshirani, and J. Friedman, The Elements of Statistical Learning. Data Mining, Inference and Prediction. Springer, 2008.

G. James, D. Witten, T. Hastie, and R. Tibshirani, An Introduction to Statistical Learning. With Applications in R. Springer, 2013.

G. Golub and C. Van Loan, Matrix Computations. Johns Hopkins University Press, 1983.

C. Bishop, Pattern Recognition and Machine Learning. Springer, New York, 2006.

J. Copas, “Regression, prediction and shrinkage,” Journal of the Royal Statistical Society, Series B, Methodological, vol. 45, no. 3, pp. 311–354, 1983.

T. Hastie, R. Tibshirani, and M. Wainwright, Statistical Learning with Sparsity. The Lasso and Generalizations. CRC Press, 2016.

S. Arlot and A. Celisse, “A survey of cross-validation procedures for model selection,” Statistics Surveys, vol. 4, pp. 40–79, 2010.

Y. Jung, “Multiple prediction k-fold cross-validation for model selection,” Journal of Nonparametric Statistics, vol. 30, no. 1, pp. 197–215, 2018.

V. Cherkassky and Y. Ma, “Comparison of model selection for regression,” Neural Computation, vol. 15, no. 7, pp. 1691–1714, 2003.

K. Bennett and E. Parrado-Hernandez, “The interplay of optimization and machine learning research,” Journal of Machine Learning Research, vol. 7, pp. 1265–1281, 2006.

T. Suttorp and C. Igel, “Multi-objective optimization of support vector machines,” Studies in Computational Intelligence, vol. 16, pp. 199–220, 2006.

Y. Jin and B. Sendhoff, “Pareto-based multiobjective machine learning: An overview and case studies,” IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, vol. 38, no. 3, pp. 397–415, 2008.

M. Raimundo, Multi-Objective Optimization in Machine Learning. PhD thesis, State University of Campinas, 2018.

H. Charkhgard and A. Eshragh, “A new approach to select the best subset of predictors in linear regression modelling: Bi-objective mixed integer linear programming,” The Australian and New Zealand Industrial and Applied Mathematics Journal, vol. 61, no. 1, pp. 64–75, 2019.

T. Hastie, J. Taylor, R. Tibshirani, and G. Walther, “Forward stagewise regression and the monotone lasso,” Electronic Journal of Statistics, vol. 1, pp. 1–29, 2007.

M. Osborne, B. Presnell, and B. Turlach, “On the lasso and its dual,” Journal of Computational and Graphical Statistics, vol. 9, no. 2, pp. 319–337, 2000.

J.-B. Hiriart-Urruty and C. Lemaréchal, Convex Analysis and Minimization Algorithms I, II. Springer, 1993.

M. Ehrgott, Multicriteria Optimization. Springer, 2005.

K. Miettinen, Nonlinear Multiobjective Optimization. Springer Science + Business Media, LLC, 1998.

J. Jahn, Vector Optimization. Theory, Applications and Extensions. Springer, 2011.

J. Dutta and C. Kaya, “A new scalarization and numerical method for constructing the weak Pareto front of multiobjective optimization problems,” Optimization, vol. 60, no. 8-9, pp. 1091–1104, 2011.

R. Burachik, C. Kaya, and M. Rizvi, “A new scalarization technique and new algorithms to generate Pareto fronts,” SIAM Journal on Optimization, vol. 27, no. 2, pp. 1010–1034, 2017.

P. Pardalos, A. Zilinskas, and J. Zilinskas, Non-Convex Multi-Objective Optimization. Springer, 2017.

K. Kiwiel, Methods of Descent for Nondifferentiable Optimization. Springer, 1985.

M. Mäkelä and P. Neittaanmäki, Nonsmooth Optimization. Analysis and Algorithms with Applications to Optimal Control. World Scientific, 1992.

W. Freire, A Feasible Directions Algorithm for Convex Nondifferentiable Optimization. PhD thesis, Federal University of Rio de Janeiro, 2005.

J. Herskovits, W. Freire, M. Tanaka, and A. Canelas, “A feasible directions method for nonsmooth convex optimization,” Structural and Multidisciplinary Optimization, vol. 44, no. 3, pp. 363–377, 2011.

J. Herskovits, “Feasible directions interior point technique for nonlinear optimization,” Journal of Optimization Theory and Applications, vol. 99, no. 1, pp. 121–146, 1998.

Published

2024-11-28

How to Cite

Freire, W. P. (2024). The NFDA-Nonsmooth Feasible Directions Algorithm applied to construction of Pareto Fronts of Ridge and Lasso Regressions. Trends in Computational and Applied Mathematics, 25(1), e01767. https://doi.org/10.5540/tcam.2024.025.e01767

Issue

Section

Original Article