Journal of Data Science

Rescale Hinge Loss Support Vector Data Description
Volume 23, Issue 2 (2025): Special Issue: the 2024 Symposium on Data Science and Statistics (SDSS), pp. 287–311
Edgard M. Maboudou-Tchao, Emil Agbemade, Jongik Chung
https://doi.org/10.6339/25-JDS1185
Pub. online: 23 April 2025      Type: Statistical Data Science      Open Access

Received
17 August 2024
Accepted
7 April 2025
Published
23 April 2025

Abstract

Support vector data description (SVDD) has drawn significant attention for its strong performance in one-class classification and novelty detection tasks. Nevertheless, standard SVDD assigns the same weight to every slack variable during model fitting, which can degrade learning performance when the training data contain erroneous observations or outliers. In this study, an extended SVDD model, Rescale Hinge Loss Support Vector Data Description (RSVDD), is introduced to strengthen the resistance of SVDD to anomalies. This is achieved by redefining the original SVDD optimization problem with a rescaled hinge loss function. Because this loss function increases the influence of samples that are likely to belong to the target class while decreasing the influence of samples that are likely to be anomalies, RSVDD can be viewed as a variant of weighted SVDD. To solve the resulting optimization problem efficiently, the half-quadratic optimization method is used to derive an iterative optimization algorithm. Experimental results on synthetic and breast cancer data sets illustrate the proposed method's performance advantage over existing methods in the settings considered.
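The abstract's core mechanism, a bounded rescaling of the hinge loss whose half-quadratic reformulation yields per-sample weights that decay with slack, can be sketched as follows. This is a minimal input-space illustration, not the kernelized dual the paper actually optimizes: the loss form β(1 − exp(−η·hinge)) follows Xu et al. (2017), while the quantile-based radius surrogate, the η values, and the toy data are assumptions made purely for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

def hinge(u):
    return np.maximum(0.0, u)

def rescaled_hinge(u, eta=0.5):
    # Bounded transform of the hinge loss (Xu et al., 2017):
    # beta * (1 - exp(-eta * hinge(u))), with beta = 1 / (1 - exp(-eta)),
    # so the rescaled loss equals 1 where the plain hinge loss equals 1.
    beta = 1.0 / (1.0 - np.exp(-eta))
    return beta * (1.0 - np.exp(-eta * hinge(u)))

def hq_weights(slack, eta=1.0):
    # Half-quadratic auxiliary variables: each sample's weight decays
    # exponentially in its slack, so likely outliers lose influence.
    return np.exp(-eta * slack)

# Toy data: 60 inliers near the origin plus 3 distant outliers.
X = np.vstack([rng.normal(0.0, 1.0, size=(60, 2)),
               rng.normal(8.0, 0.5, size=(3, 2))])
w = np.ones(len(X))

# Schematic reweighting loop: refit a weighted "description" of the data,
# compute slacks, then update the half-quadratic weights.
for _ in range(10):
    center = (w[:, None] * X).sum(axis=0) / w.sum()
    dist = np.linalg.norm(X - center, axis=1)
    radius = np.quantile(dist, 0.9)        # crude radius surrogate
    slack = hinge(dist - radius)
    w = hq_weights(slack, eta=1.0)

print(w[:60].mean(), w[60:].mean())  # outliers end with near-zero weight
```

The alternation mirrors the half-quadratic scheme described in the abstract: with weights fixed, the inner problem is an ordinary weighted fit; with the fit fixed, the weight update has the closed form exp(−η·slack).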

Supplementary material

We have provided all the supplementary materials necessary to successfully reproduce this work, including the simulation data, corresponding code, and illustrative examples.



Copyright
2025 The Author(s). Published by the School of Statistics and the Center for Applied Statistics, Renmin University of China.
Open access article under the CC BY license.

Keywords
anomaly detection; correntropy loss function; hinge loss function; robust one-class classification; rescaled hinge loss function

Funding
This work is partially supported by Microsoft.


  • Online ISSN: 1683-8602
  • Print ISSN: 1680-743X