Applications of Mathematics, Vol. 68, No. 5, pp. 623-642, 2023


A tight bound of modified iterative hard thresholding algorithm for compressed sensing

Jinyao Ma, Haibin Zhang, Shanshan Yang, Jiaojiao Jiang

Received September 13, 2022. Published online March 21, 2023.

Abstract:  We provide a theoretical study of the iterative hard thresholding with partially known support set (IHT-PKS) algorithm when used to solve the compressed sensing recovery problem. Recent work has shown that IHT-PKS performs better than the traditional IHT in reconstructing sparse or compressible signals, but little work has analyzed its performance guarantees. In this paper, we improve the current RIP-based bound of the IHT-PKS algorithm from $\delta_{3s-2k}<\frac{1}{\sqrt{32}}\approx 0.1768$ to $\delta_{3s-2k}<\frac{\sqrt{5}-1}{4}\approx 0.309$, where $\delta_{3s-2k}$ is the restricted isometry constant of the measurement matrix. We also present conditions for stable reconstruction using the ${\rm IHT}^{\mu}$-PKS algorithm, a generalized form of IHT-PKS. We further apply the algorithm to the least squares support vector machine (LS-SVM), one of the most popular tools for regression and classification learning, which however suffers from a loss of sparsity. After deriving a sparse representation of the LS-SVM through compressed sensing, we exploit the support of the bias term in the LS-SVM model with the ${\rm IHT}^{\mu}$-PKS algorithm. Experimental results on classification problems show that ${\rm IHT}^{\mu}$-PKS outperforms other approaches to computing the sparse LS-SVM classifier.
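
For readers unfamiliar with the recovery scheme studied here, the following is a minimal sketch (not the authors' implementation) of the IHT-PKS iteration as described in the abstract and in reference [9]: a gradient step on the least-squares data-fidelity term, followed by a hard-thresholding step that always retains the k indices of the partially known support T and keeps the s-k largest remaining entries. The function name iht_pks, the step size mu and the iteration count are illustrative choices only.

    import numpy as np

    def iht_pks(A, y, s, T, mu=1.0, n_iter=200):
        """Sketch of IHT with partially known support T (|T| = k <= s)."""
        m, n = A.shape
        x = np.zeros(n)
        T = np.asarray(T, dtype=int)
        for _ in range(n_iter):
            # Gradient step on the least-squares data-fidelity term
            z = x + mu * A.T @ (y - A @ x)
            # Always keep the k known-support entries, then the s - k
            # largest (in magnitude) entries outside T
            mask = np.ones(n, dtype=bool)
            mask[T] = False
            free = np.flatnonzero(mask)
            keep_free = free[np.argsort(np.abs(z[free]))[::-1][: s - len(T)]]
            support = np.concatenate([T, keep_free])
            x_new = np.zeros(n)
            x_new[support] = z[support]
            x = x_new
        return x

With mu = 1 the sketch reduces to the plain IHT-PKS update; a general step size mu corresponds to the ${\rm IHT}^{\mu}$-PKS variant discussed in the paper.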
Keywords:  iterative hard thresholding; signal reconstruction; classification problem; least squares support vector machine
Classification MSC:  34B16, 34C25, 90C31


References:
[1] K. Axiotis, M. Sviridenko: Sparse convex optimization via adaptively regularized hard thresholding. J. Mach. Learn. Res. 22 (2021), Article ID 122, 47 pages. MR 4279773 | Zbl 07370639
[2] R. G. Baraniuk, V. Cevher, M. F. Duarte, C. Hegde: Model-based compressive sensing. IEEE Trans. Inf. Theory 56 (2010), 1982-2001. DOI 10.1109/TIT.2010.2040894 | MR 2654489 | Zbl 1366.94215
[3] A. Besson, D. Perdios, Y. Wiaux, J.-P. Thiran: Joint sparsity with partially known support and application to ultrasound imaging. IEEE Signal Process. Lett. 26 (2019), 84-88. DOI 10.1109/LSP.2018.2880571
[4] T. Blumensath, M. E. Davies: Iterative thresholding for sparse approximations. J. Fourier Anal. Appl. 14 (2008), 629-654. DOI 10.1007/s00041-008-9035-z | MR 2461601 | Zbl 1175.94060
[5] T. Blumensath, M. E. Davies: Iterative hard thresholding for compressed sensing. Appl. Comput. Harmon. Anal. 27 (2009), 265-274. DOI 10.1016/j.acha.2009.04.002 | MR 2559726 | Zbl 1174.94008
[6] T. Blumensath, M. E. Davies: Normalized iterative hard thresholding: Guaranteed stability and performance. IEEE J. Sel. Top. Signal Process. 4 (2010), 298-309. DOI 10.1109/JSTSP.2010.2042411
[7] E. J. Candès, T. Tao: Decoding by linear programming. IEEE Trans. Inf. Theory 51 (2005), 4203-4215. DOI 10.1109/TIT.2005.858979 | MR 2243152 | Zbl 1264.94121
[8] R. E. Carrillo, L. F. Polania, K. E. Barner: Iterative algorithms for compressed sensing with partially known support. IEEE International Conference on Acoustics, Speech and Signal Processing. IEEE, Los Alamitos (2010), 3654-3657. DOI 10.1109/ICASSP.2010.5495901
[9] R. E. Carrillo, L. F. Polania, K. E. Barner: Iterative hard thresholding for compressed sensing with partially known support. IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, Los Alamitos (2011), 4028-4031. DOI 10.1109/ICASSP.2011.5947236
[10] S. S. Chen, D. L. Donoho, M. A. Saunders: Atomic decomposition by basis pursuit. SIAM Rev. 43 (2001), 129-159. DOI 10.1137/S003614450037906X | MR 1854649 | Zbl 0979.94010
[11] K. Cui, Z. Song, N. Han: Fast thresholding algorithms with feedbacks and partially known support for compressed sensing. Asia-Pac. J. Oper. Res. 37 (2020), Article ID 2050013, 20 pages. DOI 10.1142/S021759592050013X | MR 4107957 | Zbl 1457.94037
[12] S. Foucart: A note on guaranteed sparse recovery via $\ell_1$-minimization. Appl. Comput. Harmon. Anal. 29 (2010), 97-103. DOI 10.1016/j.acha.2009.10.004 | MR 2647014 | Zbl 1198.41011
[13] S. Foucart: Hard thresholding pursuit: An algorithm for compressive sensing. SIAM J. Numer. Anal. 49 (2011), 2543-2563. DOI 10.1137/100806278 | MR 2873246 | Zbl 1242.65060
[14] S. Foucart, H. Rauhut: A Mathematical Introduction to Compressive Sensing. Applied and Numerical Harmonic Analysis. Birkhäuser, New York (2013). DOI 10.1007/978-0-8176-4948-7 | MR 3100033 | Zbl 1315.94002
[15] L. Jacques: A short note on compressed sensing with partially known signal support. Signal Process. 90 (2010), 3308-3312. DOI 10.1016/j.sigpro.2010.05.025 | Zbl 1197.94063
[16] M. A. Khajehnejad, W. Xu, A. S. Avestimehr, B. Hassibi: Weighted $\ell_1$ minimization for sparse recovery with prior information. IEEE International Symposium on Information Theory. IEEE, Los Alamitos (2009), 483-487. DOI 10.1109/ISIT.2009.5205716
[17] S. Khera, S. Singh: Estimation of channel for millimeter-wave hybrid massive MIMO systems using Orthogonal Matching Pursuit (OMP). J. Phys., Conf. Ser. 2327 (2022), Article ID 012040, 9 pages. DOI 10.1088/1742-6596/2327/1/012040
[18] Y. Li, C. Lin, W. Zhang: Improved sparse least-squares support vector machine classifiers. Neurocomputing 69 (2006), 1655-1658. DOI 10.1016/j.neucom.2006.03.001
[19] H. Liu, R. Foygel Barber: Between hard and soft thresholding: Optimal iterative thresholding algorithms. Inf. Inference 9 (2020), 899-933. DOI 10.1093/imaiai/iaz027 | MR 4188230 | Zbl 1473.90162
[20] R. Mall, J. A. K. Suykens: Very sparse LSSVM reductions for large-scale data. IEEE Trans. Neural Netw. Learn. Syst. 26 (2015), 1086-1097. DOI 10.1109/TNNLS.2014.2333879 | MR 3454265
[21] Y.-H. Shao, C.-N. Li, M.-Z. Liu, Z. Wang, N.-Y. Deng: Sparse $L_q$-norm least squares support vector machine with feature selection. Pattern Recognition 78 (2018), 167-181. DOI 10.1016/j.patcog.2018.01.016
[22] J. Shen, P. Li: A tight bound of hard thresholding. J. Mach. Learn. Res. 18 (2018), Article ID 208, 42 pages. MR 3827096 | Zbl 1473.62287
[23] J. A. K. Suykens, L. Lukas, J. Vandewalle: Sparse approximation using least squares support vector machines. IEEE International Symposium on Circuits and Systems (ISCAS). Vol. 2. IEEE, Los Alamitos (2000), 757-760. DOI 10.1109/ISCAS.2000.856439
[24] N. Vaswani, W. Lu: Modified-CS: Modifying compressive sensing for problems with partially known support. IEEE Trans. Signal Process. 58 (2010), 4595-4607. DOI 10.1109/TSP.2010.2051150 | MR 2752724 | Zbl 1392.94045
[25] R. von Borries, C. J. Miosso, C. Potes: Compressed sensing using prior information. International Workshop on Computational Advances in Multi-Sensor Adaptive Processing. IEEE, Los Alamitos (2007), 121-124. DOI 10.1109/CAMSAP.2007.4497980
[26] X.-Z. Wang, H.-J. Xing, Y. Li, Q. Hua, C.-R. Dong, W. Pedrycz: A study on relationship between generalization abilities and fuzziness of base classifiers in ensemble learning. IEEE Trans. Fuzzy Syst. 23 (2015), 1638-1654. DOI 10.1109/TFUZZ.2014.2371479
[27] J. Yang, A. Bouzerdoum, S. L. Phung: A training algorithm for sparse LS-SVM using compressive sampling. IEEE International Conference on Acoustics, Speech and Signal Processing. IEEE, Los Alamitos (2010), 2054-2057. DOI 10.1109/ICASSP.2010.5495015
[28] J. Yang, J. Ma: A sparsity-based training algorithm for Least Squares SVM. IEEE Symposium on Computational Intelligence and Data Mining (CIDM). IEEE, Los Alamitos (2014), 345-350. DOI 10.1109/CIDM.2014.7008688
[29] M. Yu, V. Gupta, M. Kolar: Recovery of simultaneous low rank and two-way sparse coefficient matrices, a nonconvex approach. Electron. J. Stat. 14 (2020), 413-457. DOI 10.1214/19-EJS1658 | MR 4054252 | Zbl 1434.90161
[30] X. Zhang, W. Xu, Y. Cui, L. Lu, J. Lin: On recovery of block sparse signals via block compressive sampling matching pursuit. IEEE Access 7 (2019), 175554-175563. DOI 10.1109/ACCESS.2019.2955759
[31] Y.-B. Zhao, Z.-Q. Luo: Improved RIP-based bounds for guaranteed performance of two compressed sensing algorithms. Available at https://arxiv.org/abs/2007.01451 (2020), 10 pages.

Affiliations:   Jinyao Ma, Haibin Zhang, Shanshan Yang, Beijing Institute for Scientific and Engineering Computing, Faculty of Science, Beijing University of Technology, No. 100 Ping Le Yuan, Chaoyang District, Beijing 100124, P. R. China, e-mail: 17377341661@163.com, zhanghaibin@bjut.edu.cn, 17865814707@163.com; Jiaojiao Jiang (corresponding author), School of Computer Science and Engineering, University of New South Wales, High St, Sydney NSW 2052, Australia, e-mail: jiaojiao.jiang@unsw.edu.au
