
Sparse Optimization Models and Regularization Methods

https://mayhhu.github.io/ch/pdf/2018_SOM&RM_LWH.pdf
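The linked notes (in Chinese) cover sparse optimization models and the regularization methods used to solve them. As a rough illustration of the kind of method treated there and analyzed in [3], [10], and [35], the following minimal Python sketch implements the iterative soft-thresholding algorithm (ISTA) for the l1-regularized least-squares (LASSO) model min_x (1/2)||Ax - b||^2 + lam*||x||_1; it is not taken from the notes, and all names (soft_threshold, ista, A, b, lam) are illustrative.

import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t*||.||_1: componentwise shrinkage toward zero.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def ista(A, b, lam, num_iters=500):
    # Constant step size 1/L, where L bounds the largest eigenvalue of A^T A
    # (the Lipschitz constant of the gradient of the smooth term).
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(num_iters):
        grad = A.T @ (A @ x - b)                   # gradient of (1/2)||Ax - b||^2
        x = soft_threshold(x - grad / L, lam / L)  # proximal (shrinkage) step
    return x

# Tiny synthetic sparse-recovery instance, for illustration only.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[:5] = 1.0
b = A @ x_true
x_hat = ista(A, b, lam=0.1)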

References

[1] F. Bach, R. Jenatton, J. Mairal, and G. Obozinski. Structured sparsity through convex optimization. Statistical Science, 27(4):450–468, 2012.

[2] J. Barzilai and J. M. Borwein. Two-point step size gradient methods. IMA Journal of Numerical Analysis, 8(1):141–148, 1988.

[3] A. Beck and M. Teboulle. A fast iterative shrinkage-thresholding algorithm for linear inverse problems. SIAM Journal on Imaging Sciences, 2(1):183–202, 2009.

[4] P. J. Bickel, Y. Ritov, and A. B. Tsybakov. Simultaneous analysis of Lasso and Dantzig selector. Annals of Statistics, 37(4):1705–1732, 2009.

[5] T. Blumensath and M. E. Davies. Iterative thresholding for sparse approximations. Journal of Fourier Analysis and Applications, 14:629–654, 2008.

[6] T. T. Cai and L. Wang. Orthogonal matching pursuit for sparse signal recovery with noise. IEEE Transactions on Information Theory, 57:4680–4688, 2011.

[7] E. J. Candès and T. Tao. Decoding by linear programming. IEEE Transactions on Information Theory, 51(12):4203–4215, 2005.

[8] R. Chartrand and V. Staneva. Restricted isometry properties and nonconvex compressive sensing. Inverse Problems, 24(3):035020, 2008.

[9] S. S. Chen, D. L. Donoho, and M. A. Saunders. Atomic decomposition by basis pursuit. SIAM Review, 43:129–159, 2001.

[10] I. Daubechies, M. Defrise, and C. De Mol. An iterative thresholding algorithm for linear inverse problems with a sparsity constraint. Communications on Pure and Applied Mathematics, 57:1413–1457, 2004.

[11] B. Deplancke and N. Gheldof. Gene Regulatory Networks: Methods and Protocols. Springer, Berlin, 2012.

[12] D. L. Donoho. Compressed sensing. IEEE Transactions on Information Theory, 52(8):1289–1306, 2006.

[13] B. Efron, T. Hastie, I. Johnstone, and R. Tibshirani. Least angle regression. Annals of Statistics, 32:407–499, 2004.

[14] M. Elad. Sparse and Redundant Representations. Springer, New York, 2010.

[15] J. Fan and R. Li. Variable selection via nonconcave penalized likelihood and its oracle properties. Journal of the American Statistical Association, 96(456):1348–1360, 2001.

[16] S. A. van de Geer and P. Bühlmann. On the conditions used to prove oracle results for the Lasso. Electronic Journal of Statistics, 3:1360–1392, 2009.

[17] D. Han and X. Yuan. Local linear convergence of the alternating direction method of multipliers for quadratic programs. SIAM Journal on Numerical Analysis, 51(6):3446–3457, 2013.

[18] Y. Hu, C. Li, K. Meng, J. Qin, and X. Yang. Group sparse optimization via lp,q regularization. Journal of Machine Learning Research, 18(30):1–52, 2017.

[19] T. Lin, S. Ma, and S. Zhang. On the global linear convergence of the ADMM with multiblock variables. SIAM Journal on Optimization, 25(3):1478–1497, 2015.

[20] B. Natarajan. Sparse approximate solutions to linear systems. SIAM Journal on Computing, 24(2):227–234, 1995.

[21] D. Needell and J. Tropp. CoSaMP: Iterative signal recovery from incomplete and inaccurate samples. Applied and Computational Harmonic Analysis, 26(3):301–321, 2009.

[22] H. Nyquist. Certain topics in telegraph transmission theory. Transactions of the American Institute of Electrical Engineers, 47:617–644, 1928.

[23] U. Platt and J. Stutz. Differential Optical Absorption Spectroscopy: Principles and Applications. Springer, Berlin, 2008.

[24] J. Qin, Y. Hu, F. Xu, H. K. Yalamanchili, and J. Wang. Inferring gene regulatory networks by integrating ChIP-seq/chip and transcriptome data via LASSO-type regularization methods. Methods, 67(3):294–303, 2014.

[25] G. Raskutti, M. J. Wainwright, and B. Yu. Restricted eigenvalue properties for correlated Gaussian designs. Journal of Machine Learning Research, 11:2241–2259, 2010.

[26] B. Recht, M. Fazel, and P. A. Parrilo. Guaranteed minimum-rank solutions of linear matrix equations via nuclear norm minimization. SIAM Review, 52(3):471–501, 2010.

[27] S. Tao, D. Boley, and S. Zhang. Local linear convergence of ISTA and FISTA on the LASSO problem. SIAM Journal on Optimization, 26(1):313–336, 2016.

[28] R. Tibshirani. Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society, Series B, 58(1):267–288, 1996.

[29] J. A. Tropp. Greed is good: Algorithmic results for sparse approximation. IEEE Transactions on Information Theory, 50(10):2231–2242, 2004.

[30] E. van den Berg and M. P. Friedlander. Probing the Pareto frontier for basis pursuit solutions. SIAM Journal on Scientific Computing, 31(2):890–912, 2008.

[31] J. Wang, Y. Hu, C. Li, and J.-C. Yao. Linear convergence of CQ algorithms and applications in gene regulatory network inference. Inverse Problems, 33(5):055017, 2017.

[32] J. Wright, A. Ganesh, S. Rao, and Y. Ma. Robust principal component analysis: Exact recovery of corrupted low-rank matrices by convex optimization. NIPS, 381:2080–2088, 2009.

[33] Z. Xu, X. Chang, F. Xu, and H. Zhang. L1/2 regularization: A thresholding representation theory and a fast solver. IEEE Transactions on Neural Networks and Learning Systems, 23:1013–1027, 2012.

[34] J. Yang and Y. Zhang. Alternating direction algorithms for l1-problems in compressive sensing. SIAM Journal on Scientific Computing, 33(1):250–278, 2011.

[35] L. Zhang, Y. Hu, C. Li, and J.-C. Yao. A new linear convergence result for the iterative soft thresholding algorithm. Optimization, 66(7):1177–1189, 2017.

[36] T. Zhang. Adaptive forward-backward greedy algorithm for learning sparse representations. IEEE Transactions on Information Theory, 57(7):4689–4708, 2011.

   

Source: https://www.cnblogs.com/cx2016/p/13787620.html
