Comments on "Welcome to My Sparse Land: Zero, one, and infinity in compressed sensing"

Kitakyushu Fishing (2012-05-07 13:22):
Igor,

Thanks a lot for linking the nice PDFs.

About p. 4 of the tutorial slides: I remember Volkan gave a lecture at my university, Kyoto University, just after ICASSP 2012. It was a nice lecture on high-dimensional statistics, different from the one you suggested, but he also used the "<S>parsity" notation (the <S> styled after the Superman trademark). That's why the tutorial slides reminded me of Volkan's lecture.

Best regards,
Masaaki

Igor (2012-05-07 06:23):
Masaaki,

You might also be interested in this paper:

http://ssg.mit.edu/~venkatc/crpw_lip_preprint10.pdf

In the figure "A summary of the recovery bounds obtained using Gaussian width arguments." there is this piece of information: a +/-1 vector is recoverable from a Gaussian matrix with about p/2 rows.

Of related interest:

http://www.lx.it.pt/~mtf/ICASSP%202012%20Tutorial%20[Cevher,%20Figueiredo].pdf

Igor.
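A quick way to see that bound in action (this is a sketch of my own, not something from the paper or from Igor's comment): if I read the CRPW figure correctly, a sign vector x in {-1,+1}^p is recovered by minimizing the l_infinity norm subject to Ax = y, and roughly n = p/2 Gaussian rows suffice. The Python/SciPy snippet below casts that as a linear program; the dimensions p and n, the random seed, and the solver choice are illustrative assumptions.

# Sketch: recover a +/-1 vector from about p/2 Gaussian measurements by
# minimizing ||x||_inf subject to A x = y (the convex program suggested
# by the CRPW Gaussian-width bound). Illustrative parameters only.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
p = 100                                   # ambient dimension
n = p // 2 + 10                           # a bit above the p/2 threshold
x_true = rng.choice([-1.0, 1.0], size=p)  # vertex of the hypercube
A = rng.standard_normal((n, p))
y = A @ x_true

# LP: minimize t  subject to  A x = y  and  -t <= x_i <= t
# Decision variables: z = [x (p entries), t (1 entry)]
c = np.zeros(p + 1)
c[-1] = 1.0
A_eq = np.hstack([A, np.zeros((n, 1))])
A_ub = np.vstack([
    np.hstack([np.eye(p), -np.ones((p, 1))]),    #  x_i - t <= 0
    np.hstack([-np.eye(p), -np.ones((p, 1))]),   # -x_i - t <= 0
])
b_ub = np.zeros(2 * p)
bounds = [(None, None)] * p + [(0, None)]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=y,
              bounds=bounds, method="highs")
x_hat = res.x[:p]
print("max recovery error:", np.max(np.abs(x_hat - x_true)))

With n noticeably above p/2 the printed error should be near machine precision; right at n = p/2 the phase-transition picture suggests recovery succeeds only about half the time, and well below that the recovered vector is typically wrong.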