mastodon.world is one of the many independent Mastodon servers you can use to participate in the fediverse.
Generic Mastodon server for anyone to use.

Server stats: 9K active users
#sparse

JMLR<p>'The Effect of SGD Batch Size on Autoencoder Learning: Sparsity, Sharpness, and Feature Learning', by Nikhil Ghosh, Spencer Frei, Wooseok Ha, Bin Yu.</p><p><a href="http://jmlr.org/papers/v26/23-1022.html" rel="nofollow noopener noreferrer" translate="no" target="_blank"><span class="invisible">http://</span><span class="ellipsis">jmlr.org/papers/v26/23-1022.ht</span><span class="invisible">ml</span></a> <br> <br><a href="https://sigmoid.social/tags/sgd" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>sgd</span></a> <a href="https://sigmoid.social/tags/autoencoder" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>autoencoder</span></a> <a href="https://sigmoid.social/tags/sparse" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>sparse</span></a></p>
JMLR<p>'Extremal graphical modeling with latent variables via convex optimization', by Sebastian Engelke, Armeen Taeb.</p><p><a href="http://jmlr.org/papers/v26/24-0472.html" rel="nofollow noopener noreferrer" translate="no" target="_blank"><span class="invisible">http://</span><span class="ellipsis">jmlr.org/papers/v26/24-0472.ht</span><span class="invisible">ml</span></a> <br> <br><a href="https://sigmoid.social/tags/multivariate" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>multivariate</span></a> <a href="https://sigmoid.social/tags/graphical" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>graphical</span></a> <a href="https://sigmoid.social/tags/sparse" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>sparse</span></a></p>
JMLR<p>'Rank-one Convexification for Sparse Regression', by Alper Atamturk, Andres Gomez.</p><p><a href="http://jmlr.org/papers/v26/19-159.html" rel="nofollow noopener noreferrer" translate="no" target="_blank"><span class="invisible">http://</span><span class="ellipsis">jmlr.org/papers/v26/19-159.htm</span><span class="invisible">l</span></a> <br> <br><a href="https://sigmoid.social/tags/sparse" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>sparse</span></a> <a href="https://sigmoid.social/tags/lasso" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>lasso</span></a> <a href="https://sigmoid.social/tags/convexification" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>convexification</span></a></p>
CC BY-NC-SA<p>pxp – tats=3D94chliche-chaos<br><a href="https://mastodon.social/tags/blips" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>blips</span></a> <a href="https://mastodon.social/tags/datadriven" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>datadriven</span></a> <a href="https://mastodon.social/tags/experimentalelectronic" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>experimentalelectronic</span></a> <a href="https://mastodon.social/tags/extremecomputermusic" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>extremecomputermusic</span></a> <a href="https://mastodon.social/tags/farmersmanual" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>farmersmanual</span></a> <a href="https://mastodon.social/tags/generateandtest" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>generateandtest</span></a> <a href="https://mastodon.social/tags/internetdreaming" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>internetdreaming</span></a> <a href="https://mastodon.social/tags/networksound" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>networksound</span></a> <a href="https://mastodon.social/tags/sparse" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>sparse</span></a> <a href="https://mastodon.social/tags/synthvoice" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>synthvoice</span></a> <a href="https://mastodon.social/tags/Berlin" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>Berlin</span></a><br>CC BY (<a href="https://mastodon.social/tags/CreativeCommons" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>CreativeCommons</span></a> Attribution) <a href="https://mastodon.social/tags/ccmusic" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>ccmusic</span></a><br><a href="https://farmersmanual.bandcamp.com/album/tats-3d94chliche-chaos" rel="nofollow noopener noreferrer" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">farmersmanual.bandcamp.com/alb</span><span class="invisible">um/tats-3d94chliche-chaos</span></a></p>
Hacker News<p>Interprocedural Sparse Conditional Type Propagation</p><p><a href="https://railsatscale.com/2025-02-24-interprocedural-sparse-conditional-type-propagation/" rel="nofollow noopener noreferrer" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">railsatscale.com/2025-02-24-in</span><span class="invisible">terprocedural-sparse-conditional-type-propagation/</span></a></p><p><a href="https://mastodon.social/tags/HackerNews" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>HackerNews</span></a> <a href="https://mastodon.social/tags/Interprocedural" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>Interprocedural</span></a> <a href="https://mastodon.social/tags/Sparse" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>Sparse</span></a> <a href="https://mastodon.social/tags/Conditional" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>Conditional</span></a> <a href="https://mastodon.social/tags/Type" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>Type</span></a> <a href="https://mastodon.social/tags/Propagation" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>Propagation</span></a> <a href="https://mastodon.social/tags/TypePropagation" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>TypePropagation</span></a> <a href="https://mastodon.social/tags/Programming" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>Programming</span></a> <a href="https://mastodon.social/tags/HackerNews" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>HackerNews</span></a> <a href="https://mastodon.social/tags/SoftwareDevelopment" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>SoftwareDevelopment</span></a></p>
Low Rank Jack<p>[New Python code: PyNoiselet] About 15 years ago, I wrote a simple set of matlab functions to compute the <a href="https://mathstodon.xyz/tags/Noiselet" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>Noiselet</span></a> transform of Coifman et al (R. Coifman, F. Geshwind, and Y. Meyer, "Noiselets", *Applied and Computational Harmonic Analysis*, 10(1):27–44, 2001). The noiselet transform is used in <a href="https://mathstodon.xyz/tags/CompressiveSensing" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>CompressiveSensing</span></a> applications as well as in <a href="https://mathstodon.xyz/tags/Sparse" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>Sparse</span></a> signal coding as noiselets have minimally low coherence with wavelet bases (Haar and Daubechies), which is useful for sparse signal recovery.</p><p>Today, from a code request received yesterday by email, I decided to quickly rewrite this old code in Python (with the useful help of one LLM I admit). </p><p>Here is the result if you need an O(N log N) (butterfly like) algorithm to compute this transformation: </p><p><a href="https://gitlab.com/laurentjacques/PyNoiselet" rel="nofollow noopener noreferrer" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">gitlab.com/laurentjacques/PyNo</span><span class="invisible">iselet</span></a> </p><p>More information also in this old blog post : <a href="https://laurentjacques.gitlab.io/post/some-comments-on-noiselets/" rel="nofollow noopener noreferrer" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">laurentjacques.gitlab.io/post/</span><span class="invisible">some-comments-on-noiselets/</span></a></p><p>Feel free to fork it and improve this non-optimized code.</p>
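For readers who want the flavor of such a butterfly algorithm without fetching the repository, here is a minimal NumPy sketch of a unitary noiselet-style transform. This is a sketch under assumed conventions (normalization and output ordering differ across papers), not the PyNoiselet code itself:

```python
import numpy as np

def noiselet(x):
    """Unitary noiselet-style transform in O(N log N) butterfly stages.

    Convention (an assumption; papers differ): each stage applies the
    2x2 unitary 0.5 * [[1-1j, 1+1j], [1+1j, 1-1j]] to pairs at stride h.
    """
    x = np.asarray(x, dtype=complex).copy()
    n = x.size
    assert n & (n - 1) == 0, "length must be a power of two"
    h = 1
    while h < n:
        for start in range(0, n, 2 * h):
            for j in range(start, start + h):
                a, b = x[j], x[j + h]
                x[j] = 0.5 * ((1 - 1j) * a + (1 + 1j) * b)
                x[j + h] = 0.5 * ((1 + 1j) * a + (1 - 1j) * b)
        h *= 2
    return x

def inoiselet(y):
    """Inverse transform: conjugate butterflies, stages in reverse order."""
    y = np.asarray(y, dtype=complex).copy()
    n = y.size
    h = n // 2
    while h >= 1:
        for start in range(0, n, 2 * h):
            for j in range(start, start + h):
                a, b = y[j], y[j + h]
                y[j] = 0.5 * ((1 + 1j) * a + (1 - 1j) * b)
                y[j + h] = 0.5 * ((1 - 1j) * a + (1 + 1j) * b)
        h //= 2
    return y
```

Because each 2x2 butterfly is unitary, the full transform is unitary, and a delta input spreads to entries of constant magnitude 1/sqrt(N), which is exactly the low-coherence property that makes noiselets useful for compressive sensing.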
CC BY-NC-SA<p>The Nekoma Void – 1 Less Throne<br><a href="https://mastodon.social/tags/Electronic" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>Electronic</span></a> <a href="https://mastodon.social/tags/Experimental" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>Experimental</span></a> <a href="https://mastodon.social/tags/concrete" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>concrete</span></a> <a href="https://mastodon.social/tags/darkambient" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>darkambient</span></a> <a href="https://mastodon.social/tags/darkemo" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>darkemo</span></a> <a href="https://mastodon.social/tags/downtempo" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>downtempo</span></a> <a href="https://mastodon.social/tags/gothpop" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>gothpop</span></a> <a href="https://mastodon.social/tags/sparse" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>sparse</span></a> <a href="https://mastodon.social/tags/wave" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>wave</span></a> <a href="https://mastodon.social/tags/Berlin" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>Berlin</span></a><br>CC BY-NC-ND (<a href="https://mastodon.social/tags/CreativeCommons" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>CreativeCommons</span></a> Attribution Non Commercial No Derivatives) <a href="https://mastodon.social/tags/ccmusic" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>ccmusic</span></a><br><a href="https://nekomavoid.bandcamp.com/album/1-less-throne" rel="nofollow noopener noreferrer" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">nekomavoid.bandcamp.com/album/</span><span class="invisible">1-less-throne</span></a></p>
JMLR<p>'Bayesian Sparse Gaussian Mixture Model for Clustering in High Dimensions', by Dapeng Yao, Fangzheng Xie, Yanxun Xu.</p><p><a href="http://jmlr.org/papers/v26/23-0142.html" rel="nofollow noopener noreferrer" translate="no" target="_blank"><span class="invisible">http://</span><span class="ellipsis">jmlr.org/papers/v26/23-0142.ht</span><span class="invisible">ml</span></a> <br> <br><a href="https://sigmoid.social/tags/sparse" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>sparse</span></a> <a href="https://sigmoid.social/tags/clustering" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>clustering</span></a> <a href="https://sigmoid.social/tags/clusters" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>clusters</span></a></p>
LLMs<br>Formulation of Feature Circuits with Sparse Autoencoders in LLM<br>Large Language Models (LLMs) have...<br><br><a href="https://towardsdatascience.com/formulation-of-feature-circuits-with-sparse-autoencoders-in-llm/" rel="nofollow noopener noreferrer" target="_blank">https://towardsdatascience.com/formulation-of-feature-circuits-with-sparse-autoencoders-in-llm/</a><br><br><a rel="nofollow noopener noreferrer" class="mention hashtag" href="https://mastodon.social/tags/Large" target="_blank">#Large</a> <a rel="nofollow noopener noreferrer" class="mention hashtag" href="https://mastodon.social/tags/Language" target="_blank">#Language</a> <a rel="nofollow noopener noreferrer" class="mention hashtag" href="https://mastodon.social/tags/Models" target="_blank">#Models</a> <a rel="nofollow noopener noreferrer" class="mention hashtag" href="https://mastodon.social/tags/Explainable" target="_blank">#Explainable</a> <a rel="nofollow noopener noreferrer" class="mention hashtag" href="https://mastodon.social/tags/Ai" target="_blank">#Ai</a> <a rel="nofollow noopener noreferrer" class="mention hashtag" href="https://mastodon.social/tags/Machine" target="_blank">#Machine</a> <a rel="nofollow noopener noreferrer" class="mention hashtag" href="https://mastodon.social/tags/Learning" target="_blank">#Learning</a> <a rel="nofollow noopener noreferrer" class="mention hashtag" href="https://mastodon.social/tags/Neural" target="_blank">#Neural</a> <a rel="nofollow noopener noreferrer" class="mention hashtag" href="https://mastodon.social/tags/Networks" target="_blank">#Networks</a> <a rel="nofollow noopener noreferrer" class="mention hashtag" href="https://mastodon.social/tags/Sparse" target="_blank">#Sparse</a> <a rel="nofollow noopener noreferrer" class="mention hashtag" href="https://mastodon.social/tags/Autoencoder" target="_blank">#Autoencoder</a><br><br><a href="https://awakari.com/pub-msg.html?id=5k34zYeeVmwIurdVpqQRuOI3aOO" rel="nofollow noopener noreferrer" target="_blank">Event Attributes</a>
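The linked article concerns sparse autoencoders (SAEs) as an interpretability tool for LLM activations. As a rough illustration of the core object, here is a minimal NumPy sparse autoencoder trained by plain gradient descent on a reconstruction-plus-L1 objective. This is a hypothetical toy: the random data, dimensions, and hyperparameters are illustrative choices, not taken from the article:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "activations": 256 samples of dimension 16, an illustrative
# stand-in for transformer residual-stream activations.
X = rng.standard_normal((256, 16))
B = X.shape[0]

d_in, d_hid, lam, lr = 16, 64, 0.01, 0.02
We = 0.1 * rng.standard_normal((d_hid, d_in))  # encoder weights
be = np.zeros(d_hid)
Wd = 0.1 * rng.standard_normal((d_in, d_hid))  # decoder weights
bd = np.zeros(d_in)

def forward(X):
    H = np.maximum(X @ We.T + be, 0.0)         # ReLU feature activations
    Xhat = H @ Wd.T + bd                       # linear reconstruction
    return H, Xhat

def loss(X, H, Xhat):
    # mean squared reconstruction error + L1 sparsity penalty on features
    recon = np.mean(np.sum((X - Xhat) ** 2, axis=1))
    return recon + lam * np.mean(np.sum(np.abs(H), axis=1))

losses = []
for step in range(300):
    H, Xhat = forward(X)
    losses.append(loss(X, H, Xhat))
    D = 2.0 * (Xhat - X) / B                   # dL/dXhat
    gWd, gbd = D.T @ H, D.sum(axis=0)
    # backprop through decoder and ReLU, including the L1 subgradient
    G = (D @ Wd + (lam / B) * np.sign(H)) * (H > 0)
    gWe, gbe = G.T @ X, G.sum(axis=0)
    We -= lr * gWe; be -= lr * gbe
    Wd -= lr * gWd; bd -= lr * gbd
```

The L1 term pushes most feature activations to zero, so each input is explained by a few active features; feature-circuit work then traces how those sparse features feed into one another across layers.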
JMLR<p>'From Sparse to Dense Functional Data in High Dimensions: Revisiting Phase Transitions from a Non-Asymptotic Perspective', by Shaojun Guo, Dong Li, Xinghao Qiao, Yizhu Wang.</p><p><a href="http://jmlr.org/papers/v26/23-1578.html" rel="nofollow noopener noreferrer" translate="no" target="_blank"><span class="invisible">http://</span><span class="ellipsis">jmlr.org/papers/v26/23-1578.ht</span><span class="invisible">ml</span></a> <br> <br><a href="https://sigmoid.social/tags/sparse" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>sparse</span></a> <a href="https://sigmoid.social/tags/nonparametric" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>nonparametric</span></a> <a href="https://sigmoid.social/tags/smoothing" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>smoothing</span></a></p>
JMLR<p>'Selective Inference with Distributed Data', by Sifan Liu, Snigdha Panigrahi.</p><p><a href="http://jmlr.org/papers/v26/23-0309.html" rel="nofollow noopener noreferrer" translate="no" target="_blank"><span class="invisible">http://</span><span class="ellipsis">jmlr.org/papers/v26/23-0309.ht</span><span class="invisible">ml</span></a> <br> <br><a href="https://sigmoid.social/tags/lasso" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>lasso</span></a> <a href="https://sigmoid.social/tags/glm" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>glm</span></a> <a href="https://sigmoid.social/tags/sparse" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>sparse</span></a></p>
M Sanders<br>arterial—Antelope Island, UT (2022)<br> <br> <a href="https://pixelfed.social/discover/tags/utah?src=hash" class="u-url hashtag" rel="nofollow noopener noreferrer" target="_blank">#utah</a> <a href="https://pixelfed.social/discover/tags/campground?src=hash" class="u-url hashtag" rel="nofollow noopener noreferrer" target="_blank">#campground</a> <a href="https://pixelfed.social/discover/tags/landscape?src=hash" class="u-url hashtag" rel="nofollow noopener noreferrer" target="_blank">#landscape</a> <a href="https://pixelfed.social/discover/tags/alienlandscape?src=hash" class="u-url hashtag" rel="nofollow noopener noreferrer" target="_blank">#alienlandscape</a> <a href="https://pixelfed.social/discover/tags/above?src=hash" class="u-url hashtag" rel="nofollow noopener noreferrer" target="_blank">#above</a> <a href="https://pixelfed.social/discover/tags/watersedge?src=hash" class="u-url hashtag" rel="nofollow noopener noreferrer" target="_blank">#watersedge</a> <a href="https://pixelfed.social/discover/tags/greatsaltlake?src=hash" class="u-url hashtag" rel="nofollow noopener noreferrer" target="_blank">#greatsaltlake</a> <a href="https://pixelfed.social/discover/tags/plains?src=hash" class="u-url hashtag" rel="nofollow noopener noreferrer" target="_blank">#plains</a> <a href="https://pixelfed.social/discover/tags/birdseyeview?src=hash" class="u-url hashtag" rel="nofollow noopener noreferrer" target="_blank">#birdseyeview</a> <a href="https://pixelfed.social/discover/tags/sparse?src=hash" class="u-url hashtag" rel="nofollow noopener noreferrer" target="_blank">#sparse</a> <a href="https://pixelfed.social/discover/tags/alone?src=hash" class="u-url hashtag" rel="nofollow noopener noreferrer" target="_blank">#alone</a> <a href="https://pixelfed.social/discover/tags/distant?src=hash" class="u-url hashtag" rel="nofollow noopener noreferrer" target="_blank">#distant</a> <a href="https://pixelfed.social/discover/tags/aerialperspective?src=hash" class="u-url hashtag" rel="nofollow noopener noreferrer" target="_blank">#aerialperspective</a> <a href="https://pixelfed.social/discover/tags/landscapephotography?src=hash" class="u-url hashtag" rel="nofollow noopener noreferrer" target="_blank">#landscapephotography</a> <a href="https://pixelfed.social/discover/tags/fineartphotography?src=hash" class="u-url hashtag" rel="nofollow noopener noreferrer" target="_blank">#fineartphotography</a> <a href="https://pixelfed.social/discover/tags/fujifilm?src=hash" class="u-url hashtag" rel="nofollow noopener noreferrer" target="_blank">#fujifilm</a> <a href="https://pixelfed.social/discover/tags/fujixseries?src=hash" class="u-url hashtag" rel="nofollow noopener noreferrer" target="_blank">#fujixseries</a> <br> <br> 21/365
JMLR<p>'A minimax optimal approach to high-dimensional double sparse linear regression', by Yanhang Zhang, Zhifan Li, Shixiang Liu, Jianxin Yin.</p><p><a href="http://jmlr.org/papers/v25/23-0653.html" rel="nofollow noopener noreferrer" translate="no" target="_blank"><span class="invisible">http://</span><span class="ellipsis">jmlr.org/papers/v25/23-0653.ht</span><span class="invisible">ml</span></a> <br> <br><a href="https://sigmoid.social/tags/sparse" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>sparse</span></a> <a href="https://sigmoid.social/tags/thresholding" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>thresholding</span></a> <a href="https://sigmoid.social/tags/sparsity" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>sparsity</span></a></p>
JMLR<p>'Triple Component Matrix Factorization: Untangling Global, Local, and Noisy Components', by Naichen Shi, Salar Fattahi, Raed Al Kontar.</p><p><a href="http://jmlr.org/papers/v25/24-0400.html" rel="nofollow noopener noreferrer" translate="no" target="_blank"><span class="invisible">http://</span><span class="ellipsis">jmlr.org/papers/v25/24-0400.ht</span><span class="invisible">ml</span></a> <br> <br><a href="https://sigmoid.social/tags/minimization" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>minimization</span></a> <a href="https://sigmoid.social/tags/factorization" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>factorization</span></a> <a href="https://sigmoid.social/tags/sparse" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>sparse</span></a></p>
JMLR<p>'Generalization on the Unseen, Logic Reasoning and Degree Curriculum', by Emmanuel Abbe, Samy Bengio, Aryo Lotfi, Kevin Rizk.</p><p><a href="http://jmlr.org/papers/v25/24-0220.html" rel="nofollow noopener noreferrer" translate="no" target="_blank"><span class="invisible">http://</span><span class="ellipsis">jmlr.org/papers/v25/24-0220.ht</span><span class="invisible">ml</span></a> <br> <br><a href="https://sigmoid.social/tags/sparse" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>sparse</span></a> <a href="https://sigmoid.social/tags/learns" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>learns</span></a> <a href="https://sigmoid.social/tags/generalization" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>generalization</span></a></p>
JMLR<p>'Neural Networks with Sparse Activation Induced by Large Bias: Tighter Analysis with Bias-Generalized NTK', by Hongru Yang, Ziyu Jiang, Ruizhe Zhang, Yingbin Liang, Zhangyang Wang.</p><p><a href="http://jmlr.org/papers/v25/23-0831.html" rel="nofollow noopener noreferrer" translate="no" target="_blank"><span class="invisible">http://</span><span class="ellipsis">jmlr.org/papers/v25/23-0831.ht</span><span class="invisible">ml</span></a> <br> <br><a href="https://sigmoid.social/tags/sparse" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>sparse</span></a> <a href="https://sigmoid.social/tags/gradient" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>gradient</span></a> <a href="https://sigmoid.social/tags/generalization" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>generalization</span></a></p>
CC BY-NC-SA<p>tsx x sue tompkins – recur³<br><a href="https://botsin.space/tags/Dysmorphic" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>Dysmorphic</span></a> <a href="https://botsin.space/tags/Electro" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>Electro</span></a> <a href="https://botsin.space/tags/Funk" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>Funk</span></a> <a href="https://botsin.space/tags/Sparse" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>Sparse</span></a> <a href="https://botsin.space/tags/experimental" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>experimental</span></a> <a href="https://botsin.space/tags/extremecomputermusic" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>extremecomputermusic</span></a> <a href="https://botsin.space/tags/mobile" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>mobile</span></a> <a href="https://botsin.space/tags/Berlin" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>Berlin</span></a><br>CC BY (<a href="https://botsin.space/tags/CreativeCommons" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>CreativeCommons</span></a> Attribution)<br><a href="https://farmersmanual.bandcamp.com/album/recur" rel="nofollow noopener noreferrer" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">farmersmanual.bandcamp.com/alb</span><span class="invisible">um/recur</span></a></p>
JMLR<p>'Sparse Recovery With Multiple Data Streams: An Adaptive Sequential Testing Approach', by Weinan Wang, Bowen Gang, Wenguang Sun.</p><p><a href="http://jmlr.org/papers/v25/22-1310.html" rel="nofollow noopener noreferrer" translate="no" target="_blank"><span class="invisible">http://</span><span class="ellipsis">jmlr.org/papers/v25/22-1310.ht</span><span class="invisible">ml</span></a> <br> <br><a href="https://sigmoid.social/tags/sparse" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>sparse</span></a> <a href="https://sigmoid.social/tags/screening" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>screening</span></a> <a href="https://sigmoid.social/tags/thresholding" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>thresholding</span></a></p>
JMLR<p>'White-Box Transformers via Sparse Rate Reduction: Compression Is All There Is?', by Yaodong Yu et al.</p><p><a href="http://jmlr.org/papers/v25/23-1547.html" rel="nofollow noopener noreferrer" translate="no" target="_blank"><span class="invisible">http://</span><span class="ellipsis">jmlr.org/papers/v25/23-1547.ht</span><span class="invisible">ml</span></a> <br> <br><a href="https://sigmoid.social/tags/sparse" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>sparse</span></a> <a href="https://sigmoid.social/tags/compressive" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>compressive</span></a> <a href="https://sigmoid.social/tags/encoders" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>encoders</span></a></p>
JMLR<p>'skscope: Fast Sparsity-Constrained Optimization in Python', by Zezhi Wang, Junxian Zhu, Xueqin Wang, Jin Zhu, Huiyang Pen, Peng Chen, Anran Wang, Xiaoke Zhang.</p><p><a href="http://jmlr.org/papers/v25/23-1574.html" rel="nofollow noopener noreferrer" translate="no" target="_blank"><span class="invisible">http://</span><span class="ellipsis">jmlr.org/papers/v25/23-1574.ht</span><span class="invisible">ml</span></a> <br> <br><a href="https://sigmoid.social/tags/sparse" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>sparse</span></a> <a href="https://sigmoid.social/tags/optimization" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>optimization</span></a> <a href="https://sigmoid.social/tags/sparsity" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>sparsity</span></a></p>
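skscope targets problems of the form: minimize f(x) subject to ||x||_0 <= s. For orientation on that problem class, here is the simplest classical baseline, iterative hard thresholding (IHT), in plain NumPy. This is an independent sketch, not skscope's algorithm or API; the design matrix is given orthonormal columns so IHT recovers the sparse vector immediately, while general matrices need a tuned step size and more iterations:

```python
import numpy as np

rng = np.random.default_rng(0)

m, n, s = 64, 32, 4
# Orthonormal-column design via reduced QR (so A.T @ A = I).
A = np.linalg.qr(rng.standard_normal((m, n)))[0]

# Ground-truth s-sparse vector with entries bounded away from zero.
x_true = np.zeros(n)
support = rng.choice(n, size=s, replace=False)
x_true[support] = rng.choice([-1.0, 1.0], size=s) * rng.uniform(1.0, 2.0, size=s)
y = A @ x_true

def hard_threshold(v, s):
    """Keep the s largest-magnitude entries of v, zero out the rest."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-s:]
    out[idx] = v[idx]
    return out

def iht(A, y, s, steps=50, mu=1.0):
    """Iterative hard thresholding: gradient step on ||y - Ax||^2,
    then projection onto the set of s-sparse vectors."""
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        x = hard_threshold(x + mu * A.T @ (y - A @ x), s)
    return x

x_hat = iht(A, y, s)
```

The iterate is s-sparse by construction after every step, which is the defining feature of sparsity-constrained (as opposed to L1-relaxed) optimization.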