闲情居 | Selected Papers from ICML 2020, a Top Machine Learning Conference


ICML is short for the International Conference on Machine Learning. It has grown into an annual top-tier international machine learning conference organized by the International Machine Learning Society (IMLS).
Due to the pandemic, ICML 2020 was held as a virtual conference. As one of the top venues in artificial intelligence, it accepted 1,088 papers this year, a record high, yet the acceptance rate was only 21.8%, down from 22.6% in 2019 and 24.9% in 2018.
This post collects a selection of the accepted papers and shares them with readers. Those who want the complete list can get it from the source below.
Full list of ICML 2020 accepted papers:
Selected Papers
Reverse-engineering deep ReLU networks (David Rolnick, Konrad Kording)
My Fair Bandit: Distributed Learning of Max-Min Fairness with Multi-player Bandits (Ilai Bistritz, Tavor Baharav, Amir Leshem, Nicholas Bambos)
Scalable Differentiable Physics for Learning and Control (Yi-Ling Qiao, Junbang Liang, Vladlen Koltun, Ming Lin)
Generalization to New Actions in Reinforcement Learning (Ayush Jain, Andrew Szot, Joseph Lim)
Randomized Block-Diagonal Preconditioning for Parallel Learning (Celestine Mendler-Dünner, Aurelien Lucchi)
Stochastic Flows and Geometric Optimization on the Orthogonal Group (Krzysztof Choromanski, David Cheikhi, Jared Davis, Valerii Likhosherstov, Achille Nazaret, Achraf Bahamou, Xingyou Song, Mrugank Akarte, Jack Parker-Holder, Jacob Bergquist, Yuan Gao, Aldo Pacchiano, Tamas Sarlos, Adrian Weller, Vikas Sindhwani)
PackIt: A Virtual Environment for Geometric Planning (Ankit Goyal, Jia Deng)
Soft Threshold Weight Reparameterization for Learnable Sparsity (Aditya Kusupati, Vivek Ramanujan, Raghav Somani, Mitchell Wortsman, Prateek Jain, Sham Kakade, Ali Farhadi)
Stochastic Latent Residual Video Prediction (Jean-Yves Franceschi, Edouard Delasalles, Mickael Chen, Sylvain Lamprier, Patrick Gallinari)
Fractional Underdamped Langevin Dynamics: Retargeting SGD with Momentum under Heavy-Tailed Gradient Noise (Umut Simsekli, Lingjiong Zhu, Yee Whye Teh, Mert Gurbuzbalaban)
Context Aware Local Differential Privacy (Jayadev Acharya, Keith Bonawitz, Peter Kairouz, Daniel Ramage, Ziteng Sun)
Privately Learning Markov Random Fields (Gautam Kamath, Janardhan Kulkarni, Steven Wu, Huanyu Zhang)
A Mean Field Analysis Of Deep ResNet And Beyond: Towards Provably Optimization Via Overparameterization From Depth (Yiping Lu, Chao Ma, Yulong Lu, Jianfeng Lu, Lexing Ying)
Provable Smoothness Guarantees for Black-Box Variational Inference (Justin Domke)
Enhancing Simple Models by Exploiting What They Already Know (Amit Dhurandhar, Karthikeyan Shanmugam, Ronny Luss)
Fiduciary Bandits (Gal Bahar, Omer Ben-Porat, Kevin Leyton-Brown, Moshe Tennenholtz)
Training Deep Energy-Based Models with f-Divergence Minimization (Lantao Yu, Yang Song, Jiaming Song, Stefano Ermon)
Progressive Graph Learning for Open-Set Domain Adaptation (Yadan Luo, Zijian Wang, Zi Huang, Mahsa Baktashmotlagh)
Learning De-biased Representations with Biased Representations (Hyojin Bahng, Sanghyuk Chun, Sangdoo Yun, Jaegul Choo, Seong Joon Oh)

