AutoML-Related Papers
This article is a translation of Awesome-AutoML-Papers.
1. Introduction to AutoML
Machine learning has achieved considerable success in recent years, and a growing number of disciplines rely on it. However, this success depends crucially on human machine learning experts to perform the following tasks:
- Preprocess the data
- Select appropriate features
- Select an appropriate model family
- Optimize model hyperparameters
- Postprocess machine learning models
- Critically analyze the results obtained
As the complexity of most of these tasks exceeds the abilities of non-experts, the rapid growth of machine learning applications has created a great demand for off-the-shelf machine learning methods that are easy to use and require no expert knowledge. We call the resulting research area, which targets the progressive automation of machine learning, AutoML.
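As a concrete illustration, the manual workflow listed above can be sketched end to end in a few lines. Everything below (the toy one-dimensional dataset, the `preprocess`, `fit_linear`, and `mse` helpers, and the tiny learning-rate grid) is a hypothetical stand-in for a real pipeline, not code from any of the papers listed here.

```python
# A minimal, pure-Python sketch of the manual workflow that AutoML systems
# try to automate. All names and the toy dataset are illustrative assumptions.
import random

random.seed(0)

# Toy regression data: y = 2*x + noise.
data = [(x, 2 * x + random.gauss(0, 0.1)) for x in range(20)]

def preprocess(rows):
    """Step 1: preprocess the data by scaling inputs to [0, 1]."""
    xs = [x for x, _ in rows]
    lo, hi = min(xs), max(xs)
    return [((x - lo) / (hi - lo), y) for x, y in rows]

def fit_linear(rows, lr, epochs=200):
    """Steps 3-4: fit the chosen model family y = w*x + b by gradient
    descent; the learning rate lr is the hyperparameter being tuned."""
    w = b = 0.0
    for _ in range(epochs):
        for x, y in rows:
            err = (w * x + b) - y
            w -= lr * err * x
            b -= lr * err
    return w, b

def mse(rows, w, b):
    """Step 6: analyze the result with a simple error metric."""
    return sum(((w * x + b) - y) ** 2 for x, y in rows) / len(rows)

rows = preprocess(data)
# Step 4: naive hyperparameter optimization over a tiny grid of learning rates.
best = min(((mse(rows, *fit_linear(rows, lr)), lr) for lr in (0.001, 0.01, 0.1)),
           key=lambda t: t[0])
print("best lr:", best[1], "mse: %.4f" % best[0])
```

AutoML replaces the hand-picked grid and the hand-chosen model family in this sketch with principled search over a much larger configuration space.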
AutoML draws on many areas of machine learning, chiefly:
- Bayesian optimization
- Regression models for structured data and big data
- Meta learning
- Transfer learning
- Combinatorial optimization
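Of these techniques, hyperparameter search is the simplest to demonstrate. Below is a minimal random-search sketch (random search appears as its own subsection under Hyperparameter Optimization later in this list); the `validation_loss` function is a made-up stand-in for a real train-and-evaluate cycle, and all parameter names and ranges are illustrative assumptions.

```python
# Minimal random-search hyperparameter optimization, the simplest baseline in
# the AutoML toolbox. The objective below is a hypothetical "validation loss"
# standing in for actually training and evaluating a model.
import random

random.seed(42)

def validation_loss(learning_rate, num_layers):
    # Made-up surrogate: the (unknown to the searcher) optimum sits near
    # learning_rate = 0.1, num_layers = 3.
    return (learning_rate - 0.1) ** 2 + 0.05 * (num_layers - 3) ** 2

def random_search(trials=50):
    best_cfg, best_loss = None, float("inf")
    for _ in range(trials):
        cfg = {
            # Log-uniform sampling in [1e-4, 1], the usual choice for rates.
            "learning_rate": 10 ** random.uniform(-4, 0),
            "num_layers": random.randint(1, 8),
        }
        loss = validation_loss(**cfg)
        if loss < best_loss:
            best_cfg, best_loss = cfg, loss
    return best_cfg, best_loss

cfg, loss = random_search()
print("best config:", cfg, "loss: %.4f" % loss)
```

Bayesian optimization, the subject of many papers below, improves on this by fitting a surrogate model to past trials and proposing configurations where the surrogate predicts improvement, instead of sampling blindly.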
2. Table of Contents
- Papers
- Tutorials
- Articles
- Slides
- Books
- Projects
- Prominent Researchers
Papers
Automated Feature Engineering
Expand Reduce
- 2017 | AutoLearn — Automated Feature Generation and Selection | Ambika Kaul, et al. | ICDM |
PDF
- 2017 | One button machine for automating feature engineering in relational databases | Hoang Thanh Lam, et al. | arXiv |
PDF
- 2016 | Automating Feature Engineering | Udayan Khurana, et al. | NIPS |
PDF
- 2016 | ExploreKit: Automatic Feature Generation and Selection | Gilad Katz, et al. | ICDM |
PDF
- 2015 | Deep Feature Synthesis: Towards Automating Data Science Endeavors | James Max Kanter, Kalyan Veeramachaneni | DSAA |
PDF
Hierarchical Organization of Transformations
- 2016 | Cognito: Automated Feature Engineering for Supervised Learning | Udayan Khurana, et al. | ICDMW |
PDF
Meta Learning
- 2017 | Learning Feature Engineering for Classification | Fatemeh Nargesian, et al. | IJCAI |
PDF
Reinforcement Learning
Architecture Search
Evolutionary Algorithms
-
Local Search
- 2017 | Simple and Efficient Architecture Search for Convolutional Neural Networks | Thomas Elsken, et al. | ICLR |
PDF
Meta Learning
- 2016 | Learning to Optimize | Ke Li, Jitendra Malik | arXiv |
PDF
Reinforcement Learning
Transfer Learning
- 2017 | Learning Transferable Architectures for Scalable Image Recognition | Barret Zoph, et al. | arXiv |
PDF
Frameworks
- 2017 | Google Vizier: A Service for Black-Box Optimization | Daniel Golovin, et al. | KDD |
PDF
- 2017 | ATM: A Distributed, Collaborative, Scalable System for Automated Machine Learning | T. Swearingen, et al. | IEEE |
PDF
- 2015 | AutoCompete: A Framework for Machine Learning Competitions | Abhishek Thakur, et al. | ICML |
PDF
Hyperparameter Optimization
Bayesian Optimization
- 2016 | Bayesian Optimization with Robust Bayesian Neural Networks | Jost Tobias Springenberg, et al. | NIPS |
PDF
- 2016 | Scalable Hyperparameter Optimization with Products of Gaussian Process Experts | Nicolas Schilling, et al. | PKDD |
PDF
- 2016 | Taking the Human Out of the Loop: A Review of Bayesian Optimization | Bobak Shahriari, et al. | IEEE |
PDF
- 2016 | Towards Automatically-Tuned Neural Networks | Hector Mendoza, et al. | JMLR |
PDF
- 2016 | Two-Stage Transfer Surrogate Model for Automatic Hyperparameter Optimization | Martin Wistuba, et al. | PKDD |
PDF
- 2015 | Efficient and Robust Automated Machine Learning |
PDF
- 2015 | Hyperparameter Optimization with Factorized Multilayer Perceptrons | Nicolas Schilling, et al. | PKDD |
PDF
- 2015 | Hyperparameter Search Space Pruning - A New Component for Sequential Model-Based Hyperparameter Optimization | Martin Wistuba, et al. |
PDF
- 2015 | Joint Model Choice and Hyperparameter Optimization with Factorized Multilayer Perceptrons | Nicolas Schilling, et al. | ICTAI |
PDF
- 2015 | Learning Hyperparameter Optimization Initializations | Martin Wistuba, et al. | DSAA |
PDF
- 2015 | Scalable Bayesian optimization using deep neural networks | Jasper Snoek, et al. | ACM |
PDF
- 2015 | Sequential Model-free Hyperparameter Tuning | Martin Wistuba, et al. | ICDM |
PDF
- 2013 | Auto-WEKA: Combined Selection and Hyperparameter Optimization of Classification Algorithms |
PDF
- 2013 | Making a Science of Model Search: Hyperparameter Optimization in Hundreds of Dimensions for Vision Architectures | J. Bergstra | JMLR |
PDF
- 2012 | Practical Bayesian Optimization of Machine Learning Algorithms |
PDF
- 2011 | Sequential Model-Based Optimization for General Algorithm Configuration (extended version) |
PDF
Evolutionary Algorithms
Lipschitz Functions
- 2017 | Global Optimization of Lipschitz functions | Cédric Malherbe, Nicolas Vayatis | arXiv |
PDF
Local Search
- 2009 | ParamILS: An Automatic Algorithm Configuration Framework | Frank Hutter, et al. | JAIR |
PDF
Meta Learning
- 2008 | Cross-Disciplinary Perspectives on Meta-Learning for Algorithm Selection |
PDF
Particle Swarm Optimization
- 2017 | Particle Swarm Optimization for Hyper-parameter Selection in Deep Neural Networks | Pablo Ribalta Lorenzo, et al. | GECCO |
PDF
- 2008 | Particle Swarm Optimization for Parameter Determination and Feature Selection of Support Vector Machines | Shih-Wei Lin, et al. | Expert Systems with Applications |
PDF
Random Search
Transfer Learning
- 2016 | Efficient Transfer Learning Method for Automatic Hyperparameter Tuning | Dani Yogatama, Gideon Mann | JMLR |
PDF
- 2016 | Flexible Transfer Learning Framework for Bayesian Optimisation | Tinu Theckel Joy, et al. | PAKDD |
PDF
- 2016 | Hyperparameter Optimization Machines | Martin Wistuba, et al. | DSAA |
PDF
- 2013 | Collaborative Hyperparameter Tuning | Rémi Bardenet, et al. | ICML |
PDF
Miscellaneous
- 2018 | Accelerating Neural Architecture Search using Performance Prediction | Bowen Baker, et al. | ICLR |
PDF
- 2017 | Automatic Frankensteining: Creating Complex Ensembles Autonomously | Martin Wistuba, et al. | SIAM |
PDF
Tutorials
Bayesian Optimization
- 2010 | A Tutorial on Bayesian Optimization of Expensive Cost Functions, with Application to Active User Modeling and Hierarchical Reinforcement Learning |
PDF
Meta Learning
- 2008 | Metalearning - A Tutorial |
PDF
Articles
Bayesian Optimization
- 2016 | Bayesian Optimization for Hyperparameter Tuning |
Link
Meta Learning
- 2017 | Why Meta-learning is Crucial for Further Advances of Artificial Intelligence? |
Link
- 2017 | Learning to learn |
Link
Slides
Automated Feature Engineering
- Automated Feature Engineering for Predictive Modeling | Udayan Khurana, et al. |
PDF
Hyperparameter Optimization
Bayesian Optimization
- Bayesian Optimisation |
PDF
- A Tutorial on Bayesian Optimization for Machine Learning |
PDF
Books
Meta Learning
- 2009 | Metalearning - Applications to Data Mining | Springer |
PDF
Projects
- Advisor | Python | Open Source | Code
- auto-sklearn | Python | Open Source | Code
- Auto-WEKA | Java | Open Source | Code
- Hyperopt | Python | Open Source | Code
- Hyperopt-sklearn | Python | Open Source | Code
- SigOpt | Python | Commercial | Link
- SMAC3 | Python | Open Source | Code
- RoBO | Python | Open Source | Code
- BayesianOptimization | Python | Open Source | Code
- Scikit-Optimize | Python | Open Source | Code
- HyperBand | Python | Open Source | Code
- BayesOpt | C++ | Open Source | Code
- Optunity | Python | Open Source | Code
- TPOT | Python | Open Source | Code
- ATM | Python | Open Source | Code
- Cloud AutoML | Python | Commercial | Link
- H2O | Python | Commercial | Link
- DataRobot | Python | Commercial | Link
- MLJAR | Python | Commercial | Link
- MateLabs | Python | Commercial | Link
Original by MARSGGBO, 2018-7-14
