This article contains the best tutorial content I have found so far. It is not a list of every machine learning tutorial on the web; that would be far too long and too repetitive, and I have left out material of middling quality. My goal is to link the best tutorials I could find to the broader topics that branch out from machine learning and natural language processing.
By "tutorial" I mean introductory content written to teach a single concept concisely. I have mostly avoided textbook chapters, which cover much broader ground, and research papers, which are generally not much help for teaching a concept; if that is what you want, why not just buy the book? Tutorials are most useful when you want to learn a basic topic or get another perspective on one.
I have divided the article into four sections: machine learning, natural language processing, Python, and math. Within each section I list a handful of topics, but given the sheer volume of material out there I cannot possibly cover every one.
If you notice good tutorials I have missed, please let me know! I try to limit each topic to five or six tutorials, since going beyond that inevitably means repetition. Every link covers material different from the other links, presents the information in a different way (for example code, slides, or long-form prose), or comes at it from a different angle.
Machine Learning
https://machinelearningmastery.com/start-here/
https://medium.com/@ageitgey/machine-learning-is-fun-80ea3ec3c471
http://martin.zinkevich.org/rules_of_ml/rules_of_ml.pdf
https://ml.berkeley.edu/blog/2016/11/06/tutorial-1/
https://ml.berkeley.edu/blog/2016/12/24/tutorial-2/
https://ml.berkeley.edu/blog/2017/02/04/tutorial-3/
https://www.toptal.com/machine-learning/machine-learning-theory-an-introductory-primer
https://monkeylearn.com/blog/gentle-guide-to-machine-learning/
https://blogs.sas.com/content/subconsciousmusings/2017/04/12/machine-learning-algorithm-use/
https://www.sas.com/content/dam/SAS/en_us/doc/whitepaper1/machine-learning-primer-108796.pdf
https://www.kaggle.com/kanncaa1/machine-learning-tutorial-for-beginners
Activation and Loss Functions
http://neuralnetworksanddeeplearning.com/chap1.html#sigmoid_neurons
https://www.quora.com/What-is-the-role-of-the-activation-function-in-a-neural-network
https://stats.stackexchange.com/questions/115258/comprehensive-list-of-activation-functions-in-neural-networks-with-pros-cons
https://medium.com/towards-data-science/activation-functions-and-its-types-which-is-better-a9a5310cc8f
http://www.exegetic.biz/blog/2015/12/making-sense-logarithmic-loss/
http://cs231n.github.io/neural-networks-2/#losses
http://rishy.github.io/ml/2015/07/28/l1-vs-l2-loss/
http://neuralnetworksanddeeplearning.com/chap3.html#the_cross-entropy_cost_function
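As a quick complement to the links above, here is a minimal NumPy sketch of two common activations and the binary cross-entropy loss. The function names and toy values are my own illustrations, not taken from any of the tutorials.

```python
import numpy as np

def sigmoid(z):
    # Squashes any real number into (0, 1); common for binary outputs.
    return 1.0 / (1.0 + np.exp(-z))

def relu(z):
    # Keeps positive values and zeroes out negatives.
    return np.maximum(0.0, z)

def binary_cross_entropy(y_true, y_pred, eps=1e-12):
    # Average negative log-likelihood of the labels under the predicted
    # probabilities; eps guards against log(0).
    y_pred = np.clip(y_pred, eps, 1.0 - eps)
    return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

# Toy check: two predictions against two binary labels.
y_true = np.array([1.0, 0.0])
y_pred = sigmoid(np.array([2.0, -1.0]))
print(binary_cross_entropy(y_true, y_pred))
```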
Bias
https://stackoverflow.com/questions/2480650/role-of-bias-in-neural-networks/2499936#2499936
http://makeyourownneuralnetwork.blogspot.com/2016/06/bias-nodes-in-neural-networks.html
https://www.quora.com/What-is-bias-in-artificial-neural-network
Perceptrons
http://neuralnetworksanddeeplearning.com/chap1.html#perceptrons
https://natureofcode.com/book/chapter-10-neural-networks/#chapter10_figure3
http://computing.dcu.ie/~humphrys/Notes/Neural/single.neural.html
https://www.toptal.com/machine-learning/an-introduction-to-deep-learning-from-perceptrons-to-deep-networks
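The perceptron itself fits in a few lines of NumPy. The toy example below (my own, not from the tutorials) learns the logical AND function with the classic mistake-driven update rule.

```python
import numpy as np

# Inputs and labels for logical AND, which is linearly separable.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 0, 0, 1], dtype=float)

w = np.zeros(2)   # weights
b = 0.0           # bias
lr = 0.1          # learning rate

for epoch in range(20):
    for xi, target in zip(X, y):
        pred = 1.0 if np.dot(w, xi) + b > 0 else 0.0
        error = target - pred
        # The update fires only when the prediction is wrong.
        w += lr * error * xi
        b += lr * error

print(w, b)  # parameters of a separating hyperplane for AND
```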
Regression
http://people.duke.edu/~rnau/regintro.htm
http://ufldl.stanford.edu/tutorial/supervised/LinearRegression/
http://ml-cheatsheet.readthedocs.io/en/latest/linear_regression.html
https://ml-cheatsheet.readthedocs.io/en/latest/logistic_regression.html
http://machinelearningmastery.com/simple-linear-regression-tutorial-for-machine-learning/
https://machinelearningmastery.com/logistic-regression-tutorial-for-machine-learning/
http://ufldl.stanford.edu/tutorial/supervised/SoftmaxRegression/
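For a concrete sense of linear regression, here is a small least-squares fit in NumPy on synthetic data (a sketch of my own; the tutorials above cover the theory and the gradient-descent variants).

```python
import numpy as np

# Synthetic data: y is roughly 3*x + 2 plus a little noise.
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 3.0 * x + 2.0 + rng.normal(scale=0.5, size=x.shape)

# Solve for slope and intercept by least squares.
X = np.column_stack([x, np.ones_like(x)])
slope, intercept = np.linalg.lstsq(X, y, rcond=None)[0]
print(slope, intercept)  # close to 3 and 2
```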
Gradient Descent
http://neuralnetworksanddeeplearning.com/chap1.html#learning_with_gradient_descent
http://iamtrask.github.io/2015/07/27/python-network-part2/
http://www.kdnuggets.com/2017/04/simple-understand-gradient-descent-algorithm.html
http://sebastianruder.com/optimizing-gradient-descent/
http://cs231n.github.io/optimization-1/
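The core loop of gradient descent is tiny. This toy example (my own numbers) minimizes f(x) = (x - 3)^2, whose derivative is 2(x - 3).

```python
# Minimal gradient descent on f(x) = (x - 3)**2.
x = 0.0              # starting point
learning_rate = 0.1

for step in range(100):
    grad = 2 * (x - 3)             # derivative at the current point
    x = x - learning_rate * grad   # step downhill

print(x)  # converges toward the minimum at x = 3
```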
Generative Learning
http://cs229.stanford.edu/notes/cs229-notes2.pdf
https://monkeylearn.com/blog/practical-explanation-naive-bayes-classifier/
Support Vector Machines
https://monkeylearn.com/blog/introduction-to-support-vector-machines-svm/
http://cs229.stanford.edu/notes/cs229-notes3.pdf
http://cs231n.github.io/linear-classify/
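In practice most people reach for a library rather than implementing an SVM by hand. A minimal scikit-learn example, with the dataset and parameters chosen only for illustration:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Train an RBF-kernel SVM on the built-in iris dataset.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

clf = SVC(kernel="rbf", C=1.0)
clf.fit(X_train, y_train)
print(clf.score(X_test, y_test))  # held-out accuracy
```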
Deep Learning
http://yerevann.com/a-guide-to-deep-learning/
https://github.com/floodsung/Deep-Learning-Papers-Reading-Roadmap
http://nikhilbuduma.com/2014/12/29/deep-learning-in-a-nutshell/
http://ai.stanford.edu/~quocle/tutorial1.pdf
https://machinelearningmastery.com/what-is-deep-learning/
https://blogs.nvidia.com/blog/2016/07/29/whats-difference-artificial-intelligence-machine-learning-deep-learning-ai/
https://gluon.mxnet.io/
Optimization and Dimensionality Reduction
https://www.knime.org/blog/seven-techniques-for-data-dimensionality-reduction
http://cs229.stanford.edu/notes/cs229-notes10.pdf
http://rishy.github.io/ml/2017/01/05/how-to-train-your-dnn/
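As a hands-on counterpart to the dimensionality-reduction links, here is a PCA projection with scikit-learn (my own toy example):

```python
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA

# Project the 4-feature iris dataset onto its top 2 principal components.
X, _ = load_iris(return_X_y=True)
pca = PCA(n_components=2)
X_2d = pca.fit_transform(X)

print(X_2d.shape)                     # (150, 2)
print(pca.explained_variance_ratio_)  # variance captured by each component
```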
Long Short-Term Memory (LSTM)
https://machinelearningmastery.com/gentle-introduction-long-short-term-memory-networks-experts/
http://colah.github.io/posts/2015-08-Understanding-LSTMs/
http://blog.echen.me/2017/05/30/exploring-lstms/
http://iamtrask.github.io/2015/11/15/anyone-can-code-lstm/
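If you just want to see the tensor shapes involved, a single PyTorch LSTM layer looks like this (a sketch assuming a recent PyTorch version; the sizes are arbitrary):

```python
import torch
import torch.nn as nn

# One LSTM layer: 10-dimensional inputs, 20-dimensional hidden state.
lstm = nn.LSTM(input_size=10, hidden_size=20, batch_first=True)

# A batch of 3 sequences, each 5 time steps long.
x = torch.randn(3, 5, 10)
output, (h_n, c_n) = lstm(x)

print(output.shape)  # (3, 5, 20): hidden state at every time step
print(h_n.shape)     # (1, 3, 20): final hidden state for each sequence
```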
Convolutional Neural Networks (CNNs)
http://neuralnetworksanddeeplearning.com/chap6.html#introducing_convolutional_networks
https://medium.com/@ageitgey/machine-learning-is-fun-part-3-deep-learning-and-convolutional-neural-networks-f40359318721
http://colah.github.io/posts/2014-07-Conv-Nets-Modular/
http://colah.github.io/posts/2014-07-Understanding-Convolutions/
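The basic building block is the convolutional layer. A one-layer PyTorch sketch, with sizes chosen only to show the shape arithmetic:

```python
import torch
import torch.nn as nn

# 3 input channels (RGB), 8 output feature maps, 3x3 kernels.
conv = nn.Conv2d(in_channels=3, out_channels=8, kernel_size=3, padding=1)

x = torch.randn(1, 3, 32, 32)   # a batch with one 32x32 RGB image
features = conv(x)
print(features.shape)           # (1, 8, 32, 32): padding=1 preserves height/width
```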
Recurrent Neural Networks (RNNs)
http://www.wildml.com/2015/09/recurrent-neural-networks-tutorial-part-1-introduction-to-rnns/
http://distill.pub/2016/augmented-rnns/
http://karpathy.github.io/2015/05/21/rnn-effectiveness/
http://nikhilbuduma.com/2015/01/11/a-deep-dive-into-recurrent-neural-networks/
Reinforcement Learning
https://www.analyticsvidhya.com/blog/2017/01/introduction-to-reinforcement-learning-implementation/
https://web.mst.edu/~gosavia/tutorial.pdf
http://www.wildml.com/2016/10/learning-reinforcement-learning/
http://karpathy.github.io/2016/05/31/rl/
Generative Adversarial Networks (GANs)
https://aaai18adversarial.github.io/slides/AML.pptx
https://blogs.nvidia.com/blog/2017/05/17/generative-adversarial-network/
https://medium.com/@ageitgey/abusing-generative-adversarial-networks-to-make-8-bit-pixel-art-e45d9b96cee7
http://blog.aylien.com/introduction-generative-adversarial-networks-code-tensorflow/
https://www.oreilly.com/learning/generative-adversarial-networks-for-beginners
Multi-task Learning
http://sebastianruder.com/multi-task/index.html
Natural Language Processing
https://medium.com/@ageitgey/natural-language-processing-is-fun-9a0bff37854e
http://u.cs.biu.ac.il/~yogo/nnlp.pdf
https://monkeylearn.com/blog/the-definitive-guide-to-natural-language-processing/
https://blog.algorithmia.com/introduction-natural-language-processing-nlp/
http://www.vikparuchuri.com/blog/natural-language-processing-tutorial/
https://arxiv.org/pdf/1103.0398.pdf
Deep Learning and NLP
https://arxiv.org/pdf/1703.03091.pdf
https://nlp.stanford.edu/courses/NAACL2013/NAACL2013-Socher-Manning-DeepLearning.pdf
http://www.wildml.com/2015/11/understanding-convolutional-neural-networks-for-nlp/
http://colah.github.io/posts/2014-07-NLP-RNNs-Representations/
Embed, encode, attend, predict: The new deep learning formula for state-of-the-art NLP models (explosion.ai)
https://explosion.ai/blog/deep-learning-formula-nlp
https://devblogs.nvidia.com/parallelforall/understanding-natural-language-deep-neural-networks-using-torch/
http://pytorch.org/tutorials/beginner/deep_learning_nlp_tutorial.html
Word Vectors
https://www.kaggle.com/c/word2vec-nlp-tutorial
http://sebastianruder.com/word-embeddings-1/index.html
http://sebastianruder.com/word-embeddings-softmax/index.html
http://sebastianruder.com/secret-word2vec/index.html
https://blog.acolyer.org/2016/04/21/the-amazing-power-of-word-vectors/
https://arxiv.org/pdf/1411.2738.pdf
http://mccormickml.com/2016/04/19/word2vec-tutorial-the-skip-gram-model/
http://mccormickml.com/2017/01/11/word2vec-tutorial-part-2-negative-sampling/
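To train toy word vectors yourself, gensim is the usual shortcut. The sketch below assumes gensim 4.x (where the parameter is `vector_size`; older releases call it `size`), and the corpus is far too small to produce meaningful vectors.

```python
from gensim.models import Word2Vec

# Tiny toy corpus: each "sentence" is a list of tokens.
sentences = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "rug"],
    ["cats", "and", "dogs", "are", "pets"],
]

# sg=1 selects the skip-gram architecture discussed in the tutorials above.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1, epochs=100)

print(model.wv["cat"].shape)             # a 50-dimensional vector
print(model.wv.most_similar("cat", topn=3))
```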
Encoder-Decoder
http://www.wildml.com/2016/01/attention-and-memory-in-deep-learning-and-nlp/
https://www.tensorflow.org/tutorials/seq2seq
https://papers.nips.cc/paper/5346-sequence-to-sequence-learning-with-neural-networks.pdf
https://medium.com/@ageitgey/machine-learning-is-fun-part-5-language-translation-with-deep-learning-and-the-magic-of-sequences-2ace0acca0aa
https://google.github.io/seq2seq/
Python
https://developers.google.com/machine-learning/crash-course/
https://github.com/josephmisiti/awesome-machine-learning#python
http://www.kdnuggets.com/2015/11/seven-steps-machine-learning-python.html
http://nbviewer.jupyter.org/github/rhiever/Data-Analysis-and-Machine-Learning-Projects/blob/master/example-data-science-notebook/Example%20Machine%20Learning%20Notebook.ipynb
https://www.tutorialspoint.com/machine_learning_with_python/machine_learning_with_python_quick_guide.htm
Examples
http://machinelearningmastery.com/implement-perceptron-algorithm-scratch-python/
http://www.wildml.com/2015/09/implementing-a-neural-network-from-scratch/
http://iamtrask.github.io/2015/07/12/basic-python-network/
http://www.kdnuggets.com/2016/01/implementing-your-own-knn-using-python.html
https://github.com/eriklindernoren/ML-From-Scratch
https://github.com/rasbt/python-machine-learning-book-2nd-edition
SciPy and NumPy
http://www.scipy-lectures.org/
http://cs231n.github.io/python-numpy-tutorial/
https://engineering.ucsb.edu/~shell/che210d/numpy.pdf
http://nbviewer.jupyter.org/gist/rpmuller/5920182#ii.-numpy-and-scipy
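The two NumPy ideas worth internalizing first are vectorization and broadcasting; a tiny self-contained example:

```python
import numpy as np

a = np.arange(6).reshape(2, 3)   # [[0, 1, 2], [3, 4, 5]]
b = np.array([10, 20, 30])

print(a + b)           # b is broadcast across each row of a
print(a.mean(axis=0))  # column means: [1.5, 2.5, 3.5]
print(a @ b)           # matrix-vector product: [80, 260]
```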
scikit-learn
http://nbviewer.jupyter.org/github/jakevdp/sklearn_pycon2015/blob/master/notebooks/Index.ipynb
https://github.com/mmmayo13/scikit-learn-classifiers/blob/master/sklearn-classifiers-tutorial.ipynb
http://scikit-learn.org/stable/tutorial/index.html
https://github.com/mmmayo13/scikit-learn-beginners-tutorials
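A typical scikit-learn workflow chains preprocessing and a model into a pipeline and evaluates it with cross-validation. A minimal sketch using a built-in dataset:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)
model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_score(model, X, y, cv=5)
print(scores.mean())  # mean 5-fold accuracy
```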
TensorFlow
https://www.tensorflow.org/tutorials/
https://medium.com/@erikhallstrm/hello-world-tensorflow-649b15aed18c
https://blog.metaflow.fr/tensorflow-a-primer-4b3fa0978be3
http://www.wildml.com/2016/08/rnns-in-tensorflow-a-practical-guide-and-undocumented-features/
http://www.wildml.com/2015/12/implementing-a-cnn-for-text-classification-in-tensorflow/
http://pavel.surmenok.com/2016/10/15/how-to-run-text-summarization-with-tensorflow/
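A minimal tf.keras model, assuming TensorFlow 2.x where Keras is built in; the data is synthetic and the architecture is arbitrary:

```python
import numpy as np
import tensorflow as tf

# Synthetic binary classification data.
X = np.random.randn(1000, 20).astype("float32")
y = (X[:, 0] + X[:, 1] > 0).astype("float32")

model = tf.keras.Sequential([
    tf.keras.Input(shape=(20,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
print(model.evaluate(X, y, verbose=0))  # [loss, accuracy]
```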
PyTorch
http://pytorch.org/tutorials/
http://blog.gaurav.im/2017/04/24/a-gentle-intro-to-pytorch/
https://iamtrask.github.io/2017/01/15/pytorch-tutorial/
https://github.com/jcjohnson/pytorch-examples
https://github.com/MorvanZhou/PyTorch-Tutorial
https://github.com/yunjey/pytorch-tutorial
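The canonical PyTorch training loop fits in a dozen lines. This sketch of my own fits a single linear layer to y = 2x + 1:

```python
import torch
import torch.nn as nn

x = torch.randn(100, 1)
y = 2 * x + 1

model = nn.Linear(1, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.MSELoss()

for step in range(200):
    optimizer.zero_grad()            # clear gradients from the previous step
    loss = loss_fn(model(x), y)
    loss.backward()                  # backpropagate
    optimizer.step()                 # update the weights

print(model.weight.item(), model.bias.item())  # close to 2 and 1
```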
Math
https://people.ucsc.edu/~praman1/static/pub/math-for-ml.pdf
http://www.umiacs.umd.edu/~hal/courses/2013S_ML/math4ml.pdf
Linear Algebra
https://betterexplained.com/articles/linear-algebra-guide/
https://betterexplained.com/articles/matrix-multiplication/
https://betterexplained.com/articles/cross-product/
https://betterexplained.com/articles/vector-calculus-understanding-the-dot-product/
http://www.cedar.buffalo.edu/~srihari/CSE574/Chap1/LinearAlgebra.pdf
https://medium.com/towards-data-science/linear-algebra-cheat-sheet-for-deep-learning-cd67aba4526c
http://cs229.stanford.edu/section/cs229-linalg.pdf
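A small NumPy example of the eigendecomposition that shows up throughout machine learning (for instance in PCA); the matrix is my own toy choice:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
eigenvalues, eigenvectors = np.linalg.eigh(A)  # eigh: for symmetric matrices

print(eigenvalues)   # [1. 3.]
print(eigenvectors)  # columns are the corresponding unit eigenvectors
```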
Probability
https://betterexplained.com/articles/understanding-bayes-theorem-with-ratios/
http://cs229.stanford.edu/section/cs229-prob.pdf
https://see.stanford.edu/materials/aimlcs229/cs229-prob.pdf
http://www.cedar.buffalo.edu/~srihari/CSE574/Chap1/Probability-Theory.pdf
http://www.cs.toronto.edu/~urtasun/courses/CSC411_Fall16/tutorial1.pdf
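Bayes' theorem is easiest to absorb with concrete numbers. The figures below are purely illustrative:

```python
# A disease affects 1% of people; a test catches 95% of true cases but also
# flags 5% of healthy people. What is P(disease | positive test)?
p_disease = 0.01
p_pos_given_disease = 0.95
p_pos_given_healthy = 0.05

p_pos = p_pos_given_disease * p_disease + p_pos_given_healthy * (1 - p_disease)
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(p_disease_given_pos)  # about 0.16
```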
Calculus
https://betterexplained.com/articles/how-to-understand-derivatives-the-quotient-rule-exponents-and-logarithms/
https://betterexplained.com/articles/derivatives-product-power-chain/
https://betterexplained.com/articles/vector-calculus-understanding-the-gradient/
http://web.stanford.edu/class/cs224n/lecture_notes/cs224n-2017-review-differential-calculus.pdf
http://ml-cheatsheet.readthedocs.io/en/latest/calculus.html
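Derivatives can be sanity-checked numerically, which is also how gradient checking works when debugging backpropagation. A tiny example of my own:

```python
# Compare a central finite difference against the analytic derivative
# of f(x) = x**3 at x = 2 (f'(x) = 3*x**2 = 12).
def f(x):
    return x ** 3

x, h = 2.0, 1e-6
numerical = (f(x + h) - f(x - h)) / (2 * h)
analytic = 3 * x ** 2
print(numerical, analytic)  # both approximately 12
```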