
The PyTorch Edition of 李沐's Dive into Deep Learning Goes Open Source and Instantly Tops GitHub Trending!

2019-09-12


李沐 (Mu Li), a principal scientist at Amazon AI, needs no introduction. Half a year ago, Dive into Deep Learning (《动手学深度学习》), created jointly by 李沐, Aston Zhang, and others, officially went online, free for everyone to read. It is a deep learning textbook for Chinese readers whose code can be run and whose content can be discussed with the community.

红色石头 has shared this resource here before; the links are attached again:

Online preview:

https://zh.d2l.ai/

GitHub repo:

https://github.com/d2l-ai/d2l-zh

Course videos:

https://space.bilibili.com/209599371/channel/detail?cid=23541

As one of the authors of MXNet, 李沐 naturally wrote Dive into Deep Learning on top of the MXNet framework. But many newcomers to machine learning use PyTorch instead, so a set of PyTorch implementations matching the book's code would be very welcome.

Time to celebrate: today we bring you exactly that. Recently, a data science group from the Indian Institute of Technology "translated" Dive into Deep Learning from MXNet to PyTorch; after three months of effort the project is essentially complete and has climbed onto GitHub's trending list. A short sketch below shows what this translation looks like in practice.

First, here is the project's GitHub link:

https://github.com/dsgiitr/d2l-pytorch
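
To give a feel for what the port involves, here is the book's basic automatic-differentiation example in both frameworks. This is a minimal sketch written for this post rather than code lifted from either repo, and it assumes the classic MXNet ndarray/autograd API that the original book is built on:

    # MXNet version, in the style of the book's "Automatic Differentiation" section
    from mxnet import nd, autograd

    x = nd.arange(4).reshape((4, 1))
    x.attach_grad()                     # allocate storage for the gradient
    with autograd.record():             # record operations for autograd
        y = 2 * nd.dot(x.T, x)
    y.backward()
    print(x.grad)                       # d(2*x^T*x)/dx = 4x -> [0, 4, 8, 12]

    # PyTorch "translation" of the same computation
    import torch

    x = torch.arange(4, dtype=torch.float32).reshape(4, 1)
    x.requires_grad_(True)              # track gradients on x
    y = 2 * torch.mm(x.t(), x)          # eager execution, no record() needed
    y.backward()
    print(x.grad)                       # tensor([[0.], [4.], [8.], [12.]])

The numerics are identical; most of the porting effort is this kind of section-by-section API mapping, repeated across every notebook in the book.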

The full table of contents:

  • Ch02 Installation

    • Installation

  • Ch03 Introduction

    • Introduction

  • Ch04 The Preliminaries: A Crashcourse

    • 4.1 Data Manipulation

    • 4.2 Linear Algebra

    • 4.3 Automatic Differentiation

    • 4.4 Probability and Statistics

    • 4.5 Naive Bayes Classification

    • 4.6 Documentation

  • Ch05 Linear Neural Networks

    • 5.1 Linear Regression

    • 5.2 Linear Regression Implementation from Scratch

    • 5.3 Concise Implementation of Linear Regression

    • 5.4 Softmax Regression

    • 5.5 Image Classification Data (Fashion-MNIST)

    • 5.6 Implementation of Softmax Regression from Scratch

    • 5.7 Concise Implementation of Softmax Regression

  • Ch06 Multilayer Perceptrons

    • 6.1 Multilayer Perceptron

    • 6.2 Implementation of Multilayer Perceptron from Scratch

    • 6.3 Concise Implementation of Multilayer Perceptron

    • 6.4 Model Selection Underfitting and Overfitting

    • 6.5 Weight Decay

    • 6.6 Dropout

    • 6.7 Forward Propagation Backward Propagation and Computational Graphs

    • 6.8 Numerical Stability and Initialization

    • 6.9 Considering the Environment

    • 6.10 Predicting House Prices on Kaggle

  • Ch07 Deep Learning Computation

    • 7.1 Layers and Blocks

    • 7.2 Parameter Management

    • 7.3 Deferred Initialization

    • 7.4 Custom Layers

    • 7.5 File I/O

    • 7.6 GPUs

  • Ch08 Convolutional Neural Networks

    • 8.1 From Dense Layers to Convolutions

    • 8.2 Convolutions for Images

    • 8.3 Padding and Stride

    • 8.4 Multiple Input and Output Channels

    • 8.5 Pooling

    • 8.6 Convolutional Neural Networks (LeNet)

  • Ch09 Modern Convolutional Networks

    • 9.1 Deep Convolutional Neural Networks (AlexNet)

    • 9.2 Networks Using Blocks (VGG)

    • 9.3 Network in Network (NiN)

    • 9.4 Networks with Parallel Concatenations (GoogLeNet)

    • 9.5 Batch Normalization

    • 9.6 Residual Networks (ResNet)

    • 9.7 Densely Connected Networks (DenseNet)

  • Ch10 Recurrent Neural Networks

    • 10.1 Sequence Models

    • 10.2 Language Models

    • 10.3 Recurrent Neural Networks

    • 10.4 Text Preprocessing

    • 10.5 Implementation of Recurrent Neural Networks from Scratch

    • 10.6 Concise Implementation of Recurrent Neural Networks

    • 10.7 Backpropagation Through Time

    • 10.8 Gated Recurrent Units (GRU)

    • 10.9 Long Short Term Memory (LSTM)

    • 10.10 Deep Recurrent Neural Networks

    • 10.11 Bidirectional Recurrent Neural Networks

    • 10.12 Machine Translation and DataSets

    • 10.13 Encoder-Decoder Architecture

    • 10.14 Sequence to Sequence

    • 10.15 Beam Search

  • Ch11 Attention Mechanism

    • 11.1 Attention Mechanism

    • 11.2 Sequence to Sequence with Attention Mechanism

    • 11.3 Transformer

  • Ch12 Optimization Algorithms

    • 12.1 Optimization and Deep Learning

    • 12.2 Convexity

    • 12.3 Gradient Descent

    • 12.4 Stochastic Gradient Descent

    • 12.5 Mini-batch Stochastic Gradient Descent

    • 12.6 Momentum

    • 12.7 Adagrad

    • 12.8 RMSProp

    • 12.9 Adadelta

    • 12.10 Adam

Every section is a runnable Jupyter notebook: you can freely modify the code and hyperparameters, get immediate feedback, and accumulate hands-on deep learning experience along the way, as the sketch below shows.
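
As a concrete example of the kind of experiment the notebooks invite, here is a minimal linear-regression-from-scratch training loop in the spirit of Section 5.2. The synthetic data and the specific values of lr and num_epochs are hypothetical choices for this sketch; those two knobs are exactly the sort of hyperparameters you can tweak and re-run to watch the loss respond:

    import torch

    # Hypothetical synthetic data: y = 2x - 3.4 + small Gaussian noise
    true_w, true_b = 2.0, -3.4
    X = torch.randn(1000, 1)
    y = true_w * X + true_b + 0.01 * torch.randn(1000, 1)

    # Parameters to learn, initialized to zero
    w = torch.zeros(1, 1, requires_grad=True)
    b = torch.zeros(1, requires_grad=True)

    lr, num_epochs = 0.1, 20            # hyperparameters to experiment with

    for epoch in range(num_epochs):
        loss = ((X @ w + b - y) ** 2).mean()   # mean squared error
        loss.backward()
        with torch.no_grad():                  # full-batch gradient descent step
            w -= lr * w.grad
            b -= lr * b.grad
            w.grad.zero_()
            b.grad.zero_()
        print(f'epoch {epoch + 1}, loss {loss.item():f}')

Doubling lr or halving num_epochs and re-running the cell makes the effect on convergence visible immediately, which is exactly the feedback loop the book is designed around.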

At the moment, six sections of the PyTorch code are still unfinished, but the project is already largely complete. The team hopes more enthusiasts will join in and contribute.

Finally, the GitHub link once more:

https://github.com/dsgiitr/d2l-pytorch

