
2020 NTU "Applied Deep Learning" Course

  • Name: 2020 NTU "Applied Deep Learning" Course
  • Category: Artificial Intelligence
  • Date: 2020/12/5 20:59:39

    This course focuses on how to apply deep learning algorithms to solve problems in a variety of practical application scenarios; students learn both how to use these algorithms and why to use them. Students are expected to learn the theory in class and learn the implementation through assignments and a final project. Note: if you have already taken a similar course, for example Prof. Hung-Yi Lee's course, there is no need to take this one.

The course covers recent techniques in deep learning and representation learning, focusing on supervised/self-supervised learning, embedding methods, metric learning, and convolutional and recurrent networks, with applications to computer vision, natural language understanding, and speech recognition.


Lecture 0 2019/02/19 Course Logistics [slides]


Registration: [Google Form]

Lecture 1 2019/02/26 Introduction [slides] (video)

Guest Lecture (R103) [PyTorch Tutorial]

Lecture 2 2019/03/05 Neural Network Basics [slides] (video)

Suggested Readings:

[Linear Algebra]

[Linear Algebra Slides]

[Linear Algebra Quick Review]
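
As a quick illustration of the Lecture 2 topic, here is a minimal PyTorch sketch of a small feed-forward network trained by gradient descent. It is not taken from the course materials; the data, layer sizes, and hyperparameters are arbitrary assumptions.

```python
# Minimal feed-forward network on toy data (assumed shapes, not course code).
import torch
import torch.nn as nn

torch.manual_seed(0)

x = torch.randn(64, 10)            # 64 examples, 10 features (toy data)
y = torch.randint(0, 2, (64,))     # binary labels

model = nn.Sequential(
    nn.Linear(10, 32),             # input -> hidden layer
    nn.ReLU(),                     # non-linear activation
    nn.Linear(32, 2),              # hidden -> 2 output logits
)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

for step in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)    # forward pass + loss
    loss.backward()                # compute gradients
    optimizer.step()               # update parameters
```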

A1 2019/03/05 A1: Dialogue Response Selection [A1 pages]

Lecture 3 2019/03/12 Backpropagation [slides] (video)

Word Representation [slides] (video)

Suggested Readings:

[Learning Representations]

[Vector Space Models of Semantics]

[RNNLM: Recurrent Neural Network Language Model]

[Extensions of RNNLM]

[Optimization]
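
To make the Lecture 3 topic concrete, the sketch below uses PyTorch autograd to compute the same gradients backpropagation produces for a tiny expression, and checks them against the chain rule written by hand. The values are arbitrary and this is only an illustration, not course code.

```python
# Backpropagation via autograd on y = (w*x + b)^2, verified by the chain rule.
import torch

x = torch.tensor(2.0)
w = torch.tensor(3.0, requires_grad=True)
b = torch.tensor(1.0, requires_grad=True)

y = (w * x + b) ** 2     # forward pass builds the computation graph
y.backward()             # backward pass propagates gradients through the graph

# Chain rule by hand: dy/dw = 2*(w*x + b)*x = 28, dy/db = 2*(w*x + b) = 14
print(w.grad.item(), b.grad.item())   # 28.0 14.0
```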

Lecture 4 2019/03/19 Recurrent Neural Network [slides] (video)

Basic Attention [slides] (video)

Suggested Readings:

[RNN for Language Understanding]

[RNN for Joint Language Understanding]

[Sequence-to-Sequence Learning]

[Neural Conversational Model]

[Neural Machine Translation with Attention]

[Summarization with Attention]

[Normalization]
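
The following is a minimal sketch of the Lecture 4 combination of a recurrent encoder with basic dot-product attention over its hidden states. Shapes, the choice of a GRU, and the use of the last hidden state as the query are assumptions for illustration only.

```python
# RNN encoder + basic dot-product attention over its outputs (toy shapes).
import torch
import torch.nn as nn
import torch.nn.functional as F

batch, seq_len, d_in, d_hid = 2, 5, 8, 16
x = torch.randn(batch, seq_len, d_in)             # toy input sequence

rnn = nn.GRU(d_in, d_hid, batch_first=True)
outputs, h_n = rnn(x)                             # outputs: (batch, seq_len, d_hid)

query = h_n[-1]                                   # last hidden state as the query
scores = torch.bmm(outputs, query.unsqueeze(2))   # dot-product scores: (batch, seq_len, 1)
weights = F.softmax(scores, dim=1)                # attention weights over time steps
context = (weights * outputs).sum(dim=1)          # weighted sum: (batch, d_hid)
print(context.shape)                              # torch.Size([2, 16])
```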

A2 2019/03/19 A2: Contextual Embeddings [A2 pages]

Lecture 5 2019/03/26 Word Embeddings [slides] (video)

Contextual Embeddings - ELMo [slides] (video)

Suggested Readings:

[Estimation of Word Representations in Vector Space]

[GloVe: Global Vectors for Word Representation]

[Sequence Tagging with BiLM]

[Learned in Translation: Contextualized Word Vectors]

[ELMo: Embeddings from Language Models]

[More Embeddings]
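
As a small illustration of the Lecture 5 topic, the sketch below looks up word vectors from an embedding table and compares two words with cosine similarity, the basic operation behind word2vec/GloVe-style representations. The vocabulary is a made-up toy example and the vectors are randomly initialized, not pretrained embeddings.

```python
# Embedding lookup and cosine similarity on a toy, randomly initialized table.
import torch
import torch.nn as nn
import torch.nn.functional as F

vocab = {"king": 0, "queen": 1, "apple": 2}       # assumed toy vocabulary
emb = nn.Embedding(num_embeddings=len(vocab), embedding_dim=50)

king = emb(torch.tensor(vocab["king"]))
queen = emb(torch.tensor(vocab["queen"]))
sim = F.cosine_similarity(king, queen, dim=0)     # similarity in vector space
print(sim.item())  # near 0 for random vectors; trained embeddings place related words closer
```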

2019/04/02 Spring Break A1 Due

Lecture 6 2019/04/09 Transformer [slides] (video)


Contextual Embeddings - BERT [slides] (video)


Gating Mechanism [slides] (video)

Suggested Readings:

[Contextual Word Representations Introduction]

[Attention is all you need]

[BERT: Pre-training of Bidirectional Transformers]

[GPT: Improving Understanding by Unsupervised Learning]

[Long Short-Term Memory]

[Gated Recurrent Unit]

[More Transformer]
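
Below is a minimal sketch of scaled dot-product attention, the core operation of the Transformer covered in Lecture 6. It shows a single head with assumed toy shapes and omits masking, multi-head projections, and the rest of the architecture.

```python
# Single-head scaled dot-product attention (toy shapes, no masking).
import math
import torch
import torch.nn.functional as F

batch, seq_len, d_k = 2, 4, 64
q = torch.randn(batch, seq_len, d_k)               # queries
k = torch.randn(batch, seq_len, d_k)               # keys
v = torch.randn(batch, seq_len, d_k)               # values

scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)  # (batch, seq_len, seq_len)
weights = F.softmax(scores, dim=-1)                # attention distribution per query
output = weights @ v                               # (batch, seq_len, d_k)
print(output.shape)
```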

Lecture 7 2019/04/16 Reinforcement Learning Intro [slides] (video)

Basic Q-Learning [slides] (video)

Suggested Readings:

[Reinforcement Learning Intro]

[Stephane Ross' thesis]

[Playing Atari with Deep Reinforcement Learning]

[Deep Reinforcement Learning with Double Q-learning]

[Dueling Network Architectures for Deep Reinforcement Learning]
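
To illustrate the Lecture 7 topic, here is a minimal sketch of the tabular Q-learning update that DQN approximates with a network. The environment transitions and rewards are random stand-ins, not the assignment's game environment.

```python
# Tabular Q-learning with an epsilon-greedy policy on a random toy environment.
import numpy as np

n_states, n_actions = 5, 2
Q = np.zeros((n_states, n_actions))
alpha, gamma, epsilon = 0.1, 0.99, 0.1
rng = np.random.default_rng(0)

state = 0
for step in range(1000):
    # epsilon-greedy action selection
    if rng.random() < epsilon:
        action = int(rng.integers(n_actions))
    else:
        action = int(Q[state].argmax())
    # toy transition and reward (stand-ins for a real environment)
    next_state = int(rng.integers(n_states))
    reward = 1.0 if next_state == n_states - 1 else 0.0
    # Q-learning update: move Q(s, a) toward r + gamma * max_a' Q(s', a')
    Q[state, action] += alpha * (reward + gamma * Q[next_state].max() - Q[state, action])
    state = next_state
```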

A3 2019/04/16 A3: RL for Game Playing [A3 pages]

Lecture 8 2019/04/23 Policy Gradient [slides] (video)

Actor-Critic (video)

More about RL [slides] (video)

Suggested Readings:

[Asynchronous Methods for Deep Reinforcement Learning]

[Deterministic Policy Gradient Algorithms]

[Continuous Control with Deep Reinforcement Learning]
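
As a small illustration of the Lecture 8 topic, the sketch below computes the REINFORCE policy-gradient loss, -log pi(a|s) times the return, for one toy episode. The policy network, episode length, and returns are arbitrary assumptions, not the course code.

```python
# REINFORCE policy-gradient update on a toy episode (assumed shapes and returns).
import torch
import torch.nn as nn

policy = nn.Sequential(nn.Linear(4, 32), nn.ReLU(), nn.Linear(32, 2))
optimizer = torch.optim.Adam(policy.parameters(), lr=1e-3)

states = torch.randn(10, 4)                        # toy episode of 10 states
dist = torch.distributions.Categorical(logits=policy(states))
actions = dist.sample()                            # sample actions from the policy
returns = torch.randn(10)                          # stand-in for discounted returns

loss = -(dist.log_prob(actions) * returns).mean()  # policy gradient objective
optimizer.zero_grad()
loss.backward()
optimizer.step()
```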

A2 Due

Lecture 9 2019/04/30 Generative Adversarial Networks [slides] (video)

(Lectured by Prof. Hung-Yi Lee)

Lecture 10 2019/05/07 Convolutional Neural Networks [slides]
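
For the Lecture 10 topic, here is a minimal sketch of a single convolution plus pooling layer, the basic building blocks of a CNN. The image size and channel counts are assumptions for illustration.

```python
# One convolution + ReLU + max-pooling step on a toy RGB image.
import torch
import torch.nn as nn

x = torch.randn(1, 3, 32, 32)        # one 32x32 RGB image (toy input)
conv = nn.Conv2d(in_channels=3, out_channels=16, kernel_size=3, padding=1)
pool = nn.MaxPool2d(kernel_size=2)

features = pool(torch.relu(conv(x))) # feature maps: (1, 16, 16, 16)
print(features.shape)
```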

A4 2019/05/07 A4: Drawing [A4 pages]

2019/05/14 Break A3 Due

Lecture 11 2019/05/21 Unsupervised Learning [slides]

NLP Examples [slides]

Project Plan [slides]

Special 2019/05/28 Company Workshop Registration: [Google Form]

2019/06/04 Break A4 Due

Lecture 12 2019/06/11 Project Progress Presentation

Course and Career Discussion

Special 2019/06/18 Company Workshop Registration: [Google Form]

Lecture 13 2019/06/25 Final Presentation