| Task | Paper | Tutorial | Key Concepts | Status |
| :--- | :--- | :--- | :--- | :--- |
| Image Captioning | [_Show, Attend, and Tell_](https://arxiv.org/abs/1502.03044) | [a PyTorch Tutorial to Image Captioning](https://github.com/sgrvinod/a-PyTorch-Tutorial-to-Image-Captioning) | • encoder-decoder architecture <br/><br/>• attention <br/><br/>• transfer learning <br/><br/>• beam search | 🟢<br/>*complete* |
| Sequence Labeling | [_Empower Sequence Labeling with Task-Aware Neural Language Model_](https://arxiv.org/abs/1709.04109) | [a PyTorch Tutorial to Sequence Labeling](https://github.com/sgrvinod/a-PyTorch-Tutorial-to-Sequence-Labeling) | • language models <br/><br/>• character RNNs <br/><br/>• multi-task learning <br/><br/>• conditional random fields <br/><br/>• Viterbi decoding <br/><br/>• highway networks | 🟢<br/>*complete* |
| Text Classification | [_Hierarchical Attention Networks for Document Classification_](https://www.semanticscholar.org/paper/Hierarchical-Attention-Networks-for-Document-Yang-Yang/1967ad3ac8a598adc6929e9e6b9682734f789427) | [a PyTorch Tutorial to Text Classification](https://github.com/sgrvinod/a-PyTorch-Tutorial-to-Text-Classification) | • hierarchical attention | 🟡<br/>*code complete* |
| Super-Resolution | [_Photo-Realistic Single Image Super-Resolution Using a Generative Adversarial Network_](https://arxiv.org/abs/1609.04802) | [a PyTorch Tutorial to Super-Resolution](https://github.com/sgrvinod/a-PyTorch-Tutorial-to-Super-Resolution) | • **GANs**! — this is also a GAN tutorial! <br/><br/>• residual connections <br/><br/>• sub-pixel convolution <br/><br/>• perceptual loss | 🟢<br/>*complete* |
| Machine Translation | [_Attention Is All You Need_](https://arxiv.org/abs/1706.03762) | [a PyTorch Tutorial to Machine Translation](https://github.com/sgrvinod/a-PyTorch-Tutorial-to-Machine-Translation) | • **transformers**! — this is also a transformer tutorial! <br/><br/>• multi-head attention <br/><br/>• positional embeddings <br/><br/>• encoder-decoder architecture <br/><br/>• byte pair encoding <br/><br/>• beam search | 🟢<br/>*complete* |
| Semantic Segmentation | [_SegFormer: Simple and Efficient Design for Semantic Segmentation with Transformers_](https://arxiv.org/abs/2105.15203) | a PyTorch Tutorial to Semantic Segmentation | N/A | 🔴<br/>*planned* |
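
As a quick taste of one of the recurring concepts in the table above, here is a minimal PyTorch sketch of sub-pixel convolution, the upscaling trick used in the super-resolution tutorial. This is only an illustrative sketch, not code from that tutorial; the module name, channel count, and scale factor are made up for the example.

```python
# Illustrative sketch of sub-pixel convolution (not taken from the tutorials above).
import torch
import torch.nn as nn


class SubPixelUpsample(nn.Module):
    """Upscale a feature map by `scale` using a convolution followed by pixel shuffle."""

    def __init__(self, channels: int, scale: int = 2):
        super().__init__()
        # The convolution produces scale^2 times as many channels...
        self.conv = nn.Conv2d(channels, channels * scale ** 2, kernel_size=3, padding=1)
        # ...which PixelShuffle rearranges into a spatial grid that is `scale` times larger.
        self.shuffle = nn.PixelShuffle(scale)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.shuffle(self.conv(x))


# Example: a (1, 64, 24, 24) feature map becomes (1, 64, 48, 48).
x = torch.randn(1, 64, 24, 24)
print(SubPixelUpsample(channels=64, scale=2)(x).shape)
```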