Attention in Long Short-Term Memory Recurrent Neural Networks
http://machinelearningmastery.com/attention-long-short-term-memory-recurrent-neural-networks/
MachineLearningMastery.com
Attention in Long Short-Term Memory Recurrent Neural Networks - MachineLearningMastery.com
The Encoder-Decoder architecture is popular because it has demonstrated state-of-the-art results across a range of domains. A limitation of the architecture is that it encodes the input sequence to a fixed-length internal representation. This imposes limits…
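The fixed-length bottleneck described above is what attention relaxes: instead of compressing the whole input into one vector, the decoder re-weights all encoder states at every step. A minimal NumPy sketch of dot-product attention (toy data; the names `enc_states` and `dec_state` are illustrative, not from the article):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attend(enc_states, dec_state):
    """Dot-product attention: score each encoder state against the
    decoder state, normalize, and return the weighted context vector."""
    scores = enc_states @ dec_state   # (T,) one score per input timestep
    weights = softmax(scores)         # attention distribution over inputs
    context = weights @ enc_states    # (H,) weighted sum of encoder states
    return context, weights

rng = np.random.default_rng(0)
enc_states = rng.normal(size=(5, 8))  # 5 timesteps, hidden size 8
dec_state = rng.normal(size=8)
context, weights = attend(enc_states, dec_state)
assert np.isclose(weights.sum(), 1.0)
```

The context vector changes at every decoding step, so long inputs no longer have to fit through a single summary vector.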
Autoencoders in Keras, Part 5: GAN (Generative Adversarial Networks) and TensorFlow
https://habrahabr.ru/post/332000/
Habr
Contents: Part 1: Introduction; Part 2: Manifold learning and latent variables; Part 3: Variational autoencoders (VAE); Part 4: Conditional VAE...
Autoencoders in Keras, Part 6: VAE + GAN
https://habrahabr.ru/post/332074/
Habr
Contents: Part 1: Introduction; Part 2: Manifold learning and latent variables; Part 3: Variational autoencoders (VAE); Part 4: Conditional VAE; Part 5: GAN (Generative Adversarial...
Gentle Introduction to the Adam Optimization Algorithm for Deep Learning
http://machinelearningmastery.com/adam-optimization-algorithm-for-deep-learning/
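For reference, the Adam update rule the article introduces fits in a few lines. A sketch below applies it to the toy objective f(x) = x², which is illustrative and not from the article; the hyperparameters are the defaults from the Adam paper.

```python
import math

def adam_minimize(grad, x, steps=5000, lr=0.01,
                  beta1=0.9, beta2=0.999, eps=1e-8):
    m = v = 0.0
    for t in range(1, steps + 1):
        g = grad(x)
        m = beta1 * m + (1 - beta1) * g        # first-moment (mean) estimate
        v = beta2 * v + (1 - beta2) * g * g    # second-moment (uncentered variance)
        m_hat = m / (1 - beta1 ** t)           # bias correction for zero init
        v_hat = v / (1 - beta2 ** t)
        x -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return x

# Minimize f(x) = x^2 (gradient 2x) starting from x = 3.
x_min = adam_minimize(lambda x: 2 * x, x=3.0)
assert abs(x_min) < 0.05  # settles near the minimum at 0
```

The per-parameter scaling by the second-moment estimate is what distinguishes Adam from plain momentum SGD.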
2nd Place Solution to 2017 Data Science Bowl
https://dhammack.github.io/kaggle-ndsb2017/
dhammack.github.io
2nd Place Solution to 2017 DSB - Daniel Hammack and Julian de Wit
Foreword
Tips for Training Recurrent Neural Networks
http://danijar.com/tips-for-training-recurrent-neural-networks/
How to Visualize Your Recurrent Neural Network with Attention in Keras
https://medium.com/datalogue/attention-in-keras-1892773a4f22
Medium
A technical discussion and tutorial
Improving LSTM-CTC based ASR performance in domains with limited training data
https://github.com/jb1999/eesen/blob/master/papers/LSTM-CTCperf2017.pdf
GitHub
jb1999/eesen
eesen - The official repository of the Eesen project
YellowFin: An automatic tuner for momentum SGD
http://cs.stanford.edu/~zjian/project/YellowFin/
Tensorflow Implementation of PathNet: Evolution Channels Gradient Descent in Super Neural Networks
https://github.com/jaesik817/pathnet
GitHub
jaesik817/pathnet
pathnet - Tensorflow Implementation of PathNet: Evolution Channels Gradient Descent in Super Neural Networks
A TensorFlow implementation of “A neural autoregressive topic model”
http://blog.aylien.com/tensorflow-implementation-neural-autoregressive-topic-model-docnade/
AYLIEN
A TensorFlow implementation of "A neural autoregressive topic model" (DocNADE) - AYLIEN
In this post we give a brief overview of the DocNADE model, and provide a TensorFlow implementation.
When not to use deep learning
http://hyperparameter.space/blog/when-not-to-use-deep-learning/
hyperparameter.space
I know it’s a weird way to start a blog with a negative, but there was a wave of discussion in the last few days that I think serves as a good hook for some topics on which I’ve been thinking recently. It all started with a post in the Simply Stats blog by…
Evaluation code for various automated metrics for Natural Language Generation
https://github.com/Maluuba/nlg-eval
GitHub
GitHub - Maluuba/nlg-eval: Evaluation code for various unsupervised automated metrics for Natural Language Generation.
What are the implications of relational reasoning (as talked about here) on NLP?
https://www.youtube.com/watch?v=vzg5Qe0pTKk&feature=youtu.be
YouTube
DeepMind's AI Learns Superhuman Relational Reasoning | Two Minute Papers #168
The paper "A simple neural network module for relational reasoning" is available here: https://arxiv.org/abs/1706.01427 Details on our Patreon page: https://...
Building a Sound Classifier from scratch using Neural Networks
https://www.skcript.com/svr/building-audio-classifier-nueral-network/
Skcript
How to teach Neural Networks to detect everyday sounds | Skcript
How to build a Sound Classifier from scratch using Neural Networks. Recognizing day-to-day sounds using Artificial Intelligence. Learn more.
Pedestrian Alignment for Person Re-identification
https://github.com/layumi/Pedestrian_Alignment
GitHub
layumi/Pedestrian_Alignment
TCSVT2018 Pedestrian Alignment Network for Large-scale Person Re-identification - layumi/Pedestrian_Alignment