TensorFlow implementation of DeepMind's Interaction Networks for Learning about Objects, Relations and Physics
https://github.com/jaesik817/Interaction-networks_tensorflow
GitHub
jaesik817/Interaction-networks_tensorflow
Interaction-networks_tensorflow - Tensorflow Implementation of Interaction Networks for Learning about Objects, Relations and Physics
[Discussion] Read-through: Hyperparameter Optimization: A Spectral Approach
http://www.alexirpan.com/2017/06/27/hyperparam-spectral.html
Alexirpan
Read-through: Hyperparameter Optimization: A Spectral Approach
Similar to Wasserstein GAN, this is another theory-motivated paper with neat applications to deep learning. Once again, if you are looking for proof details, you are better off reading the original paper. The goal of this post is to give background and motivation.
Dynamic routing in artificial neural networks (ICML 2017)
https://www.youtube.com/watch?v=NHQsDaycwyQ&feature=youtu.be
YouTube
Dynamic Routing in Artificial Neural Networks (Video Abstract)
[Project Resources] ICML 2017 Paper (Preprint): https://arxiv.org/abs/1703.06217 Poster: https://www.dropbox.com/s/svh5610fpfh7np1/drann-poster.pdf?dl=0 Soft...
Two ways to improve model accuracy and reduce training time - explained
https://hackernoon.com/training-your-deep-model-faster-and-sharper-e85076c3b047
Hackernoon
Train your deep model faster and sharper — two novel techniques
A short Conditional DCGAN tensorflow implementation
https://github.com/Eyyub/tensorflow-cdcgan
GitHub
Eyyub/tensorflow-cdcgan
tensorflow-cdcgan - A short Conditional DCGAN tensorflow implementation.
Zero to One — A Ton of Awe-Inspiring Deep Learning Demos with Code for Beginners
https://medium.com/@SamPutnam/deep-learning-download-and-run-a9a1e374d2d9
Medium
Zero to One — A Ton of Awe-Inspiring Deep Learning Demos with Code for Beginners
Check it out!
How to actually build a neural network from blocks? - with notMNIST in Keras [webinar]
https://www.crowdcast.io/e/neural-network-blocks/register
Crowdcast
How to actually build a neural network from blocks? - Crowdcast
Deep learning is about creating machine learning models from composable blocks, called layers. In this webinar you will learn convolutional neural network architectures for image classification.
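For a flavor of the "composable blocks" idea, here is a minimal Keras CNN for 28x28 grayscale images with 10 classes (notMNIST covers the letters A-J). The layer sizes are illustrative and not taken from the webinar:

```python
# A small Keras CNN for 28x28 grayscale inputs, 10 output classes.
# Sizes are illustrative, not the webinar's exact architecture.
from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, Flatten, Dense, Dropout

model = Sequential([
    Conv2D(32, (3, 3), activation='relu', input_shape=(28, 28, 1)),
    MaxPooling2D((2, 2)),
    Conv2D(64, (3, 3), activation='relu'),
    MaxPooling2D((2, 2)),
    Flatten(),
    Dense(128, activation='relu'),
    Dropout(0.5),                       # regularization block
    Dense(10, activation='softmax'),    # one unit per letter A-J
])
model.compile(optimizer='adam', loss='categorical_crossentropy',
              metrics=['accuracy'])
```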
Projected Gradient Descent for finding Max(Min) Eigenvalues
https://sudeepraja.github.io/PGD/
Sudeep Raja
Projected Gradient Descent - Max(Min) Eigenvalues(vectors)
This post is about finding the minimum and maximum eigenvalues and the corresponding eigenvectors of a matrix A using Projected Gradient Descent. There are several algorithms to find the eigenvalues of a given matrix (see Eigenvalue algorithms). Although…
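A minimal NumPy sketch of the idea, assuming the usual formulation: maximize the Rayleigh quotient x'Ax over the unit sphere by gradient ascent, projecting back onto the sphere after each step (run it on -A for the minimum eigenvalue). Step size and iteration count here are illustrative; the post's exact scheme may differ:

```python
import numpy as np

def max_eig_pgd(A, lr=0.1, steps=1000):
    """Largest eigenvalue/eigenvector of symmetric A via projected gradient ascent."""
    x = np.random.randn(A.shape[0])
    x /= np.linalg.norm(x)
    for _ in range(steps):
        x = x + lr * (2 * A @ x)   # gradient of x'Ax is 2Ax
        x /= np.linalg.norm(x)     # project back onto the unit sphere
    return x @ A @ x, x            # Rayleigh quotient approximates lambda_max

A = np.random.randn(5, 5)
A = (A + A.T) / 2                  # symmetrize
lam, v = max_eig_pgd(A)
print(lam, np.max(np.linalg.eigvalsh(A)))  # the two should roughly agree
```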
Modeling documents with Generative Adversarial Networks
http://blog.aylien.com/modeling-documents-generative-adversarial-networks/
AYLIEN
Modeling documents with Generative Adversarial Networks - AYLIEN
In this post I provide a brief overview of the Generative Adversarial Networks paper and walk through some of the code.
Interpreting neurons in an LSTM network
http://yerevann.github.io/2017/06/27/interpreting-neurons-in-an-LSTM-network/
Performance RNN: Generating Music with Expressive Timing and Dynamics
https://magenta.tensorflow.org/performance-rnn
Magenta
Performance RNN: Generating Music with Expressive Timing and Dynamics
We present Performance RNN, an LSTM-based recurrent neural network designed to model polyphonic music with expressive timing and dynamics. Here’s an example...
Using keras-vis to debug self-driving model, toy example.
https://github.com/raghakot/keras-vis/blob/master/applications/self_driving/visualize_attention.ipynb
GitHub
raghakot/keras-vis
Neural network visualization toolkit for keras. Contribute to raghakot/keras-vis development by creating an account on GitHub.
OpenAI open-sources a high-performance Python library for robotic simulation
https://blog.openai.com/faster-robot-simulation-in-python/
OpenAI Blog
Faster Physics in Python
We're open-sourcing a high-performance Python library for robotic simulation using the MuJoCo engine, developed over our past year of robotics research. View CodeView Docs This library is one of our core tools for deep learning robotics research, which we've…
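The library is mujoco-py. A minimal usage sketch, assuming the load_model_from_path / MjSim interface from the release; the model XML path is a placeholder, and MuJoCo itself requires a separate license:

```python
# Minimal rollout sketch with mujoco-py.
# 'humanoid.xml' is a placeholder path, not a bundled asset.
import mujoco_py

model = mujoco_py.load_model_from_path('humanoid.xml')
sim = mujoco_py.MjSim(model)

for _ in range(1000):
    sim.data.ctrl[:] = 0.0   # zero actuation on all actuators
    sim.step()               # advance the physics by one timestep
print(sim.data.qpos)         # generalized joint positions after the rollout
```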
Attention in Long Short-Term Memory Recurrent Neural Networks
http://machinelearningmastery.com/attention-long-short-term-memory-recurrent-neural-networks/
MachineLearningMastery.com
Attention in Long Short-Term Memory Recurrent Neural Networks - MachineLearningMastery.com
The Encoder-Decoder architecture is popular because it has demonstrated state-of-the-art results across a range of domains. A limitation of the architecture is that it encodes the input sequence to a fixed length internal representation. This imposes limits…
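As a concrete illustration of the mechanism the article describes: instead of compressing the whole input into one fixed-length vector, each decoder step scores all encoder hidden states and takes their softmax-weighted sum as a context vector. Dot-product scoring, sketched below in plain NumPy, is one common choice among several the article covers:

```python
import numpy as np

def attend(decoder_state, encoder_states):
    # encoder_states: (timesteps, hidden); decoder_state: (hidden,)
    scores = encoder_states @ decoder_state   # one score per input step
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                  # softmax over timesteps
    context = weights @ encoder_states        # weighted sum = context vector
    return context, weights

enc = np.random.randn(7, 16)   # 7 input steps, hidden size 16
dec = np.random.randn(16)
context, w = attend(dec, enc)  # w shows where the decoder "looks"
```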
Autoencoders in Keras, Part 5: GANs (Generative Adversarial Networks) and TensorFlow
https://habrahabr.ru/post/332000/
Habr
Autoencoders in Keras, Part 5: GANs (Generative Adversarial Networks) and TensorFlow
Contents: Part 1: Introduction. Part 2: Manifold learning and latent variables. Part 3: Variational autoencoders (VAE). Part 4: Conditional VAE...
Autoencoders in Keras, Part 6: VAE + GAN
https://habrahabr.ru/post/332074/
Habr
Autoencoders in Keras, Part 6: VAE + GAN
Contents: Part 1: Introduction. Part 2: Manifold learning and latent variables. Part 3: Variational autoencoders (VAE). Part 4: Conditional VAE. Part 5: GAN (Generative Adversarial...
Gentle Introduction to the Adam Optimization Algorithm for Deep Learning
http://machinelearningmastery.com/adam-optimization-algorithm-for-deep-learning/
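For reference, the Adam update from Kingma & Ba (2015) in plain NumPy, with the usual default hyperparameters:

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update; call with t = 1, 2, ... per step."""
    m = b1 * m + (1 - b1) * grad        # first moment (mean of gradients)
    v = b2 * v + (1 - b2) * grad**2     # second moment (uncentered variance)
    m_hat = m / (1 - b1**t)             # bias correction for the warm-up phase
    v_hat = v / (1 - b2**t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v
```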
2nd Place Solution to 2017 Data Science Bowl
https://dhammack.github.io/kaggle-ndsb2017/
dhammack.github.io
2nd Place Solution to 2017 DSB - Daniel Hammack and Julian de Wit
Foreword
Tips for Training Recurrent Neural Networks
http://danijar.com/tips-for-training-recurrent-neural-networks/
How to Visualize Your Recurrent Neural Network with Attention in Keras
https://medium.com/datalogue/attention-in-keras-1892773a4f22
Medium
How to Visualize Your Recurrent Neural Network with Attention in Keras
A technical discussion and tutorial