MatchGAN: A Self-supervised Semi-supervised Conditional Generative Adversarial Network
Extensive empirical evaluation demonstrates the effectiveness of our proposed method over competitive baselines and prior state-of-the-art methods. In particular, our method surpasses the baseline while using only 20% of the labelled examples used to train the baseline.
Github: https://github.com/justin941208/MatchGAN
Paper: https://arxiv.org/abs/2006.06614v1
Building effective FAQ with Knowledge Bases, BERT and Sentence Clustering
https://towardsdatascience.com/building-effective-faq-with-knowledge-bases-bert-and-sentence-clustering-b0c15727bbdb
How to identify and expose the knowledge that matters
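A minimal sketch of the general recipe the article's title points at (BERT-style sentence embeddings plus clustering to surface FAQ candidates), assuming the sentence-transformers and scikit-learn packages and a hypothetical list of user questions; the article's own pipeline may differ.

```python
# Hedged sketch: group similar user questions into FAQ candidates by
# embedding them with a sentence-BERT encoder and clustering the embeddings.
from sentence_transformers import SentenceTransformer  # assumed dependency
from sklearn.cluster import KMeans

questions = [  # hypothetical user questions
    "How do I reset my password?",
    "I forgot my password, what now?",
    "Where can I download my invoice?",
    "How do I get a copy of my bill?",
]

encoder = SentenceTransformer("all-MiniLM-L6-v2")  # any sentence-BERT model works here
embeddings = encoder.encode(questions)

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(embeddings)
for label, question in sorted(zip(labels, questions)):
    print(label, question)  # each cluster is a candidate FAQ topic
```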
Fast Gradient Boosting with CatBoost
https://heartbeat.fritz.ai/fast-gradient-boosting-with-catboost-38779b0d5d9a
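For context, a minimal CatBoost usage sketch on synthetic data; the parameters and dataset here are illustrative and not taken from the article.

```python
# Illustrative CatBoost classifier on synthetic data.
from catboost import CatBoostClassifier
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = CatBoostClassifier(iterations=200, depth=6, learning_rate=0.1, verbose=False)
model.fit(X_train, y_train)
print("test accuracy:", model.score(X_test, y_test))
```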
Using Selective Attention in Reinforcement Learning Agents
http://ai.googleblog.com/2020/06/using-selective-attention-in.html
Fourier Features Let Networks Learn High Frequency Functions in Low Dimensional Domains
https://people.eecs.berkeley.edu/~bmild/fourfeat/index.html
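The core idea is a random Fourier feature mapping applied to low-dimensional coordinates before feeding them to an MLP; a minimal NumPy sketch of that mapping, with the frequency scale chosen arbitrarily here:

```python
# Random Fourier feature mapping gamma(v) = [cos(2*pi*Bv), sin(2*pi*Bv)],
# where B is a fixed random Gaussian matrix; scale controls the bandwidth.
import numpy as np

rng = np.random.default_rng(0)
d, num_features, scale = 2, 256, 10.0                # input dim, mapping size, frequency scale
B = rng.normal(scale=scale, size=(d, num_features))  # sampled once, then kept fixed

def fourier_features(v):
    """Map coordinates v of shape (n, d) to 2 * num_features Fourier features."""
    proj = 2 * np.pi * v @ B
    return np.concatenate([np.cos(proj), np.sin(proj)], axis=-1)

coords = rng.random((4, d))             # e.g. (x, y) pixel coordinates in [0, 1]^2
print(fourier_features(coords).shape)   # (4, 512); this goes into a standard MLP
```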
Neural Manifold Ordinary Differential Equations
Article: https://arxiv.org/abs/2006.10254
Github: https://github.com/CUVL/Neural-Manifold-Ordinary-Differential-Equations
How to Avoid Data Leakage When Performing Data Preparation
https://machinelearningmastery.com/data-preparation-without-data-leakage/
Data preparation is the process of transforming raw data into a form that is appropriate for modeling. A naive approach to preparing data applies the transform on the entire dataset before evaluating the performance of the model. This results in a problem…
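A short sketch of the fix the article advocates: fit preprocessing on training data only, for example by wrapping it in a scikit-learn Pipeline so cross-validation refits the scaler inside each fold (the dataset here is synthetic, for illustration only).

```python
# Leakage-free preparation: the scaler is fit on each CV training fold,
# never on the full dataset, because it lives inside the Pipeline.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

pipeline = Pipeline([
    ("scale", StandardScaler()),      # fit on training folds only
    ("model", LogisticRegression()),
])
scores = cross_val_score(pipeline, X, y, cv=5)
print("mean accuracy:", scores.mean())
```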
PULSE: Self-Supervised Photo Upsampling via Latent Space Exploration of Generative Models
Given a low-resolution input image, PULSE searches the outputs of a generative model for high-resolution images that are perceptually realistic and downscale correctly.
Github: https://github.com/adamian98/pulse
Paper: https://arxiv.org/abs/2003.03808v1
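A heavily simplified sketch of that search loop, assuming a hypothetical pretrained generator G and plain bicubic downscaling; the actual PULSE method additionally constrains the latents (and uses StyleGAN specifics), so treat this as an illustration of the idea, not the paper's implementation.

```python
# Simplified latent-space search in the spirit of PULSE: optimize a latent code
# so the generated high-resolution image downscales to the low-resolution input.
import torch
import torch.nn.functional as F

def latent_search(G, lr_image, latent_dim=512, steps=500, lr=0.1):
    """G: hypothetical pretrained generator mapping (1, latent_dim) -> (1, 3, H, W)."""
    z = torch.randn(1, latent_dim, requires_grad=True)
    opt = torch.optim.Adam([z], lr=lr)
    for _ in range(steps):
        hr = G(z)                                         # candidate high-res image
        down = F.interpolate(hr, size=lr_image.shape[-2:],
                             mode="bicubic", align_corners=False)
        loss = F.mse_loss(down, lr_image)                 # "downscales correctly"
        opt.zero_grad()
        loss.backward()
        opt.step()
    return G(z).detach()                                  # perceptually realistic candidate
```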
Google & DeepMind Researchers Revamp ImageNet
https://syncedreview.com/2020/06/23/google-deepmind-researchers-revamp-imagenet/
Paper: https://arxiv.org/pdf/2006.07159.pdf
Google Brain in Zürich and DeepMind London researchers believe one of the world's most popular image databases may need a makeover.
A state-of-the-art, self-supervised framework for video understanding
https://ai.facebook.com/blog/a-state-of-the-art-self-supervised-framework-for-video-understanding/
Generalized Data Transformations give us a systematic way of robustly learning the relationship between audio and visual information in order to learn about the structure of the world.
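The post does not spell out the exact objective; the usual recipe for this kind of cross-modal self-supervision is a contrastive loss that pulls matching audio/video clip embeddings together. Below is a generic InfoNCE-style sketch, not the paper's GDT formulation.

```python
# Generic cross-modal contrastive (InfoNCE-style) loss: audio and video
# embeddings from the same clip are positives, all other pairs are negatives.
import torch
import torch.nn.functional as F

def cross_modal_nce(video_emb, audio_emb, temperature=0.07):
    """video_emb, audio_emb: (batch, dim) embeddings of the same clips, aligned by row."""
    v = F.normalize(video_emb, dim=-1)
    a = F.normalize(audio_emb, dim=-1)
    logits = v @ a.t() / temperature           # (batch, batch) similarity matrix
    targets = torch.arange(v.size(0))          # positives lie on the diagonal
    return 0.5 * (F.cross_entropy(logits, targets) +
                  F.cross_entropy(logits.t(), targets))

video_emb, audio_emb = torch.randn(8, 256), torch.randn(8, 256)
print(cross_modal_nce(video_emb, audio_emb))
```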
Learning Semantically Enhanced Feature for Fine-Grained Image Classification
Github: https://github.com/cswluo/SEF
Paper: https://arxiv.org/abs/2006.13457v1
Building AI Trading Systems
Lessons learned building a profitable algorithmic trading system using Reinforcement Learning techniques.
https://dennybritz.com/blog/ai-trading/
Tensor Programs II: Neural Tangent Kernel for Any Architecture
Shows that the tangent kernel of any randomly initialized neural network converges to a deterministic kernel in the large-width limit.
Github: https://github.com/thegregyang/NTK4A
Paper: https://arxiv.org/abs/2006.14548
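For reference, the empirical tangent kernel at two inputs is the inner product of parameter gradients, K(x, x') = <grad_theta f(x), grad_theta f(x')>; a small PyTorch sketch with an arbitrary MLP and width, roughly illustrating the quantity whose large-width limit the paper characterizes.

```python
# Empirical neural tangent kernel of a small MLP at two inputs.
import torch
import torch.nn as nn

def empirical_ntk(model, x1, x2):
    """Inner product of parameter gradients at two inputs (scalar-output model)."""
    def grads(x):
        out = model(x).sum()
        return torch.autograd.grad(out, list(model.parameters()))
    g1, g2 = grads(x1), grads(x2)
    return sum((a * b).sum() for a, b in zip(g1, g2))

width = 1024  # larger widths concentrate the kernel around its deterministic limit
model = nn.Sequential(nn.Linear(8, width), nn.ReLU(), nn.Linear(width, 1))
x1, x2 = torch.randn(1, 8), torch.randn(1, 8)
print(empirical_ntk(model, x1, x2))
```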
8 Top Books on Data Cleaning and Feature Engineering
https://machinelearningmastery.com/books-on-data-cleaning-data-preparation-and-feature-engineering/
SmartReply for YouTube Creators
https://ai.googleblog.com/2020/07/smartreply-for-youtube-creators.html
Announcing CUDA on Windows Subsystem for Linux 2
https://developer.nvidia.com/blog/announcing-cuda-on-windows-subsystem-for-linux-2/
In response to popular demand, Microsoft announced a new feature of the Windows Subsystem for Linux 2 (WSL 2)—GPU acceleration—at the Build conference in May 2020. This feature opens the gate for many…