Deep Gravity
393 subscribers
60 photos
35 videos
17 files
495 links
AI

Contact:
DeepL.Gravity@gmail.com
P-CapsNets: a General Form of #ConvolutionalNeuralNetworks

We propose Pure CapsNets (P-CapsNets), which are a structural generalization of normal #CNNs. Specifically, we make three modifications to current CapsNets. First, we remove routing procedures from CapsNets based on the observation that the coupling coefficients can be learned implicitly. Second, we replace the convolutional layers in CapsNets to improve efficiency. Third, we package the capsules into rank-3 tensors to further improve efficiency. Experiments show that P-CapsNets achieve better performance than CapsNets with varied routing procedures while using significantly fewer parameters on MNIST and CIFAR-10. The efficiency of P-CapsNets is even comparable to some deep compression models. For example, we achieve more than 99% accuracy on MNIST using only 3,888 parameters. We visualize the capsules as well as the corresponding correlation matrix to show a possible way of initializing CapsNets in the future. We also explore the adversarial robustness of P-CapsNets compared to CNNs.
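To make the rank-3 packaging concrete, here is a minimal, hypothetical PyTorch sketch of a capsule layer with no routing procedure; the tensor shapes and layer names are illustrative assumptions, not the authors' implementation:

```python
import torch
import torch.nn as nn

class NoRoutingCapsLayer(nn.Module):
    """Toy capsule layer: each capsule is packaged as a small rank-3 tensor,
    and the coupling between input and output capsules is absorbed into one
    learned linear map, so no iterative routing is performed."""
    def __init__(self, in_caps, out_caps, caps_shape=(2, 2, 4)):
        super().__init__()
        self.out_caps, self.caps_shape = out_caps, caps_shape
        caps_dim = caps_shape[0] * caps_shape[1] * caps_shape[2]
        self.transform = nn.Linear(in_caps * caps_dim, out_caps * caps_dim)

    def forward(self, x):
        # x: (batch, in_caps, 2, 2, 4) -- a stack of rank-3 capsules
        out = self.transform(x.flatten(1))
        return out.view(x.size(0), self.out_caps, *self.caps_shape)

x = torch.randn(8, 16, 2, 2, 4)          # 16 input capsules per example
layer = NoRoutingCapsLayer(in_caps=16, out_caps=10)
print(layer(x).shape)                     # torch.Size([8, 10, 2, 2, 4])
```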

Paper

🔭 @DeepGravity
DerainCycleGAN: An Attention-guided Unsupervised Benchmark for Single Image Deraining and Rainmaking

Single image deraining (SID) is an important and challenging topic in emerging vision applications, and in recent years most deraining methods have been supervised, relying on ground truth (i.e., paired images). In practice, however, it is common to have no paired images in real deraining tasks; in such cases, removing rain streaks in an unsupervised way is very challenging because of the lack of constraints between images, which leads to low-quality recovery results. In this paper, we explore the unsupervised SID task using unpaired data and propose a novel network called Attention-guided Deraining by Constrained CycleGAN (DerainCycleGAN for short), which fully utilizes the constrained transfer learning ability and circulatory structure of #CycleGAN. Specifically, we design an unsupervised attention-guided rain streak extractor (U-ARSE) that utilizes a memory to extract rain streak masks with two constrained cycle-consistency branches jointly, by paying attention to both the rainy and rain-free image domains. As a by-product, we also contribute a new paired rain image dataset called Rain200A, which is constructed automatically by our network. Compared with existing synthetic datasets, the rain streaks in Rain200A have more obvious and diverse shapes and directions. As a result, existing supervised methods trained on Rain200A perform much better on real rainy images. Extensive experiments on synthetic and real datasets show that our network is superior to existing unsupervised deraining networks and is also very competitive with related supervised networks.
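A minimal sketch of the two cycle-consistency branches behind this kind of unpaired deraining, assuming generic CycleGAN-style generators (the tiny networks below are placeholders, not the paper's U-ARSE architecture):

```python
import torch
import torch.nn as nn

def tiny_generator():
    # Placeholder, shape-preserving generator (not the paper's architecture).
    return nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
                         nn.Conv2d(16, 3, 3, padding=1))

G_derain = tiny_generator()        # rainy     -> rain-free
G_rain   = tiny_generator()        # rain-free -> rainy ("rainmaking")
l1 = nn.L1Loss()

rainy = torch.rand(2, 3, 64, 64)   # unpaired rainy images
clean = torch.rand(2, 3, 64, 64)   # unpaired rain-free images

# Branch 1: rainy -> derained -> re-rained should reproduce the rainy input.
cycle_rain = l1(G_rain(G_derain(rainy)), rainy)
# Branch 2: clean -> rained -> derained should reproduce the clean input.
cycle_clean = l1(G_derain(G_rain(clean)), clean)

cycle_loss = cycle_rain + cycle_clean   # combined with adversarial losses in training
print(float(cycle_loss))
```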

Paper

🔭 @DeepGravity
17 #Statistical Hypothesis Tests in #Python (Cheat Sheet)

Link

🔭 @DeepGravity
Progressive #VAE Training on Highly Sparse and Imbalanced Data

In this paper, we present a novel approach for training a #VariationalAutoencoder (VAE) on a highly imbalanced data set. The proposed training of a high-resolution VAE model begins with the training of a low-resolution core model, which can be successfully trained on the imbalanced data set. In subsequent training steps, new convolutional, upsampling, deconvolutional, and downsampling layers are iteratively attached to the model. In each iteration, the additional layers are trained on top of the intermediate pretrained model, the result of the previous training iterations. Thus, the resolution of the model is progressively increased up to the required resolution level. In this paper, progressive VAE training is exploited for learning a latent representation from imbalanced, highly sparse data sets and, consequently, for generating routes in a constrained 2D space. Routing problems (e.g., the vehicle routing problem, the travelling salesman problem, and arc routing) are of special significance in many modern applications (e.g., route planning, network maintenance, and developing high-performance nanoelectronic systems) and are typically associated with sparse, imbalanced data. In this paper, the critical problem of routing billions of components in nanoelectronic devices is considered. The proposed approach exhibits a significant training speedup compared with state-of-the-art VAE training methods, while generating the expected image outputs from unseen input data. Furthermore, the final progressive VAE models produce much more precise output representations than #GenerativeAdversarialNetwork ( #GAN ) models trained with comparable training time. The proposed method is expected to be applicable to a wide range of applications, including but not limited to image inpainting, sentence interpolation, and semi-supervised learning.
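A rough sketch of the progressive-growth idea, assuming a toy low-resolution core VAE that gets wrapped by one extra downsampling/upsampling stage per training phase (the shapes and modules are illustrative, not the paper's architecture):

```python
import torch
import torch.nn as nn

class CoreVAE(nn.Module):
    """Low-resolution core trained first (8x8 inputs in this toy)."""
    def __init__(self, latent=16):
        super().__init__()
        self.enc = nn.Sequential(nn.Flatten(), nn.Linear(8 * 8, 2 * latent))
        self.dec = nn.Sequential(nn.Linear(latent, 8 * 8), nn.Unflatten(1, (1, 8, 8)))

    def forward(self, x):
        mu, logvar = self.enc(x).chunk(2, dim=1)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()   # reparameterization
        return self.dec(z), mu, logvar

class GrowStage(nn.Module):
    """Wraps a pretrained model with one additional resolution level."""
    def __init__(self, pretrained):
        super().__init__()
        self.down = nn.Conv2d(1, 1, 4, stride=2, padding=1)            # new encoder layer
        self.core = pretrained                                          # intermediate pretrained model
        self.up = nn.ConvTranspose2d(1, 1, 4, stride=2, padding=1)      # new decoder layer

    def forward(self, x):
        recon, mu, logvar = self.core(self.down(x))
        return self.up(recon), mu, logvar

model = CoreVAE()              # phase 1: train on 8x8 data
model = GrowStage(model)       # phase 2: train on 16x16 data
model = GrowStage(model)       # phase 3: train on 32x32 data
out, mu, logvar = model(torch.rand(4, 1, 32, 32))
print(out.shape)               # torch.Size([4, 1, 32, 32])
```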

Paper

🔭 @DeepGravity
Image Data Augmentation for #TensorFlow 2, #Keras and #PyTorch with Albumentations in #Python

TL;DR Learn how to create new examples for your dataset using image augmentation techniques. Load a scanned document image and apply various augmentations. Create an augmented dataset for Object Detection.
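A minimal Albumentations pipeline in the spirit of the article (the input file name and the particular transforms are placeholder assumptions):

```python
import albumentations as A
import cv2

# Small augmentation pipeline suitable for scanned documents.
transform = A.Compose([
    A.Rotate(limit=5, p=0.5),                # slight skew, like an imperfect scan
    A.RandomBrightnessContrast(p=0.5),       # lighting variation
    A.GaussNoise(p=0.3),                     # scanner/sensor noise
])

image = cv2.imread("scanned_document.png")   # hypothetical input image
image = cv2.cvtColor(image, cv2.COLOR_BGR2RGB)

augmented = transform(image=image)["image"]  # one new training example
```

For object detection, A.Compose also accepts bbox_params=A.BboxParams(...), so bounding boxes are transformed together with the image.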

Article

🔭 @DeepGravity
A Gentle Introduction to Imbalanced #Classification

After completing this tutorial, you will know:

* Imbalanced classification is the problem of classification when there is an unequal distribution of classes in the training dataset.
* The imbalance in the class distribution may vary, but a severe imbalance is more challenging to model and may require specialized techniques.
* Many real-world classification problems have an imbalanced class distribution, such as fraud detection, spam detection, and churn prediction (see the sketch below).
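As a quick illustration (a generic scikit-learn sketch, not code from the tutorial), a severely imbalanced binary data set can be generated like this:

```python
from collections import Counter
from sklearn.datasets import make_classification

# Binary problem with a roughly 99:1 class distribution.
X, y = make_classification(n_samples=10_000, n_classes=2,
                           weights=[0.99, 0.01],
                           flip_y=0, random_state=1)
print(Counter(y))   # roughly Counter({0: 9900, 1: 100})
```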

Link

🔭 @DeepGravity
Deep Gravity pinned «Our reinforcement learning architect designs have just been published in the #NeurIPS2019 AI Art Gallery: https://lnkd.in/dFZ37BN It seems to draw like a baby now, but it is growing and will hopefully be a skillful #RL artist very soon. #reinforcementlearning…»
5 #Financial Services Tech Trends to Watch in 2020

1. The Role of #ArtificialIntelligence in Finance Will Expand
2. Financial Services Firms Will Grow Their Use of #Data Analytics
3. #Blockchain Will Be a Key Security Solution
4. More #Bank Branches Will Undergo Digital Transformations
5. Automation Will Take Over More Financial Services

Link

🔭 @DeepGravity
Inside #DeepMind 's epic mission to solve science's trickiest problem
DeepMind's AI has beaten chess grandmasters and #Go champions. But founder and CEO Demis Hassabis now has his sights set on bigger, real-world problems that could change lives

Link

🔭 @DeepGravity
Explaining #ReinforcementLearning: Active vs Passive

We examine the required elements to solve an RL problem, compare passive and active reinforcement learning, and review common active and passive RL techniques.
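A toy sketch of the contrast, assuming a made-up five-state chain environment: passive RL only evaluates a fixed policy with TD(0), while active RL also chooses its actions via epsilon-greedy Q-learning:

```python
import random

# Toy contrast on a five-state chain (environment and numbers are made up).
N_STATES, GOAL, GAMMA, ALPHA, EPS = 5, 4, 0.9, 0.1, 0.1

def step(s, a):                      # a is -1 (left) or +1 (right)
    s2 = max(0, min(N_STATES - 1, s + a))
    return s2, (1.0 if s2 == GOAL else 0.0)

# Passive RL: the policy is fixed (always move right); we merely *evaluate*
# it with TD(0), learning the state values V(s) under that policy.
V = [0.0] * N_STATES
for _ in range(500):
    s = 0
    while s != GOAL:
        s2, r = step(s, +1)
        V[s] += ALPHA * (r + GAMMA * V[s2] - V[s])
        s = s2

# Active RL: the agent also *chooses* actions (epsilon-greedy Q-learning),
# so it learns action values and, implicitly, a policy.
Q = {(s, a): 0.0 for s in range(N_STATES) for a in (-1, +1)}

def greedy(s):
    # Random tie-breaking so early episodes still explore both directions.
    return max((-1, +1), key=lambda a: (Q[(s, a)], random.random()))

for _ in range(500):
    s = 0
    while s != GOAL:
        a = random.choice((-1, +1)) if random.random() < EPS else greedy(s)
        s2, r = step(s, a)
        Q[(s, a)] += ALPHA * (r + GAMMA * max(Q[(s2, -1)], Q[(s2, +1)]) - Q[(s, a)])
        s = s2

print([round(v, 2) for v in V])                  # values under the fixed policy
print(max((-1, +1), key=lambda a: Q[(0, a)]))    # learned first action, should be +1
```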

Article

🔭 @DeepGravity
Prediction of Physical Load Level by #MachineLearning Analysis of Heart Activity after Exercises

Paper

🔭 @DeepGravity
Triple #GenerativeAdversarialNetworks

Generative adversarial networks (GANs) have shown promise in image generation and classification given limited supervision. Existing methods extend the unsupervised GAN framework to incorporate supervision heuristically. Specifically, a single discriminator plays the two incompatible roles of identifying fake samples and predicting labels, and it only estimates the data distribution without considering the labels. This formulation intrinsically causes two problems: (1) the generator and the discriminator (i.e., the classifier) may not converge to the data distribution at the same time; and (2) the generator cannot control the semantics of the generated samples. In this paper, we present the triple generative adversarial network (Triple-GAN), which consists of three players: a generator, a classifier, and a discriminator. The generator and the classifier characterize the conditional distributions between images and labels, and the discriminator solely focuses on identifying fake image-label pairs. We design compatible objective functions to ensure that the distributions characterized by the generator and the classifier converge to the data distribution. We evaluate Triple-GAN in two challenging settings, namely semi-supervised learning and the extreme low-data regime. In both settings, Triple-GAN achieves state-of-the-art classification results among deep generative models and simultaneously generates meaningful samples for a specific class.
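A toy sketch of the three-player structure described above (the networks, shapes, and losses are illustrative assumptions; the paper's full objective also includes supervised terms for the classifier):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

DIM, N_CLASSES, Z = 32, 10, 16

G = nn.Sequential(nn.Linear(Z + N_CLASSES, DIM))    # generator: (z, y) -> x
C = nn.Sequential(nn.Linear(DIM, N_CLASSES))        # classifier: x -> label logits
D = nn.Sequential(nn.Linear(DIM + N_CLASSES, 1))    # discriminator: (x, y) -> real/fake logit

x_real = torch.randn(8, DIM)
y_real = F.one_hot(torch.randint(0, N_CLASSES, (8,)), N_CLASSES).float()

# Generator draws a fake sample conditioned on a label.
z = torch.randn(8, Z)
x_fake = G(torch.cat([z, y_real], dim=1))

# Classifier proposes labels for (unlabeled) real samples.
y_pred = F.gumbel_softmax(C(x_real), hard=True)

# Discriminator only judges whether an (image, label) *pair* looks real.
d_real  = D(torch.cat([x_real, y_real], dim=1))
d_fakeG = D(torch.cat([x_fake, y_real], dim=1))   # pair produced by the generator
d_fakeC = D(torch.cat([x_real, y_pred], dim=1))   # pair produced by the classifier

bce = nn.BCEWithLogitsLoss()
loss_D = (bce(d_real, torch.ones_like(d_real)) +
          bce(d_fakeG, torch.zeros_like(d_fakeG)) +
          bce(d_fakeC, torch.zeros_like(d_fakeC)))
loss_G = bce(d_fakeG, torch.ones_like(d_fakeG))   # fool D with generated pairs
loss_C = bce(d_fakeC, torch.ones_like(d_fakeC))   # fool D with predicted labels
```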

Paper

🔭 @DeepGravity
Hyperparameter Tuning On #Google Cloud Platform With #Scikit_Learn

Google Cloud Platform's AI Platform (formerly ML Engine) offers a hyperparameter tuning service for your models. Why should you take the time to learn how to use it instead of just running the code you already have on a virtual machine? Are the benefits worth the extra effort?

Link

🔭 @DeepGravity