Deep Gravity
393 subscribers
60 photos
35 videos
17 files
495 links
AI

Contact: DeepL.Gravity@gmail.com
Hyperparameter Tuning On #Google Cloud Platform With #Scikit_Learn

Google Cloud Platform’s AI Platform (formerly ML Engine) offers a hyperparameter tuning service for your models. Why learn to use it instead of just running the code you already have on a virtual machine? Are the benefits worth the extra time and effort?
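
To make the integration concrete, here is a minimal sketch of a scikit-learn trainer that the tuning service could drive, assuming the cloudml-hypertune helper package; the parameter names, dataset, and metric tag are illustrative, not a prescribed setup.

```python
# Minimal sketch of a trainer driven by AI Platform's tuning service,
# assuming the cloudml-hypertune helper package. Parameter names and the
# dataset are illustrative.
import argparse

import hypertune
from sklearn.datasets import load_digits
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

parser = argparse.ArgumentParser()
parser.add_argument('--n_estimators', type=int, default=100)  # set per trial
parser.add_argument('--max_depth', type=int, default=8)       # set per trial
args = parser.parse_args()

X, y = load_digits(return_X_y=True)
model = RandomForestClassifier(n_estimators=args.n_estimators,
                               max_depth=args.max_depth)
accuracy = cross_val_score(model, X, y, cv=3).mean()

# Report the metric back so the service can pick the next trial's values.
hypertune.HyperTune().report_hyperparameter_tuning_metric(
    hyperparameter_metric_tag='accuracy',
    metric_value=accuracy)
```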

Link

🔭 @DeepGravity
secml: A #Python Library for Secure and Explainable #MachineLearning

We present secml, an open-source Python library for secure and explainable machine learning. It implements the most popular attacks against machine learning, including not only test-time evasion attacks to generate adversarial examples against deep neural networks, but also training-time poisoning attacks against support vector machines and many other algorithms. These attacks enable evaluating the security of learning algorithms and of the corresponding defenses under both white-box and black-box threat models. To this end, secml provides built-in functions to compute security evaluation curves, showing how quickly classification performance decreases against increasing adversarial perturbations of the input data. secml also includes explainability methods to help understand why adversarial attacks succeed against a given model, by visualizing the most influential features and training prototypes contributing to each decision. It is distributed under the Apache License 2.0, and hosted at https://gitlab.com/secml/secml.
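
The security evaluation curve idea is easy to see outside the library too. Below is a generic illustration, deliberately not secml's API: a linear SVM is evaded with a gradient-based perturbation of increasing budget, and accuracy is recorded at each budget.

```python
# Generic illustration of a security evaluation curve; this is NOT secml's
# API. A linear SVM is evaded with a gradient step of growing L2 budget eps.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import LinearSVC

X, y = make_classification(n_samples=500, n_features=20, random_state=0)
clf = LinearSVC(dual=False).fit(X, y)

w = clf.coef_.ravel()
direction = w / np.linalg.norm(w)  # steepest direction of the linear model

for eps in [0.0, 0.5, 1.0, 2.0, 4.0]:
    # Push every point against its current decision side with budget eps.
    X_adv = X - np.sign(clf.decision_function(X))[:, None] * eps * direction
    print(f'eps={eps:.1f}  accuracy={clf.score(X_adv, y):.3f}')
```

Plotting accuracy against eps gives exactly the kind of security evaluation curve the abstract describes.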

Paper

🔭 @DeepGravity
A Gentle Introduction to #ProbabilityDensityEstimation

After completing this tutorial, you will know:

* Histogram plots provide a fast and reliable way to visualize the probability density of a data sample.
* Parametric probability density estimation involves selecting a common distribution and estimating the parameters for the density function from a data sample.
* Nonparametric probability density estimation involves using a technique to fit a model to the arbitrary distribution of the data, like kernel density estimation (both approaches are contrasted in the sketch after this list).
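
As a minimal sketch of that contrast (the distribution, bandwidth, and evaluation grid are illustrative): fit a Gaussian's two parameters for the parametric estimate, and a kernel density estimator for the nonparametric one.

```python
# Minimal sketch contrasting the two approaches on one sample; the
# distribution, bandwidth, and grid are illustrative.
import numpy as np
from scipy.stats import norm
from sklearn.neighbors import KernelDensity

sample = np.random.normal(loc=50, scale=5, size=1000)
grid = np.linspace(30, 70, 100)

# Parametric: assume a Gaussian and estimate its two parameters.
mu, sigma = sample.mean(), sample.std()
parametric_pdf = norm.pdf(grid, mu, sigma)

# Nonparametric: kernel density estimation, no distribution assumed.
kde = KernelDensity(kernel='gaussian', bandwidth=2.0).fit(sample[:, None])
kde_pdf = np.exp(kde.score_samples(grid[:, None]))  # score_samples is log-density
```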

Link

🔭 @DeepGravity
#Evolution of #NeuralNetworks


Today, #AI is living its golden age, and neural networks are a major contributor to it. Neural networks change our lives without us even realizing it: they lie behind image, face, and speech recognition, language translation, and even forecasting. However, they did not reach their present form in a day. Let’s travel to the past and trace their earlier forms.

Link

🔭 @DeepGravity
unnamed.gif
11.3 MB
#MultiAgent Manipulation via Locomotion using Hierarchical Sim2Real

Link

🔭 @DeepGravity
Model-Based #ReinforcementLearning: Theory and Practice

Article

#Berkeley

🔭 @DeepGravity
Positive-Unlabeled #RewardLearning

Learning #Reward functions from data is a promising path towards achieving scalable #ReinforcementLearning ( #RL ) for #robotics. However, a major challenge in training agents from learned reward models is that the agent can learn to exploit errors in the reward model to achieve high reward with behaviors that do not correspond to the intended task. These reward delusions can lead to unintended and even dangerous behaviors. On the other hand, adversarial imitation learning frameworks tend to suffer the opposite problem, where the discriminator learns to trivially distinguish agent and expert behavior, resulting in reward models that produce a low reward signal regardless of the input state. In this paper, we connect these two classes of reward learning methods to positive-unlabeled (PU) learning, and we show that by applying a large-scale PU learning algorithm to the reward learning problem, we can address both the reward under- and over-estimation problems simultaneously. Our approach drastically improves both GAIL and supervised reward learning, without any additional assumptions.
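
The PU connection rests on a standard estimator. The sketch below shows the non-negative PU risk of Kiryo et al. (2017) in plain NumPy as one concrete instance of large-scale PU learning; the function names, class prior, and toy scores are illustrative, not the paper's exact algorithm.

```python
# Non-negative PU risk (Kiryo et al., 2017) in plain NumPy; names, prior,
# and toy scores are illustrative, not the paper's exact algorithm.
import numpy as np

def sigmoid_loss(scores, label):
    # Smooth surrogate for the 0-1 loss; label is +1 or -1.
    return 1.0 / (1.0 + np.exp(label * scores))

def nn_pu_risk(scores_pos, scores_unl, prior):
    # Positive data estimates the positive-class risk directly.
    risk_pos = prior * sigmoid_loss(scores_pos, +1).mean()
    # Unlabeled data, corrected by the class prior, estimates the
    # negative-class risk.
    risk_neg = (sigmoid_loss(scores_unl, -1).mean()
                - prior * sigmoid_loss(scores_pos, -1).mean())
    # Clamping at zero keeps the corrected estimate from going negative,
    # which is what makes the estimator "non-negative".
    return risk_pos + max(0.0, risk_neg)

scores_pos = np.random.randn(100) + 1.0  # e.g. scores on expert (positive) data
scores_unl = np.random.randn(100)        # scores on agent (unlabeled) data
print(nn_pu_risk(scores_pos, scores_unl, prior=0.5))
```

In this framing, the expert data plays the role of the positives and the agent's experience is treated as unlabeled rather than negative.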

Paper

🔭 @DeepGravity
Yoshua #Bengio, Revered Architect of #AI, Has Some Ideas About What to Build Next

Article

🔭 @DeepGravity
AI Debate 2019: Yoshua Bengio vs Gary Marcus

This is an #AI Debate between Yoshua #Bengio and #GaryMarcus from Dec 23, 2019, organized by Montreal.AI and Mila - Institut Québécois d'Intelligence Artificielle.
Facebook video: https://www.facebook.com/MontrealAI/v...
Reading material: http://www.montreal.ai/aidebate.pdf

YouTube

🔭 @DeepGravity
#TensorNetworks in #NeuralNetworks

Here, we have a small toy example of how to use a TN inside of a fully connected neural network.
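
As a library-free sketch of the same idea (not the Colab's code; shapes and bond rank are illustrative), a dense layer's weight matrix can be replaced by two contracted tensor cores, cutting the parameter count from d⁴ to 2·d²·rank.

```python
# Library-free sketch (not the Colab's code): a dense layer's (d*d x d*d)
# weight matrix replaced by two contracted cores. Shapes/rank illustrative.
import numpy as np

d, rank = 32, 8                       # view the 1024-d input as a 32x32 grid
core1 = np.random.randn(d, d, rank)   # (input idx 1, output idx 1, bond)
core2 = np.random.randn(rank, d, d)   # (bond, input idx 2, output idx 2)

def tn_layer(x):
    # x: batch of flat 1024-d vectors, reshaped into d x d grids.
    x = x.reshape(-1, d, d)
    # Two small contractions instead of one huge matrix multiply:
    # parameters drop from d**4 to 2 * d*d*rank.
    h = np.einsum('bij,iar->bjar', x, core1)    # contract first input index
    out = np.einsum('bjar,rjc->bac', h, core2)  # contract bond + second index
    return out.reshape(-1, d * d)

print(tn_layer(np.random.randn(4, d * d)).shape)  # (4, 1024)
```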

Colab

🔭 @DeepGravity
#MNIST - Exploration to Execution

Outline:
* Understanding the stats/distribution of the data set
* Dimensionality reduction visualization (see the sketch after this list)
* Finding and fine-tuning the best model
* Optimizer comparisons on the data set
* Understanding the trained weight distributions
* Trained-model gradient visualization
* Visualizing the trained hidden layers
* GAN training
* Transfer learning on MNIST
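
A minimal sketch of the first two steps (an assumed approach, not the linked article's code; scikit-learn's small 8×8 digits set stands in for MNIST):

```python
# Sketch of the first two outline steps (assumed approach, not the article's
# code); scikit-learn's 8x8 digits set stands in for MNIST.
import matplotlib.pyplot as plt
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA

X, y = load_digits(return_X_y=True)
print('samples:', X.shape[0], 'features per sample:', X.shape[1])

# 2-D PCA projection, colored by digit class.
coords = PCA(n_components=2).fit_transform(X)
plt.scatter(coords[:, 0], coords[:, 1], c=y, cmap='tab10', s=5)
plt.title('PCA projection of the digit classes')
plt.show()
```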

Link

🔭 @DeepGravity
Stroke of Genius: #GauGAN Turns Doodles into Stunning, Photorealistic Landscapes

Link

🔭 @DeepGravity