Deep Gravity – Telegram
AI

Contact: DeepL.Gravity@gmail.com
How to Develop a #Pix2Pix #GAN for Image-to-Image Translation

Link to the paper
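Not the tutorial's code, but a minimal PyTorch sketch of the core Pix2Pix training step it builds up to: a conditional adversarial loss plus an L1 reconstruction term. The gen and disc modules and the two-argument discriminator interface are placeholders here, not the tutorial's models.

import torch
import torch.nn as nn

def pix2pix_step(gen, disc, g_opt, d_opt, src, tgt, lambda_l1=100.0):
    # gen: source image -> translated image; disc: (source, candidate) -> realness logits
    bce, l1 = nn.BCEWithLogitsLoss(), nn.L1Loss()

    # Discriminator: real (src, tgt) pairs vs. fake (src, gen(src)) pairs.
    fake = gen(src)
    d_real, d_fake = disc(src, tgt), disc(src, fake.detach())
    d_loss = bce(d_real, torch.ones_like(d_real)) + bce(d_fake, torch.zeros_like(d_fake))
    d_opt.zero_grad()
    d_loss.backward()
    d_opt.step()

    # Generator: fool the discriminator while staying close to the target in L1.
    d_fake = disc(src, fake)
    g_loss = bce(d_fake, torch.ones_like(d_fake)) + lambda_l1 * l1(fake, tgt)
    g_opt.zero_grad()
    g_loss.backward()
    g_opt.step()
    return d_loss.item(), g_loss.item()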

🔭 @DeepGravity
Mastering Atari, Go, Chess and Shogi by Planning with a Learned Model

Constructing agents with planning capabilities has long been one of the main challenges in the pursuit of artificial intelligence. Tree-based planning methods have enjoyed huge success in challenging domains, such as chess and Go, where a perfect simulator is available. However, in real-world problems the dynamics governing the environment are often complex and unknown. In this work we present the MuZero algorithm which, by combining a tree-based search with a learned model, achieves superhuman performance in a range of challenging and visually complex domains, without any knowledge of their underlying dynamics. MuZero learns a model that, when applied iteratively, predicts the quantities most directly relevant to planning: the reward, the action-selection policy, and the value function. When evaluated on 57 different Atari games - the canonical video game environment for testing AI techniques, in which model-based planning approaches have historically struggled - our new algorithm achieved a new state of the art. When evaluated on Go, chess and shogi, without any knowledge of the game rules, MuZero matched the superhuman performance of the AlphaZero algorithm that was supplied with the game rules.
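A schematic sketch (not DeepMind's implementation) of the three learned functions the abstract describes and that the tree search queries: a representation network h, a dynamics network g, and a prediction network f. The network modules themselves are hypothetical placeholders.

import torch.nn as nn

class MuZeroModel(nn.Module):
    """Wraps the three learned functions MuZero plans with."""
    def __init__(self, repr_net, dyn_net, pred_net):
        super().__init__()
        self.h, self.g, self.f = repr_net, dyn_net, pred_net  # representation, dynamics, prediction

    def initial_inference(self, observation):
        # Encode the raw observation into a hidden state, then predict policy and value.
        state = self.h(observation)
        policy_logits, value = self.f(state)
        return state, policy_logits, value

    def recurrent_inference(self, state, action):
        # Apply the learned dynamics one step: next hidden state and predicted reward,
        # then policy and value at the imagined state. The tree search calls this repeatedly.
        next_state, reward = self.g(state, action)
        policy_logits, value = self.f(next_state)
        return next_state, reward, policy_logits, value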

Link to the main paper

Link to a related article

#MuZero
#DeepMind
#ReinforcementLearning

🔭 @DeepGravity
OpenAI releases Safety Gym for reinforcement learning

To study constrained #RL for safe exploration, we developed a new set of environments and tools called #SafetyGym. By comparison to existing environments for constrained RL, Safety #Gym environments are richer and feature a wider range of difficulty and complexity.
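A minimal interaction sketch, assuming the Safexp-PointGoal1-v0 environment id and the per-step constraint cost in info['cost'] described in the Safety Gym release; a constrained-RL agent would maximize return while keeping the accumulated cost below a budget.

import gym
import safety_gym  # noqa: F401 -- importing registers the Safexp-* environments with Gym

env = gym.make('Safexp-PointGoal1-v0')
obs = env.reset()
episode_return, episode_cost = 0.0, 0.0
for _ in range(1000):
    obs, reward, done, info = env.step(env.action_space.sample())  # random policy
    episode_return += reward
    episode_cost += info.get('cost', 0.0)  # constraint-violation signal
    if done:
        obs = env.reset()
print(episode_return, episode_cost)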

Link to the Safety Gym

Link to a related article

#OpenAI
#ReinforcementLearning

🔭 @DeepGravity
#DeepLearning with #PyTorch

Download a free copy of the book and learn how to get started with #AI / #ML development using PyTorch
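For a taste of the kind of code the book starts from, here is a minimal PyTorch step (a generic sketch, not an excerpt from the book): build tensors, define the simplest possible model, and run one gradient update.

import torch
import torch.nn as nn

x = torch.randn(64, 3)            # a batch of 64 samples with 3 features
y = torch.randn(64, 1)            # regression targets
model = nn.Linear(3, 1)           # the simplest possible model
opt = torch.optim.SGD(model.parameters(), lr=0.01)

loss = nn.functional.mse_loss(model(x), y)
opt.zero_grad()
loss.backward()                   # autograd computes the gradients
opt.step()                        # one parameter update
print(loss.item())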

#Python

🔭 @DeepGravity
#MachineLearning for Scent: Learning Generalizable Perceptual Representations of Small Molecules

Predicting the relationship between a molecule’s structure and its odor remains a difficult, decades-old task. This problem, termed quantitative structure-odor relationship (QSOR) modeling, is an important challenge in chemistry, impacting human nutrition, manufacture of synthetic fragrance, the environment, and sensory neuroscience. We propose the use of graph neural networks for QSOR, and show they significantly outperform prior methods on a novel data set labeled by olfactory experts. Additional analysis shows that the learned embeddings from graph neural networks capture a meaningful odor space representation of the underlying relationship between structure and odor, as demonstrated by a strong performance on two challenging transfer learning tasks. Machine learning has already had a large impact on the senses of sight and sound. Based on these early results with graph neural networks for molecular properties, we hope machine learning can eventually do for olfaction what it has already done for vision and hearing.
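A toy sketch (not the paper's architecture) of the graph-neural-network idea the abstract relies on: atoms are nodes with feature vectors, bonds define the adjacency, each layer mixes neighbour features, and a pooled graph embedding feeds a multi-label odor readout. All sizes below are illustrative.

import torch
import torch.nn as nn

class SimpleMolGNNLayer(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.update = nn.Sequential(nn.Linear(2 * dim, dim), nn.ReLU())

    def forward(self, node_feats, adjacency):
        # adjacency: (num_atoms, num_atoms) 0/1 bond matrix
        messages = adjacency @ node_feats            # sum neighbour features
        return self.update(torch.cat([node_feats, messages], dim=-1))

def predict_odors(node_feats, adjacency, layers, readout):
    h = node_feats
    for layer in layers:
        h = layer(h, adjacency)
    return torch.sigmoid(readout(h.mean(dim=0)))     # mean-pool atoms, then multi-label readout

atoms = torch.rand(5, 16)                            # 5 atoms, 16-d features
adj = torch.ones(5, 5)                               # placeholder bond matrix, just to show shapes
layers = [SimpleMolGNNLayer(16) for _ in range(3)]
readout = nn.Linear(16, 10)                          # e.g. one logit per odor descriptor
print(predict_odors(atoms, adj, layers, readout).shape)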

Link to the paper by #Google Research and ...

🔭 @DeepGravity
#NVIDIA Makes 3D #DeepLearning Research Easy with #Kaolin #PyTorch Library

At its core, Kaolin consists of an efficient suite of geometric functions for manipulating 3D content. It can wrap 3D datasets, whether represented as polygon meshes, point clouds, signed distance functions, or voxel grids, into PyTorch tensors.

With their 3D dataset ready for deep learning, researchers can choose a neural network model from a curated collection that Kaolin supplies. The interface provides a rich repository of models, both baseline and state of the art, for classification, segmentation, 3D reconstruction, super-resolution and more.
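Kaolin's own API is not reproduced here; this is a plain-PyTorch sketch of the workflow the article describes, with a point cloud held as a tensor and fed to a small PointNet-style classifier.

import torch
import torch.nn as nn

class TinyPointNet(nn.Module):
    def __init__(self, num_classes=10):
        super().__init__()
        self.per_point = nn.Sequential(nn.Linear(3, 64), nn.ReLU(), nn.Linear(64, 128), nn.ReLU())
        self.head = nn.Linear(128, num_classes)

    def forward(self, points):                  # points: (batch, num_points, 3)
        feats = self.per_point(points)          # per-point features
        global_feat = feats.max(dim=1).values   # order-invariant pooling over points
        return self.head(global_feat)

cloud = torch.rand(8, 1024, 3)                  # a batch of 8 point clouds as a plain tensor
logits = TinyPointNet()(cloud)
print(logits.shape)                             # torch.Size([8, 10])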

Link to the article

🔭 @DeepGravity
This repository contains my exploration of the newly released TensorFlow 2.0. The #TensorFlow team introduced a lot of new and useful changes in this release: automatic mixed precision training, flexible custom training, distributed GPU training, and enhanced ops for the high-level #Keras API are some of my personal favorites. You can see all of the new changes here.
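One of those changes in a minimal sketch (generic TF 2.0 code, not taken from the repository): a custom training loop with eager execution, a Keras model, and tf.GradientTape.

import tensorflow as tf

model = tf.keras.Sequential([tf.keras.layers.Dense(10, activation='softmax')])
optimizer = tf.keras.optimizers.Adam()
loss_fn = tf.keras.losses.SparseCategoricalCrossentropy()

@tf.function  # compiles the step into a graph for speed
def train_step(x, y):
    with tf.GradientTape() as tape:
        loss = loss_fn(y, model(x, training=True))
    grads = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(grads, model.trainable_variables))
    return loss

x = tf.random.normal((32, 20))
y = tf.random.uniform((32,), maxval=10, dtype=tf.int32)
print(train_step(x, y).numpy())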

🔭 @DeepGravity
The Ultimate guide to #AI, #DataScience & #MachineLearning, Articles, Cheatsheets and Tutorials ALL in one place

This is a carefully curated compendium of articles and tutorials covering all things AI, Data Science and Machine Learning, for the beginner to advanced practitioner. I will be updating this document with popular topics from time to time. My hope is that you find something of use, or that the content generates ideas for you to pursue.

Link to the article

🔭 @DeepGravity
#DeepFovea: #Neural Reconstruction for Foveated Rendering and Video Compression using Learned #Statistics of Natural Videos

Link to the paper

#FacebookAI

🔭 @DeepGravity
Depth Predictions in #Art

Painters throughout art history have used various techniques to represent our three-dimensional world on a two-dimensional canvas. Using linear and atmospheric perspective, hard and soft edges, overlay of shapes, and nuanced hue and saturation, painters can render convincing illusions of depth on flat surfaces. These painted images, with varying degrees of “depth illusion”, can also be interpreted by something entirely different: #MachineLearning models.
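The article's exact pipeline isn't shown here; as an illustration, a pretrained monocular depth model can be run on a painting in a few lines. The torch.hub names below ("intel-isl/MiDaS", "MiDaS", "transforms") and the default_transform attribute are assumptions, not taken from the article.

import cv2
import torch

midas = torch.hub.load("intel-isl/MiDaS", "MiDaS")   # hub names are an assumption
midas.eval()
transform = torch.hub.load("intel-isl/MiDaS", "transforms").default_transform

img = cv2.cvtColor(cv2.imread("painting.jpg"), cv2.COLOR_BGR2RGB)
with torch.no_grad():
    depth = midas(transform(img))   # relative inverse-depth map for the painted scene
print(depth.shape)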

#DeepLearning

Link to the article

🔭 @DeepGravity
Andriy Burkov’s Journey to Writing the Ultimate 100-Page #MachineLearning #Book

Have you seen most of the recommended books on Machine Learning only to feel overwhelmed by their thickness and the amount of effort it would take to read them?

If you feel that way, don't worry: you are not alone. A lot of people face this situation but do very little about it. Not Andriy Burkov! Andriy saw this and decided that the ideal Machine Learning book for beginners should fit within 100 pages.

Link to the article

🔭 @DeepGravity
Gentle Introduction to Vector #Norms in #MachineLearning

After completing this tutorial, you will know:

* The L1 norm, calculated as the sum of the absolute values of the vector.
* The L2 norm, calculated as the square root of the sum of the squared vector values.
* The max norm, calculated as the maximum of the absolute vector values (see the quick check below).
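The quick check referenced above, assuming NumPy (the tutorial may use a different library):

import numpy as np

v = np.array([1.0, -2.0, 3.0])
l1 = np.abs(v).sum()             # 1 + 2 + 3 = 6.0
l2 = np.sqrt((v ** 2).sum())     # sqrt(1 + 4 + 9) ~ 3.742
mx = np.abs(v).max()             # 3.0

# np.linalg.norm gives the same values with order 1, 2, and inf.
assert np.isclose(l1, np.linalg.norm(v, 1))
assert np.isclose(l2, np.linalg.norm(v, 2))
assert np.isclose(mx, np.linalg.norm(v, np.inf))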

Link to the article

🔭 @DeepGravity
Starting with the basics is a difficult (and frustrating) approach for learning programming. Instead, find a program that does something cool - a game or a website - and tinker with the code. Experiment and try to get it to do what you want.

Over time, you'll pick up the basics you need by solving problems and modifying/writing useful programs. It's tedious to learn the elements of code by themselves, but much more enjoyable when you're using them in the context of solving a problem.

Strings, lists, and functions are not inspiring. Code that helps to photograph a black hole or discover gravitational waves is definitely inspiring.

Here's a Python notebook you can run right now that shows how to process the data for observing gravitational waves

Here's some of the Python code used for photographing a black hole for the first time

In other words, learn to code from the top down. First get a look at the big picture (what can I do with code?), and then learn the fundamentals as you need them.

We learn code to solve problems and build things, not for the exercise itself!

Tweets by Will Koehrsen, a data scientist at Cortex Intel.

#Programming
#Learning

🔭 @DeepGravity