Deep Gravity
393 subscribers
60 photos
35 videos
17 files
495 links
AI

Contact:
DeepL.Gravity@gmail.com
This repository contains my exploration of the newly released TensorFlow 2.0. The #TensorFlow team introduced a lot of new and useful changes in this release: automatic mixed precision training, flexible custom training, distributed GPU training, and enhanced ops for the high-level #Keras API are some of my personal favorites. You can see all of the new changes here.

🔭 @DeepGravity
The Ultimate guide to #AI, #DataScience & #MachineLearning, Articles, Cheatsheets and Tutorials ALL in one place

This is a carefully curated compendium of articles & tutorials covering all things AI, Data Science & Machine Learning, from beginner to advanced practitioner. I will update this document periodically with popular topics. My hope is that you find something of use and/or that the content generates ideas for you to pursue.

Link to the article

🔭 @DeepGravity
#DeepFovea: #Neural Reconstruction for Foveated Rendering and Video Compression using Learned #Statistics of Natural Videos

Link to the paper

#FacebookAI

🔭 @DeepGravity
Depth Predictions in #Art

Painters throughout art history have used various techniques to represent our three-dimensional world on a two-dimensional canvas. Using linear and atmospheric perspective, hard and soft edges, overlay of shapes, and nuanced hue and saturation, painters can render convincing illusions of depth on flat surfaces. These painted images, with varying degrees of “depth illusion”, can also be interpreted by something entirely different: #MachineLearning models.

#DeepLearning

Link to the article

🔭 @DeepGravity
Andriy Burkov’s Journey to Writing the Ultimate 100-Page #MachineLearning #Book

Have you seen most of the recommended books on Machine Learning, only to feel overwhelmed by their thickness and the effort it would take to read them?

If you feel that way, don't worry: you are not alone. A lot of people face this situation but do very little about it. Not Andriy Burkov! Andriy saw this and decided that the ideal Machine Learning book for beginners should fit within 100 pages.

Link to the article

🔭 @DeepGravity
Gentle Introduction to Vector #Norms in #MachineLearning

After completing this tutorial, you will know:

* The L1 norm, which is calculated as the sum of the absolute values of the vector.
* The L2 norm, which is calculated as the square root of the sum of the squared vector values.
* The max norm, which is calculated as the maximum absolute value of the vector.
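The three norms above can be sketched in a few lines of plain Python (a minimal illustration; the function names are mine, not from the tutorial):

```python
import math

def l1_norm(v):
    # L1 norm: sum of absolute values of the components
    return sum(abs(x) for x in v)

def l2_norm(v):
    # L2 norm: square root of the sum of squared components
    return math.sqrt(sum(x * x for x in v))

def max_norm(v):
    # max norm: largest absolute value among the components
    return max(abs(x) for x in v)

v = [1, -2, 3]
print(l1_norm(v))   # 6
print(l2_norm(v))   # sqrt(14), about 3.742
print(max_norm(v))  # 3
```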

Link to the article

🔭 @DeepGravity
Starting with the basics is a difficult (and frustrating) approach for learning programming. Instead, find a program that does something cool - a game or a website - and tinker with the code. Experiment and try to get it to do what you want.

Over time, you'll pick up the basics you need by solving problems and modifying/writing useful programs. It's tedious to learn the elements of code by themselves, but much more enjoyable when you're using them in the context of solving a problem.

Strings, lists, and functions are not inspiring. Code that helps to photograph a black hole or discover gravitational waves is definitely inspiring.

Here's a Python notebook you can run right now that shows how to process the data for observing gravitational waves

Here's some of the Python code used for photographing a black hole for the first time

In other words, learn to code from the top down. First get a look at the big picture (what can I do with code?), then learn the fundamentals as you need them.

We learn code to solve problems and build things, not for the exercise itself!

Tweets by Will Koehrsen, a data scientist at Cortex Intel.

#Programming
#Learning

🔭 @DeepGravity
Training a #MachineLearning Engineer

There is no clear outline on how to study Machine Learning / #DeepLearning, so many practitioners apply every algorithm they have heard of and hope that one of them works for the problem at hand. Below, I've listed some of the steps one should adopt when solving a machine learning problem.

Link to the article

🔭 @DeepGravity
@DeepGravity - A very cool intro to Keras and CNN.rar
120.8 MB
Download a very cool intro to #Keras and #CNNs

Syllabus:
Keras 1, What is Keras
Keras 2, Installations for #DeepLearning, #Anaconda, #Jupyter Notebook, #Tensorflow, Keras
Keras 3, #NeuralNetwork Regression Model with Keras
Keras 4, Breast Cancer Diagnosis with Neural Networks
Keras 5, Understanding #ConvolutionalNeuralNetworks, Making a Handwritten Digit Calculator

Watch more videos on the related YouTube channel

🔭 @DeepGravity
This video shows a demonstration of a finger-counter program created using #Keras and #OpenCV.

YouTube

#GitHub repo

🔭 @DeepGravity
Making progress in #MachineLearning is not possible without a strong grasp of the #Mathematics behind it.

To learn the math behind ML, just Google it.

🔭 @DeepGravity
#Quantum #GenerativeAdversarialNetworks for learning and loading random distributions

Abstract
Quantum algorithms have the potential to outperform their classical counterparts in a variety of tasks. The realization of the advantage often requires the ability to load classical data efficiently into quantum states. However, the best known methods require O(2^n) gates to load an exact representation of a generic data structure into an n-qubit state. This scaling can easily predominate the complexity of a quantum algorithm and, thereby, impair potential quantum advantage. Our work presents a hybrid quantum-classical algorithm for efficient, approximate quantum state loading. More precisely, we use quantum Generative Adversarial Networks (qGANs) to facilitate efficient learning and loading of generic probability distributions - implicitly given by data samples - into quantum states. Through the interplay of a quantum channel, such as a variational quantum circuit, and a classical neural network, the qGAN can learn a representation of the probability distribution underlying the data samples and load it into a quantum state. The loading requires O(poly(n)) gates and can thus enable the use of potentially advantageous quantum algorithms, such as Quantum Amplitude Estimation. We implement the qGAN distribution learning and loading method with Qiskit and test it using a quantum simulation as well as actual quantum processors provided by the IBM Q Experience. Furthermore, we employ quantum simulation to demonstrate the use of the trained quantum channel in a quantum finance application.
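To see what "loading" means here, the target of state preparation is a 2^n-entry amplitude vector whose squared entries reproduce the distribution. The toy sketch below (not the qGAN itself, and the helper name `amplitude_encode` is mine) shows that target and why exact generic loading scales exponentially with the qubit count:

```python
import math

def amplitude_encode(probs):
    # Encode a discrete probability distribution into the amplitudes
    # of an n-qubit state: amplitude_i = sqrt(p_i / sum(p)).
    # The state vector has 2**n entries, which is why exact loading of
    # a generic distribution needs O(2**n) gates; the qGAN in the paper
    # learns an approximate circuit with only O(poly(n)) gates instead.
    total = sum(probs)
    return [math.sqrt(p / total) for p in probs]

n = 3                              # 3 qubits -> 2**3 = 8 basis states
probs = [1, 2, 3, 4, 4, 3, 2, 1]   # unnormalized toy distribution
state = amplitude_encode(probs)

assert len(state) == 2 ** n
# squaring the amplitudes recovers the normalized probabilities
assert abs(sum(a * a for a in state) - 1.0) < 1e-12
```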

Link to the paper on Nature

🔭 @DeepGravity