Cutting Edge Deep Learning
253 subscribers
193 photos
42 videos
51 files
363 links
📕 Deep learning
📗 Reinforcement learning
📘 Machine learning
📙 Papers - tools - tutorials

🔗 Other Social Media Handles:
https://linktr.ee/cedeeplearning
Nice article shared by Dat Tran about mathematicians trying to make sense of neural networks. Some of the findings are fairly obvious to machine learning practitioners and researchers, for example that deeper networks with many layers but fewer neurons per layer (think ResNet) work better than shallow networks with few layers but many neurons per layer. It's still interesting to see an effort to build a "general theory" of neural networks, the kind of intuition one usually acquires only through experience and a lot of trial and error. Maybe this will help reduce that trial and error in the future.
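As a rough, back-of-the-envelope illustration of the deep-and-narrow versus shallow-and-wide trade-off mentioned above, here is a minimal sketch in plain Python (the layer sizes are made up) that counts the parameters of two fully connected networks:

def dense_param_count(layer_sizes):
    """Number of weights plus biases in a fully connected net
    with the given layer sizes (input, hidden..., output)."""
    return sum(n_in * n_out + n_out
               for n_in, n_out in zip(layer_sizes, layer_sizes[1:]))

# Hypothetical architectures over 784-dim inputs with 10 outputs
deep_narrow  = [784] + [64] * 8 + [10]   # many layers, few neurons each
shallow_wide = [784, 512, 10]            # few layers, many neurons

print(dense_param_count(deep_narrow))    # ~80k parameters
print(dense_param_count(shallow_wide))   # ~407k parameters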

Dat Tran (https://www.linkedin.com/in/dat-tran-a1602320/)

#deeplearning
#machinelearning
#ml
#article

@machinelearning_tuts


https://www.quantamagazine.org/foundations-built-for-a-general-theory-of-neural-networks-20190131/
Which Deep Learning Framework is Growing Fastest? Read a comparison of the major deep learning frameworks in terms of demand, usage, and popularity: https://www.kdnuggets.com/2019/05/which-deep-learning-framework-growing-fastest.html
New AI Strategy Mimics How Brains Learn to Smell

Today’s artificial intelligence systems, including the artificial neural networks broadly inspired by the neurons and connections of the nervous system, perform wonderfully at tasks with known constraints. They also tend to require a lot of computational power and vast quantities of training data. That all serves to make them great at playing chess or Go, at detecting if there’s a car in an image, at differentiating between depictions of cats and dogs. “But they are rather pathetic at composing music or writing short stories,” said Konrad Kording, a computational neuroscientist at the University of Pennsylvania. “They have great trouble reasoning meaningfully in the world.”

#deeplearning
#machinelearning
#brainmimic
#smelling

@machinelearning_tuts

For more information:
https://www.quantamagazine.org/new-ai-strategy-mimics-how-brains-learn-to-smell-20180918/
The mission of Papers With Code is to create a free and open resource with Machine Learning papers, code and evaluation tables.

Browse this awesome portal for State-of-the-Art Machine Learning and Deep Learning Algorithms — 700+ leaderboards • 1000+ tasks • 800+ datasets • 10,000+ papers with code: https://paperswithcode.com/sota
Math Reference Tables 📗

1. General 📘
Number Notation
Addition Table
Multiplication Table
Fraction-Decimal Conversion
Interest
Units & Measurement Conversion
2. Algebra 📘
Basic Identities
Conic Sections
Polynomials
Exponents
Algebra Graphs
Functions
3. Geometry 📘
Areas, Volumes, Surface Areas
Circles
4. Trig 📘
Identities
Tables
Hyperbolics
Graphs
Functions
5. Discrete/Linear 📘
Vectors
Recursive Formulas
Linear Algebra
6. Other 📘
Constants
Complexity
Miscellaneous
Graphs
Functions
7. Stat 📘
Distributions
8. Calc 📘
Integrals
Derivatives
Series Expansions
9. Advanced 📘
Fourier Series
Transforms

📍 http://math2.org/

———————————
|@machinelearning_tuts|
———————————
N-shot learning
You may be asking: what the heck is a shot, anyway? Fair question. A shot is nothing more than a single example available for training, so in N-shot learning we have N examples to train on. For more information, read 👇🏿

https://blog.floydhub.com/n-shot-learning/
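As a toy illustration of what "N shots" means in practice, here is a minimal Python sketch (the dataset and class names are made up) that samples an N-shot, K-way support set from a labeled dataset:

import random
from collections import defaultdict

def sample_episode(dataset, n_shot=5, k_way=3):
    """Sample a k_way-class, n_shot-per-class support set.
    dataset: list of (example, label) pairs.
    Returns a dict mapping each sampled label to n_shot examples."""
    by_label = defaultdict(list)
    for example, label in dataset:
        by_label[label].append(example)
    classes = random.sample(list(by_label), k_way)
    return {c: random.sample(by_label[c], n_shot) for c in classes}

# Hypothetical usage: a 5-shot, 3-way episode from a toy dataset
toy = [(f"img_{i}", i % 10) for i in range(200)]
support = sample_episode(toy, n_shot=5, k_way=3)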

——————————
@machinelearning_tuts
NUSCCF

A new, efficient method based on subspace and K-means clustering for improving collaborative filtering.

https://github.com/soran-ghadri/NUSCCF
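The repository describes the method itself; purely as a toy sketch of the broad idea (cluster users, then fill a missing rating from the user's cluster), here is a hypothetical example using scikit-learn's KMeans. This is not the NUSCCF algorithm.

import numpy as np
from sklearn.cluster import KMeans

# Hypothetical user-item rating matrix (0 means "not rated")
ratings = np.array([
    [5, 4, 0, 1, 0],
    [4, 0, 5, 1, 2],
    [1, 2, 1, 5, 4],
    [0, 1, 2, 4, 5],
], dtype=float)

# Group users with similar rating vectors
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(ratings)

def predict(user, item):
    """Estimate a missing rating as the mean observed rating for the
    item among users in the same cluster as `user`."""
    peers = ratings[kmeans.labels_ == kmeans.labels_[user], item]
    observed = peers[peers > 0]
    return observed.mean() if observed.size else ratings[ratings > 0].mean()

print(predict(user=0, item=2))  # fill in user 0's missing rating for item 2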

@machinelearning_tuts
Detectron

Detectron is Facebook AI Research's software system that implements state-of-the-art object detection algorithms, including Mask R-CNN. It is written in Python and powered by the Caffe2 deep learning framework.

https://github.com/facebookresearch/Detectron
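The original Detectron is driven by YAML configs and Caffe2 tooling; as a hedged illustration of running one of its Mask R-CNN models today, here is a minimal sketch using Detectron2, the PyTorch successor to this repo (assuming Detectron2 and OpenCV are installed and an input.jpg exists), not the original Caffe2 workflow:

import cv2
from detectron2 import model_zoo
from detectron2.config import get_cfg
from detectron2.engine import DefaultPredictor

# Load a Mask R-CNN config and pretrained COCO weights from the model zoo
cfg = get_cfg()
cfg.merge_from_file(model_zoo.get_config_file(
    "COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml"))
cfg.MODEL.WEIGHTS = model_zoo.get_checkpoint_url(
    "COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml")
cfg.MODEL.ROI_HEADS.SCORE_THRESH_TEST = 0.5  # confidence threshold

predictor = DefaultPredictor(cfg)
outputs = predictor(cv2.imread("input.jpg"))  # BGR image -> detected instances
print(outputs["instances"].pred_classes)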

@machinelearning_tuts
Keras is a machine learning framework that might be your new best friend if you have a lot of data and/or you're after the state of the art in AI: deep learning. Plus, its high-level shell is the most minimalist way to use TensorFlow, Theano, or CNTK.

Key Things to Know:
🔴 Keras is usable as a high-level API on top of other popular lower-level libraries such as Theano and CNTK, in addition to TensorFlow.
🔴 Prototyping is about as fast as it gets: building large deep learning models in Keras often comes down to a handful of one-line calls (see the short sketch below). The trade-off is that Keras is less configurable than the low-level frameworks.
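A minimal sketch of that one-liner style, assuming the tf.keras API and made-up layer sizes and data:

import numpy as np
from tensorflow import keras

# A small classifier defined in a few lines (hypothetical sizes)
model = keras.Sequential([
    keras.layers.Dense(64, activation="relu", input_shape=(20,)),
    keras.layers.Dense(3, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Train on random placeholder data just to show the workflow
x = np.random.rand(128, 20).astype("float32")
y = np.random.randint(0, 3, size=128)
model.fit(x, y, epochs=2, batch_size=32, verbose=0)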

#machinelearning
#deeplearning
#keras

@deeplearning_tuts
Channel photo updated
Channel name was changed to «Cutting Edge deep learning»
The average machine learning salary, according to Indeed's research, is approximately $146,085 (an astounding 344% increase since 2015). The average machine learning engineer salary far outpaced other technology jobs on the list.

https://www.springboard.com/blog/machine-learning-engineer-salary-guide/

#deeplearning
#machinelearning
#averagesalary

@cedeeplearning
The adoption of ML by enterprises has reached new heights, as highlighted in a recent machine learning report. Adoption is happening at breakneck speed as companies try to leverage the technology to get ahead of the competition. Factors driving this growth include machine learning capabilities such as risk management, performance analysis and reporting, and automation. Below are some statistics on ML adoption.
✔️The increase in ML adoption is seen to drive the cloud computing market’s growth. (teks.co.in)
✔️1/3 of IT leaders are planning to use ML for business analytics. (statista.com)
✔️25% of IT leaders plan to use ML for security purposes (statista.com)
✔️16% of IT leaders want to use ML in sales and marketing. (statista.com)
✔️Capsule networks are expected to replace neural networks. (teks.co.in)

https://financesonline.com
#machinelearning
#adoption

@cedeeplearning
🔻61% of marketers say AI is the most critical aspect of their data strategy. (memsql.com)
🔻87% of companies that use AI plan to apply it to sales forecasting and email marketing. (statista.com)
🔻2000 – The estimated number of Amazon Go stores in the US by 2021. (teks.co.in)
🔻49% of consumers are willing to purchase more frequently when AI is present. (twitter.com)
🔻$1 billion – The amount Netflix saved by using machine learning algorithms. (technisider.com)
🔻15 minutes – Amazon’s ship time after it started using machine learning. (aiindex.org)

https://financesonline.com/

#machinelearning
#marketofML

@cedeeplearning
1_640_50fps_FINAL_VERSION.gif
12.1 MB
🔻Fast and Easy Infinitely Wide Networks with Neural Tangents

One of the key theoretical insights that has allowed us to make progress in recent years has been that increasing the width of DNNs results in more regular behavior, and makes them easier to understand. A number of recent results have shown that DNNs that are allowed to become infinitely wide converge to another, simpler, class of models called Gaussian processes. In this limit, complicated phenomena (like Bayesian inference or gradient descent dynamics of a convolutional neural network) boil down to simple linear algebra equations. Insights from these infinitely wide networks frequently carry over to their finite counterparts.

Figure caption: Left: a schematic showing how deep neural networks induce simple input/output maps as they become infinitely wide. Right: as the width of a neural network increases, the distribution of …
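The post's title refers to the neural_tangents library; here is a minimal sketch (assuming neural_tangents and JAX are installed, with made-up data shapes) of reading off the closed-form kernels of an infinitely wide network:

from jax import random
from neural_tangents import stax

# A fully connected architecture; kernel_fn describes its infinite-width limit
init_fn, apply_fn, kernel_fn = stax.serial(
    stax.Dense(512), stax.Relu(),
    stax.Dense(512), stax.Relu(),
    stax.Dense(1),
)

key1, key2 = random.split(random.PRNGKey(0))
x_train = random.normal(key1, (20, 10))  # 20 points, 10 features (made up)
x_test = random.normal(key2, (5, 10))

# Closed-form kernels of the corresponding infinitely wide network
nngp = kernel_fn(x_train, x_test, 'nngp')  # Bayesian inference (NNGP) kernel
ntk = kernel_fn(x_train, x_test, 'ntk')    # gradient-descent (NTK) kernel
print(nngp.shape, ntk.shape)               # (20, 5) and (20, 5)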
#deeplearning
#CNN

@cedeeplearning
🔻More Efficient NLP Model Pre-training with ELECTRA
Recent advances in language pre-training have led to substantial gains in the field of natural language processing, with state-of-the-art models such as BERT, RoBERTa, XLNet, ALBERT, and T5, among many others. These methods, though they differ in design, share the same idea of leveraging a large amount of unlabeled text to build a general model of language understanding before being fine-tuned on specific NLP tasks such as sentiment analysis and question answering.
https://ai.googleblog.com/
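Purely as an illustration of that pre-train-then-fine-tune recipe (not Google's own training code), here is a minimal sketch using the Hugging Face transformers library with its ELECTRA checkpoint and two made-up sentences:

import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Load a pre-trained ELECTRA encoder with a fresh classification head
name = "google/electra-small-discriminator"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForSequenceClassification.from_pretrained(name, num_labels=2)

# Tiny made-up sentiment batch; real fine-tuning would loop over a dataset
texts = ["a great, memorable film", "dull and far too long"]
labels = torch.tensor([1, 0])
batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
outputs = model(**batch, labels=labels)  # forward pass also returns the loss
outputs.loss.backward()                  # one fine-tuning step
optimizer.step()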

#NLP
#deeplearning
#pretraining

@cedeeplearning