Deep Gravity
AI

Contact:
DeepL.Gravity@gmail.com
Fully hardware-implemented memristor convolutional neural network

Abstract
Memristor-enabled neuromorphic computing systems provide a fast and energy-efficient approach to training neural networks [1-4]. However, convolutional neural networks (CNNs)—one of the most important models for image recognition [5]—have not yet been fully hardware-implemented using memristor crossbars, which are cross-point arrays with a memristor device at each intersection. Moreover, achieving software-comparable results is highly challenging owing to the poor yield, large variation and other non-ideal characteristics of devices [6-9]. Here we report the fabrication of high-yield, high-performance and uniform memristor crossbar arrays for the implementation of CNNs, which integrate eight 2,048-cell memristor arrays to improve parallel-computing efficiency. In addition, we propose an effective hybrid-training method to adapt to device imperfections and improve the overall system performance. We built a five-layer memristor-based CNN to perform MNIST [10] image recognition, and achieved a high accuracy of more than 96 per cent. In addition to parallel convolutions using different kernels with shared inputs, replication of multiple identical kernels in memristor arrays was demonstrated for processing different inputs in parallel. The memristor-based CNN neuromorphic system has an energy efficiency more than two orders of magnitude greater than that of state-of-the-art graphics-processing units, and is shown to be scalable to larger networks, such as residual neural networks. Our results are expected to enable a viable memristor-based non-von Neumann hardware solution for deep neural networks and edge computing.
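The core idea behind the crossbar can be sketched in a few lines: each memristor cell stores a kernel weight as a conductance, and Ohm's and Kirchhoff's laws perform an analog matrix–vector multiplication "for free" — output currents are I = G·V. A minimal NumPy sketch (all values and the 5% variation figure are illustrative assumptions, not from the paper) also shows the device-to-device variation that the hybrid-training method is designed to compensate for:

```python
import numpy as np

rng = np.random.default_rng(0)

# Target kernel weights mapped to device conductances (arbitrary units).
G = rng.uniform(0.1, 1.0, size=(4, 8))   # 4 output columns x 8 input rows

# Device imperfection: each cell deviates from its programmed value
# (an illustrative 5% multiplicative variation).
G_actual = G * rng.normal(1.0, 0.05, size=G.shape)

V = rng.uniform(0.0, 0.5, size=8)        # input voltages, one per row

# Kirchhoff's current law sums the per-cell currents along each column:
# the crossbar computes the matrix-vector product I = G @ V in one step.
I_ideal = G @ V
I_actual = G_actual @ V

print("ideal:", I_ideal)
print("with variation:", I_actual)
```

The gap between `I_ideal` and `I_actual` is the kind of error that hybrid training (re-tuning part of the network in situ after transferring weights) is meant to absorb.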

Paper

🔭 @DeepGravity
OpenAI→PyTorch

We are standardizing OpenAI’s deep learning framework on PyTorch. In the past, we implemented projects in many frameworks depending on their relative strengths. We’ve now chosen to standardize to make it easier for our team to create and share optimized implementations of our models.

Link

🔭 @DeepGravity
#HiPlot: High-dimensional interactive plots made easy

HiPlot is a lightweight interactive visualization tool to help AI researchers discover correlations and patterns in high-dimensional data. It uses parallel plots and other graphical ways to represent information more clearly, and it can be run quickly from a Jupyter notebook with no setup required. HiPlot enables machine learning (ML) researchers to more easily evaluate the influence of their hyperparameters, such as learning rate, regularizations, and architecture. It can also be used by researchers in other fields, so they can observe and analyze correlations in data relevant to their work.

#FacebookAI

Link

🔭 @DeepGravity
Curriculum for #ReinforcementLearning

A curriculum is an efficient tool for humans to progressively learn from simple concepts to hard problems. It breaks down complex knowledge by providing a sequence of learning steps of increasing difficulty. In this post, we will examine how the idea of curriculum can help reinforcement learning models learn to solve complicated tasks.
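The "sequence of learning steps of increasing difficulty" can be made concrete with a toy task scheduler. This is a generic sketch of the idea, not code from the article: difficulty levels unlock linearly with training progress, and the agent samples among the levels unlocked so far (names like `sample_task` are illustrative):

```python
import random

def sample_task(step, max_steps, n_levels=5):
    """Sample a task difficulty level: only easy levels are available
    early in training, and harder levels unlock as training progresses
    (a simple linear curriculum schedule)."""
    progress = min(step / max_steps, 1.0)
    unlocked = 1 + int(progress * (n_levels - 1))  # highest level unlocked
    return random.randint(1, unlocked)

random.seed(0)
early = [sample_task(0, 1000) for _ in range(100)]    # start of training
late = [sample_task(1000, 1000) for _ in range(100)]  # end of training

print("early levels seen:", sorted(set(early)))
print("late levels seen:", sorted(set(late)))
```

Real curricula (e.g. automatic or teacher-guided ones discussed in the post) replace the fixed linear schedule with one driven by the agent's measured competence.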

Article

🔭 @DeepGravity
Visualizing Convolutional Neural Networks using #PyTorch

Link

🔭 @DeepGravity
Explainable Artificial Intelligence and Machine Learning: A reality rooted perspective

We are used to the availability of big data generated in nearly all fields of science as a consequence of technological progress. However, the analysis of such data poses vast challenges. One of these relates to the explainability of artificial intelligence (AI) or machine learning methods. Currently, many such methods are non-transparent with respect to their working mechanism and are therefore called black-box models, most notably deep learning methods. However, it has been realized that this poses severe problems for a number of fields, including the health sciences and criminal justice, and arguments have been brought forward in favor of an explainable AI. In this paper, we do not assume the usual perspective of presenting explainable AI as it should be, but rather we provide a discussion of what explainable AI can be. The difference is that we present not wishful thinking but reality-grounded properties in relation to a scientific theory beyond physics.

Paper

🔭 @DeepGravity
What is Game Theory?
... game theory can easily become one of the strongest fields in the following decades.

Link

🔭 @DeepGravity
Idea-Driven vs Goal-Driven Research

Article

🔭 @DeepGravity
How to Develop a Cost-Sensitive Neural Network for Imbalanced Classification

After completing this tutorial, you will know:

How the standard neural network algorithm does not support imbalanced classification.
How the neural network training algorithm can be modified to weight misclassification errors in proportion to class importance.
How to configure class weight for neural networks and evaluate the effect on model performance.
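The key mechanism from the list above — weighting misclassification errors in proportion to class importance — can be sketched framework-free with a weighted binary cross-entropy (a minimal NumPy illustration; the arrays and the 4:1 weighting are made-up example values, not from the tutorial):

```python
import numpy as np

def weighted_bce(y_true, y_pred, class_weight):
    """Binary cross-entropy where each example's loss is scaled by the
    weight of its true class, so minority-class errors cost more."""
    y_pred = np.clip(y_pred, 1e-7, 1 - 1e-7)  # avoid log(0)
    w = np.where(y_true == 1, class_weight[1], class_weight[0])
    return np.mean(-w * (y_true * np.log(y_pred)
                         + (1 - y_true) * np.log(1 - y_pred)))

y_true = np.array([0, 0, 0, 0, 1])            # imbalanced: 1 positive in 5
y_pred = np.array([0.1, 0.2, 0.1, 0.1, 0.3])  # model under-predicts the positive

plain = weighted_bce(y_true, y_pred, {0: 1.0, 1: 1.0})
# Weight classes inversely to frequency (4 negatives : 1 positive),
# so the same mistake on the positive class now costs 4x as much.
weighted = weighted_bce(y_true, y_pred, {0: 1.0, 1: 4.0})
print(plain, weighted)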

Link

🔭 @DeepGravity
A Gentle Introduction to Cross-Entropy for Machine Learning

After completing this tutorial, you will know:

How to calculate cross-entropy from scratch and using standard machine learning libraries.
Cross-entropy can be used as a loss function when optimizing classification models like logistic regression and artificial neural networks.
Cross-entropy is different from KL divergence but can be calculated using KL divergence, and is different from log loss but calculates the same quantity when used as a loss function.
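The relationship in the last bullet — cross-entropy is different from KL divergence but can be calculated from it — follows from the identity H(P, Q) = H(P) + KL(P ‖ Q). A short from-scratch check (the two distributions are arbitrary example values):

```python
import numpy as np

def entropy(p):
    """H(P) = -sum p * log(p), in nats."""
    return -np.sum(p * np.log(p))

def cross_entropy(p, q):
    """H(P, Q) = -sum p * log(q)."""
    return -np.sum(p * np.log(q))

def kl_divergence(p, q):
    """KL(P || Q) = sum p * log(p / q)."""
    return np.sum(p * np.log(p / q))

p = np.array([0.7, 0.2, 0.1])   # true distribution
q = np.array([0.5, 0.3, 0.2])   # predicted distribution

# Cross-entropy decomposes as H(P, Q) = H(P) + KL(P || Q).
lhs = cross_entropy(p, q)
rhs = entropy(p) + kl_divergence(p, q)
print(lhs, rhs)
```

Since KL(P ‖ Q) ≥ 0, cross-entropy is always at least the entropy of the true distribution, with equality exactly when the prediction matches it.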

Link

🔭 @DeepGravity
Methods in Computational Neuroscience

Course Date: August 2 – August 28, 2020
Deadline: March 16, 2020

Link

🔭 @DeepGravity