generative elements of interior decoration by richard lord
Salesforce open-sourced an AI framework for economic RL
AI Economist is capable of learning dynamic tax policies that optimize equality along with productivity in simulated economies, outperforming alternative tax systems.
Github: https://github.com/salesforce/ai-economist
Blog post with results: https://blog.einstein.ai/the-ai-economist/
Blog post with release: https://blog.einstein.ai/the-ai-economist-moonshot/
#Salesforce #gym #RL #economics #AIEconomics #animalcrossing #AIEconomist
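The kind of social welfare objective the AI Economist optimizes, equality multiplied by productivity, can be sketched in plain numpy. This is a toy illustration, not the framework's code; the exact normalization (equality as 1 − n/(n−1) · Gini) is an assumption of this sketch.

```python
import numpy as np

def gini(incomes):
    """Gini index: mean absolute pairwise income difference,
    normalized by 2 * n^2 * mean income."""
    x = np.asarray(incomes, dtype=float)
    diffs = np.abs(x[:, None] - x[None, :]).sum()
    return diffs / (2 * len(x) ** 2 * x.mean())

def social_welfare(incomes):
    """Equality * productivity, with equality = 1 - n/(n-1) * Gini."""
    x = np.asarray(incomes, dtype=float)
    n = len(x)
    equality = 1.0 - n / (n - 1) * gini(x)
    productivity = x.sum()
    return equality * productivity

# perfectly equal incomes: equality = 1, so welfare = total income
print(social_welfare([10, 10, 10, 10]))  # 40.0
```

A maximally unequal allocation like [40, 0, 0, 0] has the same productivity but welfare 0 under this metric, which is exactly the trade-off a learned tax policy has to balance.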
YouTube
Introducing the AI Economist
See how Salesforce Research is using AI to drive positive, social change, with the AI Economist.
announcing scann: efficient vector similarity search
ruiqi guo, philip sun, erik lindgren, quan geng, david simcha, felix chern, & sanjiv kumar @ google research
scann is a method for efficient vector similarity search at scale. the implementation includes search space pruning & quantization for maximum inner product search, and also supports other distance functions such as euclidean distance
the implementation is designed for x86 processors with avx2 support
scann achieves sota performance on ann-benchmarks.com, as shown on the glove-100-angular dataset in the attached chart
blog post: https://ai.googleblog.com/2020/07/announcing-scann-efficient-vector.html
paper: https://arxiv.org/abs/1908.10396
github: https://github.com/google-research/google-research/tree/master/scann
#icml2020 #similarity #scann #annoy
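for reference, the exact maximum inner product search that scann approximates fits in a few lines of numpy. this is a brute-force baseline, not scann's API; scann's speedup comes from pruning and quantizing exactly this computation:

```python
import numpy as np

def mips(database, queries, k=10):
    """Exact maximum inner product search: score every database vector
    against every query, then take the top-k indices per query."""
    scores = queries @ database.T                      # (n_queries, n_db)
    topk = np.argsort(-scores, axis=1)[:, :k]          # best-scoring indices
    return topk, np.take_along_axis(scores, topk, axis=1)

rng = np.random.default_rng(0)
db = rng.normal(size=(1000, 64)).astype(np.float32)
q = rng.normal(size=(5, 64)).astype(np.float32)
idx, sc = mips(db, q, k=10)
print(idx.shape, sc.shape)  # (5, 10) (5, 10)
```

the brute-force version is O(n_queries × n_db × dim), which is what makes it unusable at scale and motivates approximate methods like scann.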
Gentle reminder that comments are available for some posts.
Click button 'Comments' and ask questions or share your opinion.
in the walls
by matt bierner
recreates a scene from a horror movie, with a face coming out of the wall like in the 1st season of "Stranger Things"
built on arkit, using ar & face tracking from the front camera
on the app store only: https://apps.apple.com/ru/app/in-the-walls/id1522257130?l=en
#arkit #ar #app
Forwarded from Graph Machine Learning
Simple scalable graph neural networks
Michael Bronstein continues a marathon of great blog posts on GML. In a new post he describes their recent work on scaling GNNs to large networks. There is a good introduction to sampling-based methods (e.g. SAGE, GraphSAINT, ClusterGCN), which sample a subgraph of a large graph and then train the GNN only on that subgraph.
Then, he describes that it can be beneficial to simply precompute r-hop matrices, A^r X, and use an MLP on these features. This way, you use the topology of your graph while applying plain mini-batch training with an MLP.
What's cool is that the algorithm is already available in pytorch-geometric as a transform, implemented via SparseTensor matrix multiplication.
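The precompute-then-MLP trick can be sketched in numpy (a toy illustration of the idea, not the pytorch-geometric transform): normalize the adjacency once, precompute [X, AX, A²X, …] offline, and any ordinary mini-batch MLP can then consume the concatenated features.

```python
import numpy as np

def precompute_hops(adj, feats, r=2):
    """Precompute [X, AX, A^2 X, ..., A^r X] with a symmetrically
    normalized adjacency, concatenated along the feature axis."""
    deg = adj.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(deg, 1e-12))
    a_norm = d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]
    hops, h = [feats], feats
    for _ in range(r):
        h = a_norm @ h                       # one more hop of aggregation
        hops.append(h)
    return np.concatenate(hops, axis=1)      # (n_nodes, (r+1) * d)

# tiny triangle graph with 4-dim node features
adj = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]], dtype=float)
X = np.random.default_rng(0).normal(size=(3, 4))
Z = precompute_hops(adj, X, r=2)
print(Z.shape)  # (3, 12)
```

Since Z is fixed after this one-time precomputation, rows can be sampled into mini-batches independently, which is exactly what makes the downstream MLP training scale.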
Medium
Simple scalable graph neural networks
One of the practical challenges of graph neural networks in scalability to large graphs. We present a simple solution for scalable GNNs.
NeuralCam Live release on the #ProductHunt
The app turns an iPhone into a better camera for Zoom calls, with automatic blurring in case of unwanted gestures.
It was clear that the global pandemic and the pressure toward remote culture would become a foundation for new ideas and solutions such as this.
There is nothing groundbreaking about the technology, but execution and market are what matter. Apple or Google might even buy this startup instead of simply copying the features and making default cameras smarter.
ProductHunt: https://www.producthunt.com/posts/neuralcam-live
#aiproduct #dataproduct #camera #aicamera #cv #DL
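The auto-blur idea boils down to: run a detector per frame and, when it flags an unwanted region, replace those pixels with a heavily smoothed version. A minimal numpy sketch on a grayscale frame (NeuralCam's actual pipeline is closed source; the box blur and the region format here are assumptions of this sketch):

```python
import numpy as np

def box_blur(img, k=7):
    """Naive box blur: average over a k x k window (edge-padded)."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def censor(frame, flagged_box):
    """Blur only the region (y0, y1, x0, x1) flagged by a detector."""
    y0, y1, x0, x1 = flagged_box
    out = frame.astype(float).copy()
    out[y0:y1, x0:x1] = box_blur(out[y0:y1, x0:x1])
    return out
```

In a real product the per-frame detector would be a small on-device neural net, and the blur would run on the GPU, but the control flow is the same.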
Don't hesitate to click the «Comment» button and share your ideas or links to other pandemic solutions
Anonymous Poll
49%
Will share
51%
Not today
Forwarded from Recent AI News
Google AI Blog: On-device Supermarket Product Recognition http://feedproxy.google.com/~r/blogspot/gJZg/~3/uEq7NDB-AgY/on-device-supermarket-product.html
nlp newsletter 14: nlp beyond english, big bird, monitoring ml models, breaking into nlp, arxiv dataset,…
by elvis saravia @dair.ai
in our view, this newsletter showcases the following interesting links:
* demos and applications of gpt-3
* monitoring ml models
* Big Bird: Transformers for Longer Sequences, which reduces the complexity of the attention mechanism to linear in the number of tokens
* competition contradictory, my dear watson: detecting contradiction and entailment in multilingual text using tpus
* competition hate speech and offensive content identification in indo-european languages
* why you should do nlp beyond english, by sebastian ruder
* covost v2: expanding the largest, most diverse multilingual speech-to-text translation data set
* panel discussion about the future of conversational ai systems
* …
blog post: https://dair.ai/NLP_Newsletter_14-en/
#nlp #news
REALM: Integrating Retrieval into Language Representation Models
by google research
A new paper from google with a novel approach for language model pre-training, which augments a language representation model with a knowledge retriever.
The idea is the following: we take a sentence or a piece of text and augment it with additional knowledge (pass original text and additional texts to the model).
An example:
The masked text is: "We paid twenty __ at the Buckingham Palace gift shop."
The knowledge retriever could add the following information to it: "Buckingham Palace is the London residence of the British monarchy." and "The official currency of the United Kingdom is the Pound."
blog post: https://ai.googleblog.com/2020/08/realm-integrating-retrieval-into.html
paper: https://arxiv.org/abs/2002.08909
github: https://github.com/google-research/language/tree/master/language/realm
#nlp #languagemodel #knowledgeretriever #icml2020
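The retrieve-then-concatenate step can be illustrated with a toy word-overlap retriever. This is a pure-python sketch of the data flow only; REALM's actual retriever is a learned dense-embedding model trained jointly with the language model, not a lexical matcher, and the [SEP] joining is an assumption here.

```python
def retrieve(query, corpus, k=1):
    """Score each document by word overlap with the query, return top-k."""
    q_words = set(query.lower().replace("__", " ").split())
    ranked = sorted(corpus, key=lambda d: -len(q_words & set(d.lower().split())))
    return ranked[:k]

def augment(masked_text, corpus):
    """Prepend retrieved knowledge before the text goes to the masked LM."""
    docs = retrieve(masked_text, corpus)
    return " ".join(docs) + " [SEP] " + masked_text

corpus = [
    "Buckingham Palace is the London residence of the British monarchy.",
    "The official currency of the United Kingdom is the Pound.",
    "Mount Everest is the highest mountain on Earth.",
]
print(augment("We paid twenty __ at the Buckingham Palace gift shop.", corpus))
```

With the retrieved sentence prepended, filling the blank with "pounds" becomes a reading-comprehension problem rather than a memorization problem, which is the whole point of the approach.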
Forwarded from Находки в опенсорсе ("Finds in Open Source")
A utility tool powered by fzf for using git interactively.
This tool is designed to help you use git more efficiently. It's lightweight and easy to use.
Also integrates with: diff-so-fancy, delta, bat, emoji-cli.
https://github.com/wfxr/forgit
#shell #git
Forwarded from Graph Machine Learning
The Quantum Graph Recurrent Neural Network
This demonstration by pennylane investigates quantum graph recurrent neural networks (QGRNN), which are the quantum analogue of a classical graph recurrent neural network, and a subclass of the more general quantum graph neural network ansatz. Both the QGNN and QGRNN were introduced in this paper (2019) by Google X.
Clothing Dataset: Call for Action
Help to collect a public-domain dataset with images of clothes
Medium post: https://medium.com/data-science-insider/clothing-dataset-call-for-action-3cad023246c1
#dataset #clothing #cv #calltoarms
GANs used to create photorealistic images of Roman Emperors
Project post: https://voshart.com/ROMAN-EMPEROR-PROJECT
Medium: https://medium.com/@voshart/photoreal-roman-emperor-project-236be7f06c8f
#GAN #Art #history #DL
mingpt – a minimal pytorch re-implementation of the openai generative pretrained transformer training
by karpathy
small, clean, interpretable and educational, as most of the currently available ones are a bit sprawling. this implementation is approximately 300 lines of code, including boilerplate and a totally unnecessary custom causal self-attention module. all that's going on is that a sequence of indices goes into a sequence of transformer blocks, and a probability distribution of the next index comes out.
with a bpe encoder, distributed training and maybe fp16 this implementation may be able to reproduce gpt-1/gpt-2 results, though he hasn't tried $$$. gpt-3 is likely out of reach, as his understanding is that it does not fit into gpu memory and requires a more careful model-parallel treatment.
https://twitter.com/karpathy/status/1295410274095095810?s=20
#nlp #karpathy #gpt #torch
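the "indices in, next-index distribution out" flow can be sketched shape-wise in numpy. this is a single toy attention pass with random weights, purely to show the data flow; minGPT itself is pytorch, with learned weights, multiple heads, MLPs and layer norms around this core:

```python
import numpy as np

rng = np.random.default_rng(0)
vocab, d, T = 50, 16, 8                      # vocab size, model dim, seq length

emb = rng.normal(0, 0.02, (vocab, d))        # token embedding table
w_out = rng.normal(0, 0.02, (d, vocab))      # projection back to vocab logits

def causal_attention(x):
    """Self-attention with a causal mask: position t only sees positions <= t."""
    scores = x @ x.T / np.sqrt(d)
    future = np.triu(np.ones((len(x), len(x)), dtype=bool), k=1)
    scores[future] = -1e9                    # block attention to the future
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)
    return w @ x

idx = rng.integers(0, vocab, size=T)         # a sequence of indices goes in...
h = causal_attention(emb[idx])               # ...through a (toy) transformer block...
logits = h[-1] @ w_out                       # ...and the last position's logits
probs = np.exp(logits - logits.max())
probs /= probs.sum()                         # ...become a distribution over the next index
print(probs.shape)  # (50,)
```

sampling from `probs` and appending the sampled index back onto the sequence is the entire generation loop.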
Twitter
Andrej Karpathy
I wrote a minimal/educational GPT training library in PyTorch, am calling it minGPT as it is only around ~300 lines of code: https://t.co/79S9lShJRN +demos for addition and character-level language model. (quick weekend project, may contain sharp edges)