Data Phoenix – Telegram
Data Phoenix
1.45K subscribers
641 photos
3 videos
1 file
1.33K links
Data Phoenix is your best friend in learning and growing in the data world!
We publish digests, organize events, and help expand the frontiers of your knowledge in ML, CV, NLP, and other aspects of AI. Idea and implementation: @dmitryspodarets
Webinar "Introduction to Graph Neural Networks"
Date: April 6 at 16:00 CET
Speaker: Ekaterina Sirazitdinova, Senior Data Scientist at NVIDIA
Graph Neural Networks (GNNs) are AI models designed to derive insights from unstructured data described by graphs. Across different segments and industries, GNNs find applications such as molecular analysis, drug discovery, prediction of stock market developments, thermodynamics analysis, and even modelling of the human brain. Unlike conventional CNNs, GNNs address the challenge of working with data in irregular domains. In this talk, I will provide an introductory overview of the theory behind GNNs, take a closer look at the types of problems that GNNs are well suited for, and discuss several approaches to modelling unstructured problems as classification or regression at various levels.
Registration link: https://www.eventbrite.com/e/webinar-introduction-to-graph-neural-networks-tickets-578344823937
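The message-passing idea at the heart of GNNs can be sketched in a few lines of NumPy. The toy graph, random features, and GCN-style mean aggregation below are illustrative assumptions of ours, not material from the webinar:

```python
import numpy as np

# Hypothetical toy graph: 4 nodes in a ring, undirected edges.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
n_nodes, n_feats = 4, 3

# Adjacency matrix with self-loops, as in a basic GCN-style layer.
A = np.eye(n_nodes)
for i, j in edges:
    A[i, j] = A[j, i] = 1.0

# Row-normalize so each node averages over itself and its neighbours.
A_hat = A / A.sum(axis=1, keepdims=True)

rng = np.random.default_rng(0)
X = rng.normal(size=(n_nodes, n_feats))   # node features
W = rng.normal(size=(n_feats, n_feats))   # learnable weight matrix

# One message-passing step: aggregate neighbour features, transform, activate.
H = np.maximum(A_hat @ X @ W, 0.0)        # ReLU(A_hat · X · W)
print(H.shape)
```

Stacking several such layers lets information propagate over longer paths in the graph; a final readout (per node, per edge, or pooled over the whole graph) then supports the classification or regression tasks mentioned in the abstract.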
🔥1
📢 We are live and starting the Data Phoenix webinar "Introduction to Graph Neural Networks" with Ekaterina Sirazitdinova (Senior Data Scientist at NVIDIA). Join us: https://youtube.com/live/6jRGIGulQWs
🔥1
Attention, AI enthusiasts! How many of you are interested in sustainability?
The ESGentle team is throwing a big party this Saturday, and they're inviting all environmental enthusiasts, climate tech entrepreneurs, AI experts, investors, local artists, and wellness lovers to join them in an Earth Day celebration. They'll have techno music, environmental talks, and good vibes only ☺️ If you're down to grab a cocktail and mingle, please sign up here: https://lu.ma/1o1m79ms
👍3
BloombergGPT: A Large Language Model for Finance
BloombergGPT is a 50-billion-parameter language model trained on a wide range of financial data. It is validated on standard LLM benchmarks, open financial benchmarks, and a suite of internal benchmarks that most accurately reflect Bloomberg's intended usage.
https://dataphoenix.info/bloomberggpt-a-large-language-model-for-finance/
🔥10
[Article] DumBO: The simplest Bayesian optimizer
Learn the basic concepts of Bayesian optimization by building a simplistic optimizer.
https://dataphoenix.info/dumbo-the-simplest-bayesian-optimizer/
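In the same spirit as the article, the core Bayesian-optimization loop (surrogate model plus acquisition function) can be sketched very crudely. The 1-D objective, the nearest-neighbour "surrogate", and the UCB-style acquisition below are simplified stand-ins of ours, not the article's actual DumBO implementation:

```python
import numpy as np

rng = np.random.default_rng(42)

def f(x):
    # Hypothetical 1-D objective to maximize (peak at x = 0.3).
    return -(x - 0.3) ** 2

# Start with a few random observations.
X = list(rng.uniform(0, 1, 3))
y = [f(x) for x in X]

def ucb(x, beta=2.0):
    # Crude surrogate: predict with the nearest observation, and treat
    # distance to it as uncertainty (a stand-in for a Gaussian process).
    d = np.abs(np.array(X) - x)
    i = int(np.argmin(d))
    return y[i] + beta * d[i]

for _ in range(20):
    # Maximize the acquisition over random candidates, then observe f there.
    cands = rng.uniform(0, 1, 200)
    x_next = max(cands, key=ucb)
    X.append(x_next)
    y.append(f(x_next))

best = X[int(np.argmax(y))]
print(round(best, 2))
```

The acquisition trades off exploitation (high predicted value) against exploration (large distance to known points), which is the essential mechanism that a full Bayesian optimizer implements with a proper probabilistic surrogate.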
🔥3
[News] OpenAI's Bug Bounty Program
OpenAI is doing an excellent job of partnering with security experts to ensure its systems are robust and secure.
https://dataphoenix.info/openais-bug-bounty-program/
[Event] MLOps Zoomcamp 2023 by DataTalks.Club
Join DataTalks.Club's hands-on MLOps course and learn the must-know skills for the industry. The course covers ML pipelines, experiment tracking, model deployment & monitoring, best practices, and more.
https://dataphoenix.info/mlops-zoomcamp-2023-by-datatalks-club/
🔥4
[News] StableLM: Stability AI Language Models
Stability AI previously had success open-sourcing earlier language models together with EleutherAI, a non-profit research hub, and its release of StableLM builds on that experience.
https://dataphoenix.info/stablelm-stability-ai-language-models/
🔥1
[News] dstack 0.8: Local Hub, Hugging Face integration, and more
The new release of dstack includes Local Hub, which gives you access to a dashboard, job queue handling, secure cloud credential management, and more. In addition, you can now run dstack Hub on Hugging Face in one click.
https://dataphoenix.info/dstack-0-8-local-hub-hugging-face-integration-and-more/
🔥2🎉1
[News] Ray 2.4.0: Infrastructure for LLM training, tuning, inference, and serving
The new Ray release features various enhancements, including updates to Ray Data that improve stability, observability, and ease of use.
https://dataphoenix.info/ray-2-4-0-infrastructure-for-llm-training-tuning-inference-and-serving/
🔥2
[News] Google introduces PaLM 2
Google has introduced its next-generation large language model, PaLM 2, which offers improved multilingual, reasoning, and coding capabilities, and more.
https://dataphoenix.info/google-introduces-palm-2/
👍2🔥1
[Papers] Neural Preset for Color Style Transfer
Neural Preset is a technique that uses AI to generate and transfer color styles. It can extract color styles from given reference images, store them as presets, and apply them to other images and videos, producing output with target color styles. Check it out!
https://dataphoenix.info/neural-preset-for-color-style-transfer/
👍2
[Articles] AI for India: How Cognitive Computing Can Transform Industries
AI with cognitive capabilities is transforming sectors in India, such as healthcare, education, logistics, and agriculture. Startups like Locus.sh, Qure.ai, Cuemath and AgroStar are leveraging AI to optimize operations, provide accurate diagnoses, personalize learning, and enhance farming practices.
https://dataphoenix.info/ai-for-india-how-cognitive-computing-can-transform-industries/
[News] Google I/O 2023 AI Updates
Here are a few highlights of the new AI tools and technologies that were announced at Google I/O 2023.
https://dataphoenix.info/google-io-2023-ai-updates/
🔥1
[Papers] BundleSDF: Neural 6-DoF Tracking and 3D Reconstruction of Unknown Objects
BundleSDF is a near real-time method for 6-DoF tracking of an unknown object from a monocular RGBD video sequence, while performing neural 3D reconstruction of the object. The method significantly outperforms existing approaches.
https://dataphoenix.info/bundlesdf-neural-6-dof-tracking-and-3d-reconstruction-of-unknown-objects/
🔥1
[News] MosaicML Inference: Secure, Private, and Affordable Deployment for Large Models
With its Starter and Enterprise tiers, and support for multiple cloud platforms, MosaicML Inference provides a flexible and secure solution for deploying large machine learning models.
https://dataphoenix.info/mosaicml-inference-secure-private-and-affordable-deployment-for-large-models/
🔥1
[News] AI2 is developing a large language model called AI2 OLMo
AI2 is announcing OLMo, a 70-billion-parameter open language model made by scientists, for scientists.
https://dataphoenix.info/ai2-is-developing-a-large-language-model-called-ai2-olmo/
👍3
[Papers] Segment Anything Model
SAM is a promptable segmentation system with zero-shot generalization to unfamiliar objects and images, without the need for additional training. The model was trained on Meta AI's SA-1B dataset for 3–5 days on 256 A100 GPUs. Make sure you try it!
https://dataphoenix.info/segment-anything-model/
👍2
[Event] 3rd International Conference on Artificial Intelligence and Knowledge Processing (AIKP'23)
The School of Business at Woxsen University, in collaboration with the University of St. Thomas (MN, USA), is organizing the 3rd International Conference on Artificial Intelligence and Knowledge Processing (AIKP'23), a Scopus-indexed conference, on October 6–8, 2023.
https://dataphoenix.info/3rd-international-conference-on-artificial-intelligence-and-knowledge-processing-aikp23/
👍1🔥1
⚡️ Data Phoenix has presented a new issue of the digest!
Data Phoenix Digest is back every week after a short break. This issue covers upcoming webinars on Lakehouse and LLMs, video recordings of past webinars, GPT in 60 lines of NumPy, how to train your own LLM, SegGPT, PaLM-E, CodeT5+, StackLLaMA, and more.
https://dataphoenix.info/data-phoenix-digest-issue-6-2023/
🔥2