ML Research Hub
32.7K subscribers
3.99K photos
226 videos
23 files
4.29K links
Advancing research in Machine Learning – practical insights, tools, and techniques for researchers.

Admin: @HusseinSheikho || @Hussein_Sheikho
Prompt-Free Diffusion: Taking "Text" out of Text-to-Image Diffusion Models

The performance of text-to-image diffusion models depends heavily on the text prompt. Prompt-Free Diffusion removes the prompt entirely: all it needs is a reference image.

🖥 Github: https://github.com/shi-labs/prompt-free-diffusion

🔎 Demo: https://huggingface.co/spaces/shi-labs/Prompt-Free-Diffusion

Paper: https://arxiv.org/abs/2305.16223v1

📌 Dataset: https://paperswithcode.com/dataset/ffhq

https://news.1rj.ru/str/DataScienceT
Large Language Models as Tool Makers

This work takes an initial step toward removing the reliance on pre-existing, human-crafted tools by proposing a closed-loop framework, LLMs As Tool Makers (LATM), in which LLMs create their own reusable tools for problem-solving.

🖥 Github: https://github.com/ctlllll/llm-toolmaker

Paper: https://arxiv.org/pdf/2305.17126v1.pdf

📌 Dataset: https://paperswithcode.com/dataset/big-bench

https://news.1rj.ru/str/DataScienceT
🖥 A Practical Toolkit for Multilingual Question and Answer Generation

Multilingual and multidomain question-generation datasets, models, and a Python library for question and answer generation.

🖥 Github: https://github.com/asahi417/lm-question-generation

Paper: https://arxiv.org/abs/2305.17416v1

📌 Dataset: https://paperswithcode.com/dataset/squad
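A minimal usage sketch of the accompanying lmqg library (the package name and the TransformersQG API below are assumptions taken from the repository README); it generates question-answer pairs from a short passage:

# pip install lmqg
from lmqg import TransformersQG  # assumed import path

# English question-and-answer generation model; default settings are assumptions
model = TransformersQG(language="en")

context = (
    "William Turner was an English painter who specialised in watercolour "
    "landscapes. He is often known as William Turner of Oxford."
)

# expected to return a list of (question, answer) pairs for the passage
qa_pairs = model.generate_qa(context)
for question, answer in qa_pairs:
    print(question, "->", answer)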

https://news.1rj.ru/str/DataScienceT
🦙 BigTrans 🚀

BigTrans adapts LLaMA, which covers only 20 languages, and augments it with multilingual translation capability for more than 100 languages.

🖥 Github: https://github.com/ZNLP/BigTrans/tree/main

Paper: https://arxiv.org/abs/2305.18098v1

📌 Dataset: https://paperswithcode.com/dataset/flores-200

https://news.1rj.ru/str/DataScienceT
🔥 GPT4Tools: Teaching LLM to Use Tools via Self-instruction

GPT4Tools is a centralized system that can control multiple visual foundation models. It is built on Vicuna (LLaMA) and finetuned on 71K self-built instruction-following data.

🖥 Github: https://github.com/stevengrove/gpt4tools

Paper: https://arxiv.org/abs/2305.18752v1

📌 Project: https://gpt4tools.github.io/

https://news.1rj.ru/str/DataScienceT
Introducing BERTopic Integration with the Hugging Face Hub

BERTopic provides a powerful tool for uncovering significant topics within text collections. With the new Hub integration, trained topic models can be pushed to and loaded from the Hugging Face Hub.

pip install bertopic

🤗 Hugging face: https://huggingface.co/blog/bertopic

🖥 Github: https://github.com/MaartenGr/BERTopic

Colab: https://colab.research.google.com/#fileId=https://huggingface.co/spaces/davanstrien/blog_notebooks/blob/main/BERTopic_hub_starter.ipynb

📌 Docs: https://maartengr.github.io/BERTopic/getting_started/quickstart/quickstart.html
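A minimal end-to-end sketch: fit BERTopic on a small corpus and optionally share it on the Hugging Face Hub (the repo id below is a placeholder, and pushing requires a huggingface_hub login):

from sklearn.datasets import fetch_20newsgroups
from bertopic import BERTopic

# small demo corpus
docs = fetch_20newsgroups(subset="train", remove=("headers", "footers", "quotes")).data[:2000]

# fit the topic model and inspect the discovered topics
topic_model = BERTopic(verbose=True)
topics, probs = topic_model.fit_transform(docs)
print(topic_model.get_topic_info().head())

# push the fitted model to the Hub (placeholder repo id)
# topic_model.push_to_hf_hub(repo_id="my-username/bertopic-20newsgroups")

# anyone can then reload it directly from the Hub
# loaded_model = BERTopic.load("my-username/bertopic-20newsgroups")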

https://news.1rj.ru/str/DataScienceT
Hiera: A Hierarchical Vision Transformer without the Bells-and-Whistles

Hiera is a hierarchical vision transformer that is fast, powerful, and, above all, simple. It outperforms the state-of-the-art across a wide array of image and video tasks while being much faster.

pip install hiera-transformer

🖥 Github: https://github.com/facebookresearch/hiera

Paper: https://arxiv.org/abs/2306.00989v1

📌 Dataset: https://paperswithcode.com/dataset/inaturalist
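A quick inference sketch, assuming the hiera-transformer package exposes the constructors and checkpoint names shown in the repository README (hiera_base_224 and the mae_in1k_ft_in1k checkpoint id are taken from there; adjust if they differ):

import torch
import hiera  # module name assumed from the package

# load a pretrained Hiera-B image classifier (checkpoint id assumed)
model = hiera.hiera_base_224(pretrained=True, checkpoint="mae_in1k_ft_in1k")
model.eval()

# dummy batch of one 224x224 RGB image; replace with real preprocessed images
x = torch.randn(1, 3, 224, 224)
with torch.no_grad():
    logits = model(x)

print(logits.shape)  # expected: (1, 1000) ImageNet-1k class logits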

https://news.1rj.ru/str/DataScienceT
Wuerstchen: Efficient Pretraining of Text-to-Image Models

A novel technique for text-to-image synthesis that unites competitive performance with unprecedented cost-effectiveness and ease of training on constrained hardware.

🖥 Github: https://github.com/dome272/wuerstchen

Paper: https://arxiv.org/abs/2306.00637v1

📌 Colab: https://colab.research.google.com/drive/1UTP9Xn2UIrVbAXyL-SKEvyLmgVWdw-Vy

https://news.1rj.ru/str/DataScienceT
If you’re a developer wanting to use large language model tools, our new course is for you.

You’ll learn how to use different prompts at various stages in the system-building process, strategies for parsing long documents, and much more!

Join for free:
https://learn.deeplearning.ai/chatgpt-building-system

More reaction = more posts

@CodeProgrammer ♥️
🔭 GRES: Generalized Referring Expression Segmentation

A new benchmark, GRES, that extends classic Referring Expression Segmentation (RES) to allow expressions referring to an arbitrary number of target objects.

🖥 Github: https://github.com/henghuiding/ReLA

Paper: https://arxiv.org/abs/2306.00968

🔎 Project: https://henghuiding.github.io/GRES/

📌 New dataset: https://github.com/henghuiding/gRefCOCO

https://news.1rj.ru/str/DataScienceT
🦍 Gorilla: Large Language Model Connected with Massive APIs

Gorilla is a finetuned LLaMA-based model that surpasses GPT-4 at writing API calls.

🖥 Github: https://github.com/ShishirPatil/gorilla

📕 Paper: https://arxiv.org/abs/2305.15334

🔗 Demo: https://drive.google.com/file/d/1E0k5mG1mTiaz0kukyK1PdeohJipTFh6j/view?usp=share_link

👉 Project: https://shishirpatil.github.io/gorilla/

⭐️ Colab: https://colab.research.google.com/drive/1DEBPsccVLF_aUnmD0FwPeHFrtdC0QIUP?usp=sharing
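A hedged sketch of querying a hosted Gorilla model through an OpenAI-compatible endpoint, following the pattern used in the project's Colab; the base URL and model name below are placeholders (check the repo for the actual hosted endpoints), and the snippet assumes the pre-1.0 openai client:

import openai  # openai<1.0 style client assumed

openai.api_key = "EMPTY"  # hosted demo endpoints typically need no key
openai.api_base = "http://<gorilla-host>:8000/v1"  # placeholder, see the repo/Colab

response = openai.ChatCompletion.create(
    model="gorilla-7b-hf-v1",  # assumed model identifier
    messages=[{"role": "user", "content": "I want to translate English text to Chinese."}],
)
# the model responds with a suggested API call for the task
print(response["choices"][0]["message"]["content"])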

https://news.1rj.ru/str/DataScienceT
Segment Anything 3D

SAM-3D: a toolbox that transfers 2D SAM segments onto 3D scene-level point clouds.

🖥 Github: https://github.com/pointcept/segmentanything3d

Paper: https://arxiv.org/abs/2306.03908v1

📌 Dataset: https://paperswithcode.com/dataset/scannet

https://news.1rj.ru/str/DataScienceT
🐼 PandaLM: Reproducible and Automated Language Model Assessment

PandaLM is a judge large language model trained to distinguish the superior model when given responses from several LLMs. Its focus extends beyond the objective correctness of responses, which is the main concern of traditional evaluation datasets.

🖥 Github: https://github.com/weopenml/pandalm

📕 Paper: https://arxiv.org/abs/2306.05087v1

🔗 Dataset: https://github.com/tatsu-lab/stanford_alpaca#data-release

https://news.1rj.ru/str/DataScienceT