PythonHub – Telegram
News & links about Python programming.
https://pythonhub.dev/
How prompt caching works - Paged Attention and Automatic Prefix Caching plus practical tips

Prompt caching in large language models (LLMs) is an optimization technique that stores and reuses the intermediate computational states (key-value caches) of repeated prompt prefixes, significantly reducing redundant processing and speeding up responses. By breaking prompts into fixed-size token blocks and using a hash-based prefix-matching system, prompt caching lets multiple requests that share a common prefix reuse the same cached computation.

https://sankalp.bearblog.dev/how-prompt-caching-works
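A toy sketch of the block-hashing scheme the post describes: tokens are grouped into fixed-size blocks, each block's hash is chained to the previous block's hash, so a cache hit on a block implies the entire prefix up to it is identical. Block size, hashing, and cache contents are illustrative; real engines like vLLM store per-block KV tensors and use different block sizes.

```python
import hashlib

BLOCK_SIZE = 4  # vLLM defaults to 16 tokens per block; 4 keeps the demo readable

def block_hashes(tokens, block_size=BLOCK_SIZE):
    """Chain-hash full blocks of tokens: each block's key depends on every
    token before it, so matching a hash implies an identical prefix."""
    hashes, prev = [], ""
    full = len(tokens) - len(tokens) % block_size  # partial blocks are not cached
    for i in range(0, full, block_size):
        block = " ".join(map(str, tokens[i:i + block_size]))
        prev = hashlib.sha256((prev + "|" + block).encode()).hexdigest()
        hashes.append(prev)
    return hashes

class PrefixCache:
    def __init__(self):
        self.kv = {}  # block hash -> placeholder for that block's KV tensors

    def match(self, tokens):
        """Return how many leading tokens already have cached KV states."""
        matched = 0
        for h in block_hashes(tokens):
            if h not in self.kv:
                break
            matched += BLOCK_SIZE
        return matched

    def insert(self, tokens):
        for h in block_hashes(tokens):
            self.kv.setdefault(h, object())

cache = PrefixCache()
system = list(range(12))                      # shared system-prompt tokens
cache.insert(system + [100, 101])             # first request populates the cache
hit = cache.match(system + [200, 201, 202])   # second request shares the prefix
```

Here `hit` is 12: all three full blocks of the shared system prompt are reused, and only the divergent tail needs fresh computation.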
Can LLMs give us AGI if they are bad at arithmetic?

Wes McKinney's post questions whether large language models (LLMs) can achieve artificial general intelligence (AGI) given their persistent struggles with basic arithmetic tasks like adding single-digit numbers, even in top models. Through experiments and analysis, he shows that LLMs perform inconsistently on simple math (e.g., summing ~10 numbers), and argues that this inconsistency points to deeper limitations in how they reason.

https://wesmckinney.com/blog/llms-arithmetic/
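The experiment is easy to reproduce; a minimal harness might look like the sketch below, where `ask_llm` is a hypothetical placeholder for whatever model client you use (not something from the post).

```python
import random

def make_sum_task(n=10, lo=0, hi=999, seed=0):
    """Generate n random integers and their true sum."""
    rng = random.Random(seed)
    nums = [rng.randint(lo, hi) for _ in range(n)]
    return nums, sum(nums)

def score(model_answer: str, expected: int) -> bool:
    """Accept the answer if the last integer in the reply equals the true sum."""
    digits = "".join(ch if ch.isdigit() else " " for ch in model_answer).split()
    return bool(digits) and int(digits[-1]) == expected

nums, expected = make_sum_task()
prompt = "Add these numbers and reply with the total: " + ", ".join(map(str, nums))
# reply = ask_llm(prompt)        # hypothetical LLM call; swap in your client
# correct = score(reply, expected)
```

Running many seeds against a model and tracking the accuracy is all it takes to see the inconsistency the post describes.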
Modernising Django Packages Without Breaking Everything

To successfully modernize a mature Django package without breaking user code, the maintainer should phase in new tools to consolidate configuration into a single pyproject.toml file. Key strategies involve streamlining the developer experience with fast tools like uv and Ruff, using a Justfile for memorable commands, and automating releases with Towncrier for clean changelog management.

https://lincolnloop.com/blog/modernising-django-packages-without-breaking-everything/
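For reference, the consolidated configuration might look roughly like this; the package name and individual settings are illustrative, not taken from the post.

```toml
[build-system]
requires = ["hatchling"]          # or setuptools; the post doesn't prescribe a backend
build-backend = "hatchling.build"

[project]
name = "my-django-package"        # illustrative name
requires-python = ">=3.10"
dynamic = ["version"]

# Linting and formatting via Ruff, configured in the same file
[tool.ruff]
line-length = 88

# Towncrier assembles the changelog from per-change news fragments
[tool.towncrier]
directory = "changes"
filename = "CHANGELOG.rst"
```

Keeping everything in one `pyproject.toml` means contributors find all tool configuration in a single, predictable place.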
Django 6.0 released

Django 6.0 introduces major new features: built-in support for template partials (for cleaner, reusable templates), a native background-task framework, a built-in Content Security Policy (CSP) system, and a more modern, Unicode-friendly email API. This release marks the end of mainstream support for Django 5.2; developers are encouraged to upgrade to 6.0.

https://www.djangoproject.com/weblog/2025/dec/03/django-60-released/
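A sketch of what template partials look like, based on the `django-template-partials` tags that Django 6.0 adopted; check the release notes for the exact syntax.

```html
{# templates/items.html — define a named, reusable fragment inline #}
{% partialdef item-row %}
  <li>{{ item.name }}</li>
{% endpartialdef %}

<ul>
  {% for item in items %}
    {% partial item-row %}
  {% endfor %}
</ul>
```

The fragment stays next to the template that uses it, instead of living in a separate include file.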
Can Google's ADK Replace LangChain and MCP?

Christina Lin (Google) demos the Agent Development Kit (ADK), an open-source Python framework for agentic pipelines: it assembles LLMs, tools (via MCP servers or function calling), and prompts into complex workflows such as version control or Friday-night bookings, with grounding on cited real-time data to cut hallucinations and token costs.

https://www.youtube.com/watch?v=nMnQ63YkftE
Stop Hardcoding Everything: Use Dependency Injection

The video explains dependency injection (DI) in Python with a practical data-pipeline example, showing how DI improves code flexibility, testability, and separation of concerns by injecting dependencies like loaders, transformers, and exporters rather than hardcoding them. It covers manual DI with functions and classes, abstraction with protocols, and building a simple DI container.

https://www.youtube.com/watch?v=Xhzn1eAxoXk
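A compact sketch of the manual-DI-with-protocols idea (class names and pipeline shape are illustrative, not taken from the video): the pipeline depends only on the `Loader` and `Exporter` protocols, so tests can inject in-memory stand-ins.

```python
from typing import Protocol

class Loader(Protocol):
    def load(self) -> list[dict]: ...

class Exporter(Protocol):
    def export(self, rows: list[dict]) -> None: ...

class ListLoader:
    """Trivial loader; a real one might read CSV or a database."""
    def __init__(self, rows: list[dict]):
        self.rows = rows
    def load(self) -> list[dict]:
        return self.rows

class MemoryExporter:
    """Collects output in memory — handy as a test double."""
    def __init__(self):
        self.out: list[dict] = []
    def export(self, rows: list[dict]) -> None:
        self.out.extend(rows)

def run_pipeline(loader: Loader, transform, exporter: Exporter) -> None:
    # The pipeline never constructs its dependencies; they are injected.
    exporter.export([transform(row) for row in loader.load()])

# Wiring happens at the edge of the program, not inside the pipeline:
exporter = MemoryExporter()
run_pipeline(ListLoader([{"x": 1}, {"x": 2}]), lambda r: {"x": r["x"] * 10}, exporter)
```

Swapping `ListLoader` for a database loader, or `MemoryExporter` for an S3 exporter, requires no change to `run_pipeline` itself.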
Context Data Platform for Self-learning Agents

A single place for agents to store, observe, and learn from context. Designed to simplify context engineering and improve agent reliability and task success rates.

https://github.com/memodb-io/Acontext
kubesdk — async-first, fully typed Python SDK for Kubernetes

Open-source Python SDK with fully typed models, async client, and multi-cluster support for Kubernetes automation.

https://github.com/puzl-cloud/kubesdk
We Got Claude to Fine-Tune an Open Source LLM

We gave Claude the ability to fine-tune language models using a new tool called Hugging Face Skills. Not just write training scripts, but actually submit jobs to cloud GPUs, monitor progress, and push finished models to the Hugging Face Hub. This tutorial shows how it works and how to use it yourself.

https://huggingface.co/blog/hf-skills-training
Learn NLP Research: 7 Papers Implemented

This video traces the evolution of neural machine translation from RNNs and LSTMs to attention mechanisms, Transformers, and multilingual models like GNMT. It includes PyTorch implementations of 7 landmark papers, mathematical explanations, and tools like Transformer Playground for hands-on learning.

https://www.youtube.com/watch?v=kRv2ElPNAdY
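The core building block behind several of those papers, scaled dot-product attention, fits in a few lines. A NumPy version for reference (the video's implementations use PyTorch):

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)  # subtract max for numerical stability
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V  (Vaswani et al., 2017)."""
    d_k = Q.shape[-1]
    weights = softmax(Q @ K.swapaxes(-1, -2) / np.sqrt(d_k))
    return weights @ V, weights

rng = np.random.default_rng(0)
Q = rng.standard_normal((2, 4))   # 2 queries, d_k = 4
K = rng.standard_normal((3, 4))   # 3 keys
V = rng.standard_normal((3, 4))   # 3 values
out, w = scaled_dot_product_attention(Q, K, V)
```

Each query's attention weights sum to 1 over the keys, and the output is the corresponding weighted mix of the values.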