Open-source findings: Python (Telegram channel)
Easy open-source tasks from the Python world

Chat: @opensource_findings_chat
🚀 New issue to ag2ai/faststream by @GrigoriyKuzevanov
📝 Bug: min_idle_time ignored when group and consumer are specified (#2678)


Describe the bug
When a StreamSub has both 'group' and 'consumer' as well as 'min_idle_time' specified, FastStream uses 'XREADGROUP' instead of 'XAUTOCLAIM'

How to reproduce
Include source code:

from faststream import FastStream
from faststream.redis import RedisBroker, StreamSub

broker = RedisBroker("redis://localhost:6379")


@broker.subscriber(
    stream=StreamSub(
        "orders",
        group="processors",
        consumer="claimer",
        min_idle_time=10000,  # Should trigger XAUTOCLAIM
    )
)
async def claiming_handler(msg):
    print("Should use XAUTOCLAIM, but uses XREADGROUP")


app = FastStream(broker)

Redis MONITOR output shows:

XREADGROUP GROUP processors claimer BLOCK 100 STREAMS orders >

Expected behavior

XAUTOCLAIM orders processors claimer 10000 0-0 COUNT 1

Observed behavior
I suppose the root cause is in faststream/redis/subscriber/usecases/stream_subscriber.py, in the _StreamHandlerMixin.start() method:

if stream.group and stream.consumer:  # ← checked FIRST
    # Uses XREADGROUP
    ...
elif self.stream_sub.min_idle_time is None:
    # Uses XREAD
    ...
else:
    # Uses XAUTOCLAIM ← never reached when group is set!
    ...

Or maybe I just misunderstand the logic.
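If the branch order is indeed the culprit, checking min_idle_time before the group/consumer pair would make the XAUTOCLAIM path reachable. A minimal sketch of the decision logic (the helper name and plain-string return values are illustrative, not FastStream's API):

```python
def pick_read_command(group, consumer, min_idle_time):
    """Return the Redis stream command a subscriber should use.

    Putting the min_idle_time check first keeps XAUTOCLAIM reachable
    even when both group and consumer are set.
    """
    if group and consumer and min_idle_time is not None:
        return "XAUTOCLAIM"  # claim messages pending longer than min_idle_time
    if group and consumer:
        return "XREADGROUP"  # regular consumer-group reads
    return "XREAD"  # plain stream reads


# The combination from the reproduction above should map to XAUTOCLAIM:
print(pick_read_command("processors", "claimer", 10000))  # XAUTOCLAIM
```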

Environment
Running FastStream 0.6.3 with CPython 3.12.3 on Linux


#bug #good_first_issue #faststream #ag2ai
sent via relator
🚀 New issue to ag2ai/faststream by @gaby
📝 bug: Usage of custom logger results in no logs (#2677)


Is your feature request related to a problem? Please describe.
The built-in logger is configured to always add colors, even when a custom logger is passed to FastStream. This is hardcoded here: https://github.com/ag2ai/faststream/blob/main/faststream/_internal/logger/logging.py#L80. This affects systems collecting logs from FastStream hosts: log lines generated by FastStream show up in raw text as "\033[36mDEBUG\033[0m" instead of DEBUG.

Describe the solution you'd like
Make the use_colors param configurable instead of a hardcoded value.

Describe alternatives you've considered
Writing a custom log parser.
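For illustration, here is what a switchable use_colors flag looks like on a plain logging.Formatter. This is a hypothetical stand-in for the hardcoded behavior, not FastStream's actual formatter class:

```python
import logging


class ColorOptionalFormatter(logging.Formatter):
    """Formatter with a configurable use_colors flag (illustrative only)."""

    CYAN = "\033[36m"
    RESET = "\033[0m"

    def __init__(self, use_colors: bool = True) -> None:
        super().__init__("%(levelname)s %(message)s")
        self.use_colors = use_colors

    def format(self, record: logging.LogRecord) -> str:
        if self.use_colors:
            # This is what produces "\033[36mDEBUG\033[0m" in collected logs.
            record.levelname = f"{self.CYAN}{record.levelname}{self.RESET}"
        return super().format(record)


record = logging.LogRecord("app", logging.DEBUG, __file__, 1, "hello", None, None)
print(ColorOptionalFormatter(use_colors=False).format(record))  # DEBUG hello
```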


#enhancement #good_first_issue #faststream #ag2ai
🚀 New issues to the taskiq-python project

Hello to everyone interested in contributing to open-source projects! We appreciate your enthusiasm and are asking for your help.

The taskiq-python project aims to migrate from Poetry to UV and drop support for Python 3.9. Since Taskiq has many different repositories, we would greatly appreciate your assistance. Here's a list of issues:

- https://github.com/taskiq-python/taskiq-fastapi/issues/24
- https://github.com/taskiq-python/taskiq-redis/issues/108
- https://github.com/taskiq-python/taskiq-psqlpy/issues/10
- https://github.com/taskiq-python/taskiq-valkey/issues/3
- https://github.com/taskiq-python/taskiq-pipelines/issues/28
- https://github.com/taskiq-python/taskiq-litestar/issues/3
- https://github.com/taskiq-python/taskiq-aiogram/issues/4
- https://github.com/taskiq-python/taskiq-aiohttp/issues/7
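For orientation, the Poetry-to-uv part of such a migration typically means replacing the [tool.poetry] tables with standard PEP 621 metadata (which uv reads natively) and raising requires-python. A hedged sketch; the project name, version, dependencies, and build backend below are placeholders, not taken from any of the repos above:

```toml
# Before (Poetry-specific, roughly):
#
#   [tool.poetry]
#   name = "taskiq-something"
#   [tool.poetry.dependencies]
#   python = "^3.9"

# After: standard PEP 621 metadata understood by uv.
[project]
name = "taskiq-something"
version = "0.1.0"
requires-python = ">=3.10"  # Python 3.9 support dropped
dependencies = ["taskiq"]

[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
```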

#good_first_issue #taskiq #enhancement
🚀 New issue to ag2ai/faststream by @nectarindev
📝 feat: merge `Broker(context=...)` and `FastStream(context=...)` at broker-level (#2693)


I recently migrated my project to faststream 0.6. I was very interested in how I could add my dependencies to the context. Prior to version 0.6, I did something like this:

from faststream.annotations import ContextRepo
from faststream.kafka import KafkaBroker
from faststream.utils.context import context

broker = KafkaBroker()


@broker.subscriber("my_topic", group_id="my_group")
async def handle(
    context: ContextRepo,
):
    print("dependency: ", context.get("dependency"))  # 42


async def lifespan(*args, **kwargs):
    context.set_global("dependency", 42)

    await broker.start()
    try:
        yield
    finally:
        await broker.stop()

I launched the broker as part of my application in lifespan without using the FastStream class.

For version 0.6, I saw examples where it was suggested to pass the context to FastStream, but that solution did not suit me. I discovered that the broker also accepts context, and that solves my problem:

broker = KafkaBroker(context=ContextRepo({"dependency": 42}))

...

async def lifespan(*args, **kwargs):
    await broker.start()
    try:
        yield
    finally:
        await broker.stop()

But I also discovered that if I create a FastStream instance, its context will be used, even though I didn't use it to start the broker.

from fastapi import FastAPI
from faststream import ContextRepo, FastStream
from faststream.kafka import KafkaBroker

broker = KafkaBroker(context=ContextRepo({"broker_dependency": 2}))
app = FastStream(broker, context=ContextRepo({"application_dependency": 1}))


@broker.subscriber("my_topic", group_id="my_group")
async def handle(
    context: ContextRepo,
):
    print("broker_dependency: ", context.get("broker_dependency"))  # None
    print("application_dependency: ", context.get("application_dependency"))  # 1


async def lifespan(*args, **kwargs):
    await broker.start()
    try:
        yield
    finally:
        await broker.stop()


asgi = FastAPI(lifespan=lifespan)

I'm not sure that's normal behavior. It would make much more sense if only the broker's dependency were available.
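One way to picture the precedence the author expects is a lookup chain where broker-level entries shadow application-level ones (the author would go further and use only the broker's context). A plain-dict sketch of that idea, with dicts standing in for ContextRepo; this is not FastStream's implementation:

```python
from collections import ChainMap

broker_context = {"broker_dependency": 2}
application_context = {"application_dependency": 1}

# Broker-level entries are listed first, so they win on key collisions,
# while application-level entries stay visible as a fallback.
merged = ChainMap(broker_context, application_context)

print(merged.get("broker_dependency"))       # 2 (not None, as currently observed)
print(merged.get("application_dependency"))  # 1
```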

⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯

Running FastStream 0.6.3 with CPython 3.12.4 on Linux


#enhancement #good_first_issue #core #faststream #ag2ai
🚀 New issue to wemake-services/django-modern-rest by @sobolevn
📝 Unskip `test_msgspec` files on 3.14 with new released version (#334)


msgspec@0.20.0 is out and was updated in #333. Now all tests in test_plugins/test_msgspec can be executed on 3.14 as well.
We need a PR with the fix that removes all pytest.skip directives for 3.14 from these files.
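The directives to remove presumably follow the standard version-gated skip pattern, something like this (the marker name and reason text here are guesses, not copied from the repo):

```python
import sys

import pytest

# A directive of roughly this shape can now be deleted from the
# test_msgspec files, since msgspec 0.20.0 supports Python 3.14.
requires_msgspec = pytest.mark.skipif(
    sys.version_info >= (3, 14),
    reason="msgspec does not support Python 3.14",
)


@requires_msgspec
def test_msgspec_roundtrip() -> None:
    ...  # the real tests import and exercise msgspec here
```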


#enhancement #good_first_issue #help_wanted #django_modern_rest
🚀 New issue to wemake-services/wemake-python-styleguide by @luminoso
📝 False positive for WPS457 when using asyncio to control loops (#3573)


What's wrong

For the code:

import asyncio


async def infinite_loop() -> None:
    try:
        while True:
            await asyncio.sleep(1)
            print("I'm alive. And doing work.")
    except asyncio.CancelledError:
        print("Loop cancelled")


t = asyncio.create_task(infinite_loop())
await asyncio.sleep(5)
t.cancel()

WPS457: Found an infinite while loop is raised. But the infinite loop is handled, just not within the while loop itself: the task is cancelled from outside.

How it should be

Not to raise WPS457.

I'm not 100% sure whether this pattern is best practice. Also, the fix on the linter side may be too complex, and it might be easier to just add a noqa comment in the code.
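One conceivable way for the rule to special-case this pattern: treat a `while True` as handled when an enclosing try catches asyncio.CancelledError. A toy AST sketch of that check (hypothetical logic, not WPS's implementation; it only recognizes the `asyncio.CancelledError` attribute form):

```python
import ast

CODE = '''
import asyncio


async def infinite_loop() -> None:
    try:
        while True:
            await asyncio.sleep(1)
    except asyncio.CancelledError:
        print("Loop cancelled")
'''


def catches_cancelled_error(handler: ast.ExceptHandler) -> bool:
    # Only matches `except asyncio.CancelledError`; a real rule would also
    # resolve bare names and exception tuples.
    t = handler.type
    return isinstance(t, ast.Attribute) and t.attr == "CancelledError"


def unhandled_infinite_loops(tree: ast.AST) -> list[int]:
    """Line numbers of `while True` loops with no enclosing try that
    catches CancelledError."""
    protected: set[int] = set()
    for node in ast.walk(tree):
        if isinstance(node, ast.Try) and any(
            catches_cancelled_error(h) for h in node.handlers
        ):
            protected.update(
                id(child) for child in ast.walk(node) if isinstance(child, ast.While)
            )
    return [
        node.lineno
        for node in ast.walk(tree)
        if isinstance(node, ast.While)
        and isinstance(node.test, ast.Constant)
        and node.test.value is True
        and id(node) not in protected
    ]


print(unhandled_infinite_loops(ast.parse(CODE)))  # [] -> handled, no report
```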

Flake8 version and plugins

{
  "platform": {
    "python_implementation": "CPython",
    "python_version": "3.13.9",
    "system": "Linux"
  },
  "plugins": [
    {
      "plugin": "mccabe",
      "version": "0.7.0"
    },
    {
      "plugin": "pycodestyle",
      "version": "2.14.0"
    },
    {
      "plugin": "pyflakes",
      "version": "3.4.0"
    },
    {
      "plugin": "wemake-python-styleguide",
      "version": "1.4.0"
    }
  ],
  "version": "7.3.0"
}

pip information

pip 25.3 from /var/home/luminoso/.local/lib/python3.14/site-packages/pip (python 3.14)
anyio==4.11.0
archspec==0.2.5
argcomplete==3.6.3
asttokens==3.0.0
attrs==25.4.0
boto3==1.42.4
botocore==1.42.4
Brlapi==0.8.7
Brotli==1.1.0
certifi==2025.7.9
charset-normalizer==3.4.3
click==8.1.7
conda==25.11.0
conda-package-handling==2.4.0
conda_package_streaming==0.11.0
cupshelpers==1.0
dasbus==1.7
dbus-python==1.4.0
decorator==5.2.1
distro==1.9.0
executing==2.2.1
fedora-third-party==0.10
file-magic==0.4.0
frozendict==2.4.6
gbinder-python==1.1.2
h11==0.16.0
httpcore==1.0.9
httpx==0.28.1
idna==3.10
ipython==8.37.0
jedi==0.19.2
jmespath==1.0.1
jsonpatch==1.33
jsonpointer==2.4
jsonschema==4.23.0
jsonschema-specifications==2024.10.1
langtable==0.0.69
louis==3.33.0
matplotlib-inline==0.1.7
menuinst==2.3.1
mercurial==7.1.1
mutagen==1.47.0
nftables==0.1
olefile==0.47
packaging==25.0
parso==0.8.5
pexpect==4.9.0
pillow==11.3.0
platformdirs==4.2.2
pluggy==1.6.0
progressbar2==4.5.0
prompt_toolkit==3.0.41
psutil==7.0.0
ptyprocess==0.7.0
pure_eval==0.2.3
PyAudio==0.2.13
pycairo==1.28.0
pyclip==0.7.0
pycosat==0.6.6
pycryptodomex==3.23.0
pycups==2.0.4
pyenchant==3.2.2
pygdbmi==0.11.0.0
Pygments==2.19.1
PyGObject==3.54.5
pyinotify==0.9.6
PySocks==1.7.1
python-dateutil==2.9.0.post0
python-linux-procfs==0.7.3
python-utils==3.9.1
pyudev==0.24.3
pyxdg==0.27
PyYAML==6.0.2
pyynl @ file:///builddir/build/BUILD/kernel-6.17.10-build/kernel-6.17.10/linux-6.17.10-300.fc43.x86_64/tools/net/ynl
RapidFuzz==3.12.2
referencing==0.36.2
regex==2025.10.23
requests==2.32.5
rpds-py==0.27.0
rpm==6.0.0
rpmautospec==0.8.3
rpmautospec-core==0.1.5
ruamel.yaml==0.18.16
ruamel.yaml.clib==0.2.12
s3transfer==0.16.0
selinux @ file:///builddir/build/BUILD/libselinux-3.9-build/libselinux-3.9/src
sentry-sdk==2.35.0
sepolicy @ file:///builddir/build/BUILD/policycoreutils-3.9-build/selinux-3.9/python/sepolicy
setools==4.6.0
setuptools==78.1.1
six==1.17.0
sniffio==1.3.1
sos==4.10.1
stack_data==0.6.3
tqdm==4.67.1
traitlets==5.14.3
typing_extensions==4.15.0
urllib3==2.5.0
wcwidth==0.2.13
websockets==15.0.1
yt-dlp==2025.10.22
zstandard==0.25.0

OS information

$ lsb_release -a
LSB Version: n/a
Distributor ID: Fedora
Description: Fedora Linux 43.20251209.0 (Kinoite)
Release: 43
Codename: n/a


#bug #help_wanted #levelstarter #good_first_issue #wemake_python_styleguide #wps
🚀 New issue to ag2ai/faststream by @esinevgeny
📝 Bug: ValueError while calling redis client.xautoclaim (#2736)


Describe the bug
ValueError while calling xautoclaim

How to reproduce
Use the following route and launch FastStream; if there are messages with PENDING status in the stream, the issue will be observed.

@broker.subscriber(
    stream=StreamSub(
        "test:test",
        group="workers",
        consumer="worker",
        min_idle_time=5000,
    )
)
async def worker(msg: RedisStreamMessage, redis: Redis):
    logger.error(f"Claim {msg.correlation_id}")
    await msg.ack(redis=redis, group="workers")

Actual
Traceback (most recent call last):
  File "/venv/lib64/python3.12/site-packages/faststream/redis/subscriber/usecases/basic.py", line 91, in _consume
    await self._get_msgs(*args)
  File "/venv/lib64/python3.12/site-packages/faststream/redis/subscriber/usecases/stream_subscriber.py", line 341, in _get_msgs
    for stream_name, msgs in await read(self.last_id):
                             ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/venv/lib64/python3.12/site-packages/faststream/redis/subscriber/usecases/stream_subscriber.py", line 140, in read
    (next_id, messages, _) = stream_message
    ^^^^^^^^^^^^^^^^^^^^^^
ValueError: not enough values to unpack (expected 3, got 2)
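The unpack error suggests the XAUTOCLAIM reply is sometimes a 2-tuple (next_id, messages) and sometimes a 3-tuple whose third element lists entry IDs deleted from the pending entries list (Redis 7.0 added that element). A defensive normalization sketch; the helper is hypothetical, not the library's code:

```python
def unpack_xautoclaim_reply(reply):
    """Normalize an XAUTOCLAIM reply to (next_id, messages, deleted).

    Redis 7.0+ replies carry a third element listing entry IDs deleted
    from the pending entries list; older replies have only two elements.
    """
    if len(reply) == 2:
        next_id, messages = reply
        deleted = []
    else:
        next_id, messages, deleted = reply
    return next_id, messages, deleted


print(unpack_xautoclaim_reply((b"0-0", [])))  # (b'0-0', [], [])
print(unpack_xautoclaim_reply((b"0-0", [], [b"1-1"])))  # (b'0-0', [], [b'1-1'])
```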

Environment
Running FastStream 0.6.5 with CPython 3.12.11 on Linux
redis 7.1.0

Also checked on redis 5.3.0


#bug #good_first_issue #faststream #ag2ai
🚀 New issue to wemake-services/django-modern-rest by @sobolevn
📝 Support `pyrefly` (#367)


https://pyrefly.org is already quite stable.

We use it in https://github.com/typeddjango/django-stubs/blob/master/pyrefly.toml with no major complaints.
I really want to add this in CI as soon as possible.


#enhancement #good_first_issue #help_wanted #django_modern_rest
🚀 New issue to wemake-services/django-modern-rest by @sobolevn
📝 Create `json_dumps` internal helper (#383)


We have to dump json in some internal places of our code:

django-modern-rest/django_modern_rest/openapi/renderers/base.py

Lines 16 to 31 in 9e6132e

But right now we don't use msgspec there even when it is available, and we should :)
So, let's create a new internal/json file and make a helper there, then use it in base.py.
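A sketch of what such a helper might look like: prefer msgspec when importable, fall back to the stdlib. Function name and module layout are guesses, not the project's final API:

```python
import json

try:
    import msgspec.json

    def json_dumps(data) -> bytes:
        # msgspec encodes straight to bytes and is considerably faster.
        return msgspec.json.encode(data)

except ImportError:

    def json_dumps(data) -> bytes:
        # Stdlib fallback; encode to bytes for a consistent return type.
        return json.dumps(data).encode("utf-8")


print(json.loads(json_dumps({"openapi": "3.1.0"})))  # {'openapi': '3.1.0'}
```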


#enhancement #good_first_issue #help_wanted #django_modern_rest
We need to add meta-information to the new exceptions that are raised while building the dependency graph. These exceptions were added as part of the implementation of the new provider-activation feature.

https://github.com/reagento/dishka/issues/643
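A sketch of what "exceptions with meta-information" could look like: the error carries structured context alongside its message. Class and attribute names are purely illustrative, not dishka's API:

```python
class GraphBuildError(Exception):
    """Dependency-graph build error enriched with context
    (illustrative only; not dishka's actual exception class)."""

    def __init__(self, message: str, *, provider: str, dependency: str) -> None:
        self.provider = provider
        self.dependency = dependency
        super().__init__(
            f"{message} (provider={provider!r}, dependency={dependency!r})"
        )


err = GraphBuildError(
    "Cannot activate provider",
    provider="DatabaseProvider",
    dependency="Session",
)
print(err)  # Cannot activate provider (provider='DatabaseProvider', dependency='Session')
```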

#dishka