PHP Reddit – Telegram
PHP Reddit
Channel to sync with /r/PHP /r/Laravel /r/Symfony. Powered by awesome @r_channels and @reddit2telegram
Secure the database credentials in Jaxon DbAdmin with Infisical

Hi,

I just published a blog post on securely storing the credentials of databases managed with Jaxon DbAdmin in an Infisical server.

I used Infisical, but any other secret management service could be used instead.

https://www.jaxon-php.org/blog/2026/01/secure-the-jaxon-dbadmin-database-credentials-with-infisical.html

https://redd.it/1qr7n3f
@r_php
Live walkthrough: Skills in Laravel Boost 2.0

Hey all!

I’m doing a live stream tomorrow (1/29) at 11 AM ET with Pushpak Chhajed on Laravel Boost 2.0.

We’ll be walking through:

* what changed in Boost 2.0 and why
* what Skills are and how they give LLMs better context in Laravel apps
* how Skills and guidelines work together in real workflows

If you have questions about Boost or the updates, feel free to drop them here ahead of time or ask in chat during the stream!

Stream link:
[https://www.youtube.com/watch?v=NWQjC20rWLg](https://www.youtube.com/watch?v=NWQjC20rWLg)


https://redd.it/1qpyil8
@r_php
Non AI things

Is there anything out there that isn't related to AI these days?

I have been building with Laravel and other tooling for 9 years, but lately everything is related to AI.

https://redd.it/1qpoowj
@r_php
Laravel Cloud does not support static asset caching

Laravel Cloud's documentation states that it automatically applies edge caching via Cloudflare and that:

>Laravel Cloud uses a long cache lifespan to ensure your static assets are served at the edge as much as possible.

However, no matter what I did, Google Lighthouse and Pingdom would always complain that none of my static assets had any TTL set (either via `Cache-Control` or `Expires`) and I could verify their absence in the browser myself.

At first I thought it was the presence of the `Set-Cookie` header (which Laravel Cloud states will block caching), however this was set by Cloudflare and was outside of my control.

Today I finally got confirmation from the Laravel Cloud team that their documentation is wrong and that there's no TTL set at all.

https://preview.redd.it/w1hy28hjsigg1.png?width=848&format=png&auto=webp&s=20f74bed36be3a1c60086b179461853118f8925a

I trust they're working on a fix, but I also needed to share my frustration at the time I've wasted trying to fix an issue (that is affecting my SEO and user experience) that was outside of my control.

If you're in a similar boat, or are just trusting that Laravel Cloud is taking care of your asset caching for you, now you know.

https://preview.redd.it/5ig9ph6hsigg1.png?width=1422&format=png&auto=webp&s=43598b8b07edeff82f2f8658076fa2634f3d4979

(Note: This doesn't affect static assets, like images, that are stored in a bucket. You will need to set those headers yourself.)

https://redd.it/1qrbnoi
@r_php
Preprocessing php code with C preprocessor?

I have some php code, a SQLite3 client module, that has a mess of semver conditional logic in it for using more recent features (upsert, NOROWID, that sort of thing), because I have a few users with legacy server configs.

I’m thinking of using the venerable C preprocessor ( https://www.man7.org/linux/man-pages/man1/cpp.1.html ) and its #ifdef feature set to let me produce production versions of my code without the conditional logic, to make it smaller and faster for most of my users. This seems wiser than simply hacking out the legacy code.

This seems to work. I’ll need some CI/CD and installation stuff to deploy it.

**Are there any pitfalls to this that I might be missing?**

**Is there a better way to do this?**

I’m grateful for any advice.
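For anyone curious what this looks like in practice, here is a minimal sketch. The file names and the UPSERT example are mine, not the poster's; `-P` suppresses cpp's `# line` markers and `-traditional-cpp` keeps it from tokenizing the PHP as if it were C:

```shell
# A .php.in source file with an #ifdef guarding the legacy code path.
# (PHP treats lone "#" lines as comments, so the input is still readable PHP.)
cat > client.php.in <<'EOF'
<?php
function upsertUser(SQLite3 $db, string $email): void {
#ifdef LEGACY_SQLITE
    // fallback for servers whose SQLite predates UPSERT (3.24)
    $db->exec("INSERT OR REPLACE INTO users(email) VALUES(:email)");
#else
    $db->exec("INSERT INTO users(email) VALUES(:email) ON CONFLICT(email) DO NOTHING");
#endif
}
EOF

# Default (modern) production build: the legacy branch is stripped out.
cpp -P -traditional-cpp client.php.in > client.php

# Legacy build for the users on old server configs:
cpp -P -traditional-cpp -DLEGACY_SQLITE client.php.in > client_legacy.php

grep 'ON CONFLICT' client.php
grep 'INSERT OR REPLACE' client_legacy.php
```

One pitfall worth flagging: PHP's own `#` comment syntax collides with cpp directives, so any stray `# note` comment line in your source may be misread as an (invalid) directive. `//` comments are safe in traditional mode, and `/* */` comments may be stripped from the output.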

https://redd.it/1qri7te
@r_php
Laravel Fuse: A Circuit Breaker Package for Queue Jobs
https://redd.it/1qs7h6s
@r_php
Desktop applications using PHP

Hello :)

So Wednesday I was bored in a meeting and I had an idea. PHP can already be used to build applications, but only CLI ones.

Since we can use stdin and stdout, what if there were a middleware that could use those to communicate with a real desktop window?

I did some digging and prototyping, learned some Rust, raged on WSL about WebKitGTK and now I want to share the result with you: https://codeberg.org/Elvandar/toccata

It is clearly a proof of concept but I am curious to hear your thoughts
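The stdin/stdout bridge is easy to sketch on the PHP side. This is a generic illustration of the idea (newline-delimited JSON is my assumption, not toccata's actual protocol):

```php
<?php
declare(strict_types=1);

// Generic sketch of a stdin/stdout bridge (not toccata's actual protocol):
// the window host writes one JSON message per line; PHP replies the same way.
function handleMessage(string $line): string
{
    $msg = json_decode($line, true);

    $reply = match ($msg['type'] ?? null) {
        'click' => ['type' => 'log', 'text' => "button {$msg['id']} clicked"],
        default => ['type' => 'error', 'text' => 'unknown message type'],
    };

    return json_encode($reply);
}

// The event loop: read a message, write a reply.
// (Guarded so the script doesn't block when run from an interactive terminal.)
if (!stream_isatty(STDIN)) {
    while (($line = fgets(STDIN)) !== false) {
        fwrite(STDOUT, handleMessage($line) . PHP_EOL);
    }
}
```

The nice property of this shape is that the PHP side stays a plain CLI program; all the windowing concerns live in the host process.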


https://redd.it/1qsgk82
@r_php
Symfony's Service Container just got a massive developer experience upgrade thanks to PHP Attributes. No more complex YAML configuration to remember: just pure, clean PHP code.
https://tuhinbepari.medium.com/mastering-symfony-service-container-modern-php-attributes-edition-74d7113614c0
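For a taste of what the article covers: since Symfony 6.3, wiring that used to live in services.yaml can be declared on the class itself. The class names below are my own illustration, not from the article:

```php
<?php

use Symfony\Component\DependencyInjection\Attribute\AsAlias;
use Symfony\Component\DependencyInjection\Attribute\Autowire;

// Stand-in interface for the illustration.
interface MailerInterface {}

// Illustrative only: the interface alias and the parameter binding that
// would otherwise live in services.yaml are declared right on the class.
#[AsAlias(id: MailerInterface::class)]
final class SmtpMailer implements MailerInterface
{
    public function __construct(
        #[Autowire(param: 'app.smtp_dsn')]
        private readonly string $dsn,
    ) {}
}
```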

https://redd.it/1qsjd72
@r_php
Libretto: A Composer-compatible package manager written in Rust - 3-10x faster installs

Hey r/PHP!

I've been working on Libretto, a high-performance Composer-compatible package manager written in Rust. The goal is to be a drop-in replacement for Composer with significantly improved performance.

GitHub: https://github.com/libretto-pm/libretto

BENCHMARK RESULTS (Laravel 12 project, 162 packages)

Tested on AMD Ryzen 9 7950X, 32GB RAM, Linux 6.18

Cold Cache Install (no cache, fresh install):

- Composer 2.9.3: ~10 seconds average
- Libretto 0.1.0: ~3.3 seconds average
- Result: ~3x faster

Warm Cache Install (cache populated, vendor deleted):

- Composer 2.9.3: ~1.5 seconds average
- Libretto 0.1.0: ~0.4 seconds average
- Result: ~3.8x faster

dump-autoload:

- Composer 2.9.3: ~150ms
- Libretto 0.1.0: ~7.5ms
- Result: ~20x faster

dump-autoload --optimize:

- Composer 2.9.3: ~155ms
- Libretto 0.1.0: ~17ms
- Result: ~9x faster

HOW IT ACHIEVES THIS PERFORMANCE

- HTTP/2 Multiplexing: multiple parallel requests over a single TCP connection
- Adaptive Concurrency: up to 128 concurrent downloads vs Composer's fixed 12
- Content-Addressable Storage: pnpm-style global cache with hardlinks
- SIMD-accelerated JSON parsing: using sonic-rs
- Zero-copy deserialization: rkyv for cached data
- Rust's native performance: no interpreter overhead
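The content-addressable store is the same trick pnpm uses, and the mechanics are easy to demonstrate (paths here are illustrative, not Libretto's actual layout):

```shell
# Illustrative sketch of a content-addressable store: each file lives once
# under its content hash; "installing" is a hardlink, not a copy.
mkdir -p cas-demo/store cas-demo/vendor/acme/lib
printf '<?php // package payload\n' > cas-demo/payload.php

hash=$(sha256sum cas-demo/payload.php | cut -d' ' -f1)
mv cas-demo/payload.php "cas-demo/store/$hash"

# Install into vendor/ by hardlinking: both names share one inode, so a
# second project "install" costs no extra disk space or copy time.
ln "cas-demo/store/$hash" cas-demo/vendor/acme/lib/Payload.php

# Same inode number for both paths confirms no data was duplicated.
stat -c '%i' "cas-demo/store/$hash" cas-demo/vendor/acme/lib/Payload.php
```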

CURRENT LIMITATIONS (honest assessment)

- Alpha quality, not production ready yet
- Some Composer commands may not work identically
- Limited Composer plugin compatibility
- Some post-install scripts may behave differently
- Complex version constraints or private repos may have issues

WHAT WORKS WELL

- install / update / require / remove (core dependency management)
- dump-autoload (extremely fast)
- validate / audit
- PSR-4/PSR-0/classmap autoloading
- Packagist integration
- composer.lock compatibility

WHY BUILD THIS?

As projects grow larger (50+ dependencies), Composer's install times become noticeable, especially in CI/CD pipelines. The PHP ecosystem deserves tooling as fast as what JavaScript (pnpm), Python (uv), and Rust (cargo) developers enjoy.

LOOKING FOR FEEDBACK

- Would you try this in development environments?
- What features are must-haves before you'd consider it?
- Any specific pain points with Composer you'd like addressed?

The project is open source (MIT license). PRs and issues welcome!

https://redd.it/1qsqsn5
@r_php
How would you feel about native typed arrays in PHP today? (e.g., string[], UserClass[], UserEnum[], etc.)

Question: How would you feel about PHP adding native typed arrays like `string[]`/`int[]` so we can enforce element types at runtime without relying on PHPDoc + static analyzers? It would add explicitness to function signatures and make APIs cleaner than repeating `array` plus manual validation everywhere.
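For reference, the closest runtime enforcement the language offers today is variadic parameters, which type-check every element, but only at call boundaries:

```php
<?php
declare(strict_types=1);

final class User
{
    public function __construct(public readonly string $name) {}
}

// Variadics type-check each element at runtime: every argument must be a
// User, or PHP throws a TypeError at the call site.
function notifyAll(User ...$users): int
{
    return count($users);
}

// ...but this only works at the call boundary: you can't type a stored
// array property this way, and existing arrays must be splatted in.
$count = notifyAll(new User('alice'), new User('bob'));
echo $count, PHP_EOL;
```

That limitation (no typed array *values*, only typed call sites) is exactly the gap a native `string[]` would close.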

What are the downsides to something like this?

https://redd.it/1qstrzl
@r_php
Short PHP Fundamentals Quiz for Beginners

I built a beginner-friendly PHP fundamentals quiz that teaches the concepts as you move through it, rather than just testing you at the end.

Audio explanations are coming soon. Would love any feedback from folks learning or teaching PHP in the meantime - https://impressto.ca/php_quizzes.php#php-fundamentals

https://redd.it/1qt3dl5
@r_php
Soft Deletes w/ Cascade

I might be overcomplicating this, but here it goes.

I'm currently researching soft deletes and the related issues of cascading relationships and restoring records accurately. I've explored a few packages, but they don't resolve some issues I feel I might run into. For instance, large batches of soft deletes should be dispatched to queued jobs to preserve application performance. This carries its own complications, even more so when restoring that data. Currently I've been restoring related data with timestamps and model observers, but I'm looking for something a bit more 'magical'.

I'm curious what others have been doing, as most of what I've found is old information. Maybe those solutions have been good enough?

So tell me: how do you handle soft deletes on models with relationships, and how do you restore them when you need to?
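For comparison, the timestamp-matching approach you describe is roughly what I'd sketch too. Assuming Eloquent, with `Post`/`Comment` as stand-in models (and `Comment` also using `SoftDeletes`), the cascade can hang off the `softDeleted`/`restoring` hooks:

```php
<?php

use Illuminate\Database\Eloquent\Model;
use Illuminate\Database\Eloquent\SoftDeletes;

// Stand-in models for illustration; not a drop-in implementation.
class Post extends Model
{
    use SoftDeletes;

    protected static function booted(): void
    {
        // Cascade: stamp children with the parent's exact deleted_at so a
        // later restore can tell this cascade's rows apart from others.
        static::softDeleted(function (Post $post) {
            $post->comments()->update(['deleted_at' => $post->deleted_at]);
        });

        // restoring() fires before deleted_at is cleared, so the parent's
        // timestamp is still available to match the cascaded children.
        static::restoring(function (Post $post) {
            $post->comments()
                ->onlyTrashed()
                ->where('deleted_at', $post->deleted_at)
                ->restore();
        });
    }

    public function comments()
    {
        return $this->hasMany(Comment::class);
    }
}
```

Two properties worth noting: the mass `update()` skips `Comment` model events, which is what makes it cheap enough to push into a queued job for large trees, and the exact-timestamp match keeps comments a user deleted individually beforehand from being resurrected by the parent's restore.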

https://redd.it/1qt5fym
@r_php
I built a declarative ETL / Data Ingestion library for Laravel using Generators and Queues

Hi everyone,

I recently released a library to handle data ingestion (CSV, Excel, XML streams) in a more structured way than the typical "parse and loop" approach.

The goal was to separate the **definition** of an import from the **execution**.

**Key Architectural Decisions:**

1. **Memory Efficiency:** It utilizes Generators (`yield`) to stream source files line-by-line, keeping the memory footprint flat regardless of file size.
2. **Concurrency:** It chunks the stream and dispatches jobs to the Queue, allowing for horizontal scaling.
3. **Atomic Chunks:** It supports transactional chunking—if one row in a batch of 100 fails, the whole batch rolls back (optional).
4. **Observer Pattern:** It emits events for every lifecycle step (RowProcessed, ChunkProcessed, RunFailed) to decouple logging/notification logic.
5. **Error Handling:** Comprehensive error collection with context (row number, column, original value) and configurable failure strategies.
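Point 1 is the standard pattern and worth showing on its own. This standalone sketch is mine, not the library's API:

```php
<?php
declare(strict_types=1);

// Standalone sketch of generator-based CSV streaming (not this library's
// API): only one row is materialized at a time, so memory stays flat
// regardless of file size.
function csvRows(string $path): \Generator
{
    $handle = fopen($path, 'rb');
    if ($handle === false) {
        throw new RuntimeException("Cannot open {$path}");
    }
    try {
        while (($row = fgetcsv($handle)) !== false) {
            yield $row;
        }
    } finally {
        fclose($handle);
    }
}

file_put_contents('/tmp/demo.csv', "alice,active\nbob,inactive\n");

$active = 0;
foreach (csvRows('/tmp/demo.csv') as [$email, $status]) {
    $active += ($status === 'active') ? 1 : 0;
}
echo $active, PHP_EOL;
```

Chunking for the queue then just means buffering N yielded rows at a time before dispatching a job, which is what keeps the definition and the execution separable.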

It's primarily built for Laravel (using Eloquent), but I tried to keep the internal processing logic clean.

Here is a quick example of a definition:

// UserImporter.php
public function getConfig(): IngestConfig
{
    return IngestConfig::for(User::class)
        ->fromSource(SourceType::FTP, ['path' => '/daily_dump.csv'])
        ->keyedBy('email')
        ->mapAndTransform('status', 'is_active', fn ($val) => $val === 'active');
}

I'm looking for feedback on the architecture, specifically:

* How I handle the `RowProcessor` logic
* Memory usage patterns with large files (tested with 2GB+ CSVs)
* Error recovery and retry mechanisms

**Repository:** [https://github.com/zappzerapp/laravel-ingest](https://github.com/zappzerapp/laravel-ingest)

Thanks!

https://redd.it/1qw3obv
@r_php