Laravel E-commerce Templates Advice
Hi everyone,
I’m not very experienced with Laravel templates, but for a project I’d need some e-commerce solutions.
The goal would be to build a webshop with around 80,000 products (tires), so each product has many variants. The shop would need faceted filtering (vehicle type / tire width / height / diameter, etc.).
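For context, faceted filtering means combining independent attribute filters. A minimal, framework-free sketch (field names and data are invented; a real 80,000-product shop would typically push this into the database or a search index):

```php
<?php
// Invented data and field names; illustrates combining independent
// facet filters (vehicle type / width / height / diameter, etc.).

/** @param array<int, array<string, mixed>> $products */
function filterByFacets(array $products, array $facets): array
{
    // A product matches when every selected facet value is present.
    return array_values(array_filter(
        $products,
        fn (array $p): bool => array_diff_assoc($facets, $p) === []
    ));
}

$products = [
    ['sku' => 'T-001', 'width' => 205, 'height' => 55, 'diameter' => 16],
    ['sku' => 'T-002', 'width' => 225, 'height' => 45, 'diameter' => 17],
    ['sku' => 'T-003', 'width' => 205, 'height' => 60, 'diameter' => 16],
];

$matches = filterByFacets($products, ['width' => 205, 'diameter' => 16]);
// $matches holds T-001 and T-003
```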
Product data is received once a day via API in CSV format from multiple distributors. Customers should be able to register accounts. At the moment there is no online payment processor, but it’s possible that one will be needed later.
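A daily CSV feed of that size is usually imported as a stream so memory stays flat. A minimal sketch (column names are invented, and the commented upsert call is a placeholder, not a specific package's API):

```php
<?php
// Invented column names; the commented upsert is a placeholder, not a
// specific package's API. Streaming with fgetcsv keeps memory flat no
// matter how large the distributor feed is.

function importFeed(string $path): int
{
    $handle = fopen($path, 'rb');
    if ($handle === false) {
        throw new RuntimeException("Cannot open feed: $path");
    }

    $header = fgetcsv($handle, null, ',', '"', ''); // first row: column names
    $imported = 0;

    while (($row = fgetcsv($handle, null, ',', '"', '')) !== false) {
        $record = array_combine($header, $row);
        // e.g. upsert by SKU so prices/stock refresh daily:
        // Product::upsert([$record], ['sku'], ['price', 'stock']);
        $imported++;
    }

    fclose($handle);
    return $imported;
}

// Usage with a temp file standing in for the downloaded feed:
$tmp = tempnam(sys_get_temp_dir(), 'feed');
file_put_contents($tmp, "sku,price,stock\nT-001,89.90,12\nT-002,120.00,4\n");
$count = importFeed($tmp);
unlink($tmp);
echo $count; // prints 2
```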
Shop operators would of course have access to an admin interface, where they can manually edit product prices and, if necessary, the available stock quantity.
Which templates would you recommend in this space that could realistically handle these requirements?
Thanks in advance for the help, and sorry if I missed any important details; I'll add them in an edit if something comes up.
https://redd.it/1qj0yvw
@r_php
Anyone else seeing bias about AI among Laravel devs?
I was networking with some Laravel developers over the past few weeks, and I was struck by how polarized opinions are on how revolutionary AI is for back-end development.
What’s most shocking is the perspective difference among senior developers. Some seniors claim they’ve become 10x more productive, while others say it just generates a bunch of bugs and is useless in advanced tasks.
If you’re open to sharing ... what’s your experience level, and how much do you use AI in day-to-day coding (0–5)?
$aiUsage = [
    0 => 'never',
    1 => 'rarely',
    2 => 'sometimes',
    3 => 'often',
    4 => 'most of the time',
    5 => 'always',
];
https://redd.it/1qj4lqx
@r_php
Is Domain Driven Design just needless complexity? My limited experience with it has been mixed at best.
I don't have a lot of experience with DDD, so take this post with a grain of salt. It's personal experience rather than anything else and doesn't hold universal truth.
------
For the past 6-ish months I've worked on a DDD project with an established team of 5 people. I'm the new guy.
I have nothing to compare it to so I'll take their word for it.
I figured that as long as I'm working with it, I might as well educate myself on the matter. I read "Domain-Driven Design" by Eric Evans and "Implementing Domain-Driven Design" by Vaughn Vernon.
I liked Vernon's book a lot. It's more hands-on.
In theory DDD sounds good. It's clean, scalable, easy to work with, and blends business needs with coding well.
My experience in practice has been different.
I won't talk about the business needs and how business people communicate with devs, because I feel like people will have very, very different experiences.
I would, however, like to talk, at a high level, about the effects on the code.
In the project I work on, it just seems to add needless complexity for the sake of having "layers" and clean design.
I can't say I have any strong opinions on that, but I do not like writing code for the sake of more abstraction that doesn't really do anything (ironically, Vernon's book mentions this as one of the pitfalls).
Not to mention that the PR comments tend towards zealotry (sometimes, not all the time).
Even with a debugger the code can be hard to follow. There are 3-4 layers of abstraction even for simple queries to a db.
I feel like you need a team that already has DDD experience to actually implement DDD properly.
I'd like to hear other experiences with DDD. How well did it serve you?
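For readers without DDD exposure, a tiny invented illustration of the layering effect described above: a one-row lookup passing through a service and a repository interface before any data access happens.

```php
<?php
// Invented code, not from the post's project. Each layer is trivial on
// its own; the cost the post describes is the number of hops you make
// while reading or debugging a simple query.

interface UserRepository
{
    public function byId(int $id): ?array;
}

final class InMemoryUserRepository implements UserRepository
{
    public function __construct(private array $rows) {}

    public function byId(int $id): ?array
    {
        return $this->rows[$id] ?? null;
    }
}

final class UserService
{
    public function __construct(private UserRepository $repo) {}

    public function displayName(int $id): string
    {
        $row = $this->repo->byId($id);
        return $row ? $row['name'] : 'unknown';
    }
}

$service = new UserService(new InMemoryUserRepository([7 => ['name' => 'Ada']]));
echo $service->displayName(7); // prints Ada
```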
https://redd.it/1qi6d6j
@r_php
MoonShine 4: Laravel Admin Panel – Now with AI!
Hi everyone!
I've been working on MoonShine, an open-source admin panel for Laravel, for several years now, and I'm excited to announce MoonShine 4!
MoonShine is a simple, free admin panel for Laravel. It's good for both new and experienced users. It works with things like TailwindCSS, Laravel, and Alpine.js.
# What's new in MoonShine 4
Fast Building
Get a nearly complete admin panel right away. Don't waste time coding forms, tables, or buttons – just use the stuff that's already there. Build quickly, and you can even reuse parts of MoonShine in dashboards or other projects.
Simple Data (CRUD)
Making, reading, updating, and deleting things like users, orders, or articles is very easy. It works with databases or APIs just fine. It's easy to get started, and the code is readable.
AI Tools
* MoonVibe Generator. Build an admin panel from just one request! Simply say what you want, and it'll build a working Laravel admin panel with the database, models, migrations, and a nice look.
* Forty-Five Package. Need a user list page, a filter, and a button to add new users? Just ask, and you get a page with all of that. It actually generates code with Claude Code, it doesn't just use templates.
Fast Design Changes
Change colors, fonts, and spacing to match your style. There are over 20 ready-made color options, and you can even create your own.
Admin Panel in Telegram
Get to the admin panel from inside Telegram. You don't need a separate app!
Works with More Than Just Laravel
You can also use MoonShine in Symfony and Yii3 projects.
# Why Use It?
vs Filament: good for bigger projects; you can skip Livewire and Eloquent. Also, AI generation and Telegram support are ready out of the box.
vs Nova: It's free.
Try MoonShine out – you might like it!
Repo: [https://github.com/moonshine-software/moonshine](https://github.com/moonshine-software/moonshine)
I wrote more about MoonShine 4 in a [Medium article](https://medium.com/@thecutcode/moonshine-4-ai-powered-admin-panel-revolution-a76e80d2964b):
* How Forty-Five (AI assistant) is changing things
* Design token system and Tailwind 4
* Telegram MiniApp
* PhpStorm plugin
* and more
One last thing. Question:
How long does it take you to code an admin panel for a Laravel project?
https://redd.it/1qj2r8x
@r_php
Raspberry Pi 5 - Running Symfony: some benchmark results
I got a bit annoyed at Digital Ocean over a hobby site I'm running. The D.O. cost is just too high for something that is free and doesn't have heaps of users.
So I thought I'd grab a Pi 5 16GB and a 64GB high-speed SD card and see if it makes a good web server.
The real game changer has been using the Cursor CLI directly on the server.
1. I'd been trying the Claude Code version, but I found you can actually run Opus 4.5 through the Cursor CLI if you have a subscription. This way I don't need both Cursor and Claude.
2. The agent was able to do all the hard configuration and setup, running FrankenPHP, which works amazingly well.
3. The agent does an amazing job with my devops. Really loving this. So easy to get anything done, especially for a small hobby project like this.
I've used the agent (that's the Cursor CLI command to run any LLM model) to do my setup, but I've also asked it to profile my app's speed and improve it.
After talking to ChatGPT, I thought I'd try the standard Raspberry Pi 256GB NVMe drive. The drive was pretty cheap: $60 NZD, plus $25 for a HAT so I could mount it on top of the Pi.
With the NVMe drive I can do about 40+ requests/second for a super-heavy homepage (with some Redis caching). I've included some results below, summarised by Opus; the starting point was pretty low at 3.29 req/sec.
Some things I found fun.
1. So much fun working with an agent for devops. My skills are average but it was fun going through the motions of optimisation and performance ideas.
2. After deployment, Opus wrote me a great backup script and cron job that worked first time, with log file rotation. It then uploads my backups to a Digital Ocean Space (S3 equivalent). Wonderful.
3. It was great at running apache bench and tests and finding failing points. Good to see if any of the changes were working.
4. We did some fun optimisation around memory usage, tuning MySQL for this processor and RAM; the default configuration that gets installed is generally not tuned for the available RAM and CPU. So this probably helped a bit.
What I don't know yet: would it have been better to buy an Intel NUC100 or something? I like the Pi a lot, as they are always in stock at my computer store, so I can always find one quickly if things blow up. I do like how small the Pi is. I'm not sure about power consumption; not sure how to test it, but hopefully it's efficient enough. Good for a hobby project.
Generated by AI, but here are the details of setup and speed:
• Raspberry Pi 5 (16GB)
• Symfony application
• Caddy web server with FrankenPHP
• 64GB SD card (I think it's U10 high speed) -> upgraded to an NVMe drive (R.Pi branded 256GB standard one)
Starting Point - Baseline (SD Card, no optimizations)
| Concurrency | Req/sec | Avg Response |
|-------------|---------|--------------|
| 10 | 3.29 | 3.0s |
| 50 | 2.11 | 23.7s |
Pretty painful. The app was barely usable under any load.
Step 1: Caddy Workers (FrankenPHP)
Configured 8 workers to keep PHP processes alive and avoid cold starts:
| Concurrency | Req/sec | Avg Response |
|-------------|---------|--------------|
| 10 | 15.64 | 640ms |
| 100 | 12.21 | 8,191ms |
~5x improvement at low concurrency. Workers made a huge difference.
Step 2: Redis Caching - The Plot Twist
Added Redis for caching, expecting better performance. Instead:
| Config | 10 concurrent | 100 concurrent |
|----------------|---------------|----------------|
| No cache | 15.64 req/s | 12.21 req/s |
| Redis (Predis) | 2.35 req/s | 8.21 req/s |
| File cache | 2.25 req/s | 7.98 req/s |
Caching made it WORSE. Both Redis and file cache destroyed performance. The culprit? SD card I/O was
the bottleneck. Every cache read/write was hitting the slow SD card.
Step 3: NVMe Boot
Moved the entire OS
to an NVMe drive. This is where everything clicked:
| Concurrency | Req/sec | Avg Response | Per Request |
|-------------|---------|--------------|-------------|
| 1 | 10.64 | 94ms | 94ms |
| 10 | 39.88 | 251ms | 25ms |
| 50 | 41.13 | 1,216ms | 24ms |
| 100 | 40.71 | 2,456ms | 25ms |
| 200 | 40.87 | 4,893ms | 24ms |
Final Results: Baseline vs Optimized
| Concurrency | Before | After | Improvement |
|-------------|--------|-------|-------------|
| 10 | 3.29 | 39.88 | 12x faster |
| 50 | 2.11 | 41.13 | 19x faster |
https://redd.it/1qijmfj
@r_php
Optimizing PHP code to process 50,000 lines per second instead of 30
https://stitcher.io/blog/processing-11-million-rows
https://redd.it/1qhw5y0
@r_php
Production-ready multi-stage Laravel docker (FPM, Nginx, Migrator, Worker, PostgreSQL) with initial setup
https://github.com/MuhammadQuran17/laravel_production_docker
https://redd.it/1qiyy2o
@r_php
New Livewire 4.x Shift
With the official release of Livewire 4 last week, I (finally) made a Livewire Shift - Livewire 4.x Shift.
I've been wanting to make Shifts for Livewire for a while. With the release of Livewire 3, they had an internal tool that did a good enough job. However, there is no such tool for v4. Plus, I use Livewire on more of my own projects now, so I selfishly wanted the automation.
To build out the catalog for the Livewire Shifts, I'm going to backfill a Livewire 3.x Shift. I'm also going to create an MFC Converter. This will convert from class-based components (in Livewire 3) to multi-file components (in Livewire 4). From MFC, you may use the internal tool to convert to single-file components (SFC). However, MFCs seem to have broader support, at least coming from class-based components. Keep an eye out for those in the coming weeks.
https://redd.it/1qjtv4h
@r_php
I'm feeling overwhelmed and dealing with imposter syndrome. Could I get some feedback on my project progress and situation in general?
For the last two months I have been working on a project, just out of boredom and a lack of things to do in my dev job. I work for a CRM company (US based, but I am in Europe).
I am building a smaller scale CRM that focuses fully on customisability.
* Custom Modules
* Custom Fields (including custom enums)
* Custom Layouts (list layouts and record layouts)
* Custom Relationships
* Custom theme colours for each module (can also be turned off in favour of a universal theme)
Out of the box I have the usual Modules that are needed for a CRM such as Accounts, Contacts, Quotes, Invoices, Cases, Leads and Products.
My stack is: Laravel, Inertia, and Vue.
So this is the big picture, and I have been enjoying the challenge of solving architecture issues so far; the most challenging one was how to deal with custom fields. I ended up going with a JSON column in every module table that contains the data for each custom field.
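The JSON-column approach usually pairs with per-module field definitions that values are validated against before encoding. A small invented sketch (names and definition shape are assumptions, not the post's actual code):

```php
<?php
// Invented sketch: field definitions live per module; values are
// validated against them before being encoded into the JSON column.

function validateCustomFields(array $definitions, array $values): array
{
    $clean = [];
    foreach ($definitions as $name => $def) {
        if (!array_key_exists($name, $values)) {
            if (!empty($def['required'])) {
                throw new InvalidArgumentException("Missing field: $name");
            }
            continue;
        }
        $value = $values[$name];
        $ok = match ($def['type']) {
            'int'    => is_int($value),
            'string' => is_string($value),
            'enum'   => in_array($value, $def['options'], true),
        };
        if (!$ok) {
            throw new InvalidArgumentException("Invalid value for: $name");
        }
        $clean[$name] = $value;
    }
    return $clean;
}

$defs = [
    'industry' => ['type' => 'enum', 'options' => ['retail', 'saas'], 'required' => true],
    'seats'    => ['type' => 'int'],
];

$json = json_encode(validateCustomFields($defs, ['industry' => 'saas', 'seats' => 12]));
echo $json; // prints {"industry":"saas","seats":12}
```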
Anyway, I am at the point now where I need to decide whether this is a hobby project to put on my portfolio or actually building this thing into a real product.
I am happy with the functionality and how everything is coming together but I also feel like it perhaps is not that amazing nor interesting what I am creating. The market is saturated with CRMs ( I know that I work for a CRM company) but then again looking at the pricing of most of these CRMs it is INSANE what they are charging.
Our company charges USD 60 per user per month, with a 15-user minimum, for **the basic plan**. That is almost 11K a year. Yes, I know those CRMs are fully fledged and so on, but this plants a seed in my head that perhaps there is something here for smaller companies that need a CRM but cannot afford to spend that much on software.
So my idea would be to sell this as a fully hosted solution: for each customer I would host an instance on Hetzner (which would cost me around 2 EUR a month per instance, plus about 5 EUR a year for an optional domain registration) and sell it for 30-50 EUR a month to companies that need it.
The more I am writing this thread, the less related to PHP it becomes, I am sorry! But I have been working with PHP for 8 years now and have spent most of my professional life debugging other people's code.
Any thoughts on any of this rambling would be highly appreciated.
https://redd.it/1qjt5y2
@r_php
Advanced Query Scopes - Laravel In Practice EP2
https://www.youtube.com/watch?v=2yQBIfcDzkY
https://redd.it/1qiz7l4
@r_php
YouTube
Advanced Query Scopes - Laravel In Practice EP2
In this episode, Harris from Laravel News shows you exactly how to:
- Use Laravel 12's new #[Scope] attribute for clean query filtering
- Chain scopes together for expressive database queries
- Combine query scopes with custom collections for powerful data…
Does refactoring bool to enum actually make code less readable?
I'm stuck on a refactoring decision that seems to go against all the "clean code" advice, and I need a sanity check.
I have methods like this:
private function foo(bool $promoted = true): self {
    // ...
}
Everyone, including me, says "use enums instead of booleans!" So I refactored to:
enum Promoted: int {
    case YES = 1;
    case NO = 0;
}
private function foo(Promoted $promoted = Promoted::NO): self {
    // ...
}
But look at what happened:
- The word "promoted" now appears three times in the signature
- Promoted::YES and Promoted::NO are just... booleans with extra steps?
- The type, parameter name, and enum cases all say the same thing
- It went from foo(true) to foo(Promoted::NO) - is that really clearer?
The irony is that the enum was supposed to improve readability, but now I'm reading "promoted promoted promoted" and my eyes are glazing over. The cases YES/NO feel like we've just reinvented true/false with more typing.
My question: Is this just a sign that a boolean should stay a boolean? Are there cases where the two-state nature of something means an enum is actually fighting against the language instead of improving it?
Or am I missing a better way to structure this that doesn't feel like stuttering?
How would you all handle this?
https://redd.it/1qjxp9t
@r_php
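Two common middle-ground answers to the question above, sketched in plain PHP (the function and enum names here are hypothetical, not from the post):

```php
<?php
// Hypothetical names throughout.

// 1) Keep the bool, but make call sites self-documenting with PHP 8
//    named arguments:
function createParameter(bool $promoted = false): string
{
    return $promoted ? 'promoted' : 'regular';
}
echo createParameter(promoted: true); // prints promoted

// 2) If the two states ever grow behavior of their own, name the states
//    themselves rather than the answer to a yes/no question:
enum Visibility
{
    case Promoted;
    case Regular;

    public function label(): string
    {
        return match ($this) {
            self::Promoted => 'promoted',
            self::Regular  => 'regular',
        };
    }
}
echo Visibility::Promoted->label(); // prints promoted
```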
Livewire 4 and Flux are the best way to build web projects. They offer an excellent development experience, and the page rendering speed is incredibly fast.
Hello everyone, I was the one complaining that Flux was too slow. I don't know what Caleb did, but Livewire 4 and Flux are great. This was my old post:
https://www.reddit.com/r/laravel/comments/1lg6ljv/is_flux_too_slow_or_am_i_missing_something/
I wrote that Flux dropdown rendering speed was 1.7 seconds. Now I've built a whole page with Flux, used many components, and the page speed is 94 ms. This is unbelievable. I swear anyone could build a SaaS project using Livewire and Flux in one day without using AI tools.
https://redd.it/1qk51s8
@r_php
Laravel API or Not
I am realizing you can still go the decoupled way (API) and avoid frontend frameworks.
I thought once you go decoupled you also have to pick a frontend framework.
But it seems that’s not the case at all: you can go the API way and still keep Blade.
https://redd.it/1qk8ffo
@r_php
Package to make `temporaryUploadUrl()` work locally
https://github.com/mnapoli/laravel-local-temporary-upload-url
https://redd.it/1qk76ce
@r_php
Forcibly run Garbage Collector after closing connection?
1. With PHP I've got a lot of expired session files just sitting on my servers that should have been deleted when the session expired.
2. PHP assigns deleting the session files when it runs GC (garbage collection).
3. PHP doesn't run GC very often or, apparently, ever.
4. Running GC uses resources.
Okay, with the basics out of the way let's do this the smart way. I don't want to hurt the server's response time to client requests. So let's force GC to execute after the connection to the client has closed:
<?php
// This should go after your closing </html> tag:
$c = ob_get_contents();
header('Content-Length: ' . ob_get_length());
header('Connection: close');
ob_end_clean();
ob_start('ob_gzhandler');
echo $c;
// Close the session or it will create conflicts:
session_write_close();
ob_end_flush();
ob_flush();
flush();
// Connection closed; continue processing:
// *** How do we force GC to run here? ***
?>
You can put a sleep(5); near the bottom and do something like write some dummy text to a flat file to verify that PHP is still executing. This also makes sure that the request uses GZIP compression.
Now, I've already tried setting the INI settings:
<?php
ini_set('session.gc_probability', 1);
ini_set('session.gc_divisor', 1);
?>
However, PHP complains since headers were already sent. Well, the point is to do this after we close the connection to the client! The probability issue is that when GC does run (0.1% of the time, apparently) it's resource intensive, and it's triggered when a session is started - which is never going to be after closing the connection to the client! How wonderfully backwards. 🙄
So how do we manually trigger garbage collection at the bottom of the primary script? I'm using PHP 8.2 and higher.
Also, yes, I am aware we can manually destroy session files, though the goal is to get PHP to do its job! Thanks in advance for competent replies.
https://redd.it/1qkicao
@r_php
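For reference, the standard library does expose this directly: since PHP 7.1, session_gc() runs session garbage collection on demand, independent of the gc_probability/gc_divisor lottery. A minimal standalone sketch (e.g. for a cron script, which sidesteps the headers-already-sent problem entirely; the save-path override is just to make it self-contained):

```php
<?php
// session_gc() needs an active session; it honors session.gc_maxlifetime
// and returns the number of purged sessions, or false on failure.
ini_set('session.save_path', sys_get_temp_dir());
session_start();
$deleted = session_gc();
session_write_close();
printf("Purged %s expired session(s)\n", $deleted === false ? 'no' : $deleted);
```

Run from cron, this keeps request handling untouched; gc_collect_cycles() is unrelated here, as it collects PHP's cyclic references, not session files.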
Hi PHP developers, I need your advice
Hi everyone,
I’m working on a small project using vibe-coding, and part of it is already built in PHP. I’m now at a crossroads and could use some advice.
1. Should I continue using PHP and integrate new features into the existing code, or would it be better to start fresh and rebuild the entire project using MERN? If MERN is the better choice, why?
2. What is the best database to use with PHP for a small to medium project?
3. What kind of complications or limitations should I expect if I stick with PHP?
The project will be used in real life, so please answer accordingly.
Any insights or real-world experiences would be appreciated. Thanks!
https://redd.it/1qkjc8t
@r_php
I built a package to stop hardcoding Stripe price IDs everywhere
I've been working with Stripe quite a bit recently and was wondering if there could be something better than having Stripe price and product IDs hard-coded in env files, trying to match resources between sandboxes and production, and having our checkout crash if we forgot to add some values somewhere.
I've built this little package and it's been quite useful on some freelancing gigs, and I wanted to release it here to gather some feedback, in order to make it as useful as possible.
The idea is to define your products and prices in a config file, deploy them to Stripe with a single command, and then access them with type-safe keys, like so:
// config/billing.php
'products' => [
'pro' => [
'name' => 'Pro Plan',
'prices' => [
'monthly' => [
'amount' => 2999,
'currency' => 'usd',
'recurring' => ['interval' => 'month'],
],
],
],
],
This creates/updates everything in Stripe, caches it locally in your DB, and auto-generates enums for your IDE, and provides you with a facade to access all your resources easily.
// Before
`$user->checkout('price_1ABC123xyz789');`
// After
`$user->checkout(BillingRepository::priceId(ProductKey::Pro, PriceKey::Monthly));`
What it gives you:
* Version control for billing - Your pricing structure lives in git, not just in the Stripe dashboard
* No API calls at runtime - IDs are cached in your DB
* Auto-generated enums - IDE autocomplete, catch typos at dev time
* Two-way sync - Deploy config → Stripe, or import existing Stripe setup → config
* Handles Stripe's immutable fields - When you change a price amount, it detects this and lets you archive the old price or create a duplicate
**Importing existing setup:**
If you already have products in Stripe, you can pull them into a config file:
`php artisan billing:import --generate-config`
It's essentially infrastructure-as-code for your billing setup.
I've released it as 0.9.0, as I haven't used it yet in a large production codebase, so use at your own risk.
---
GitHub: [https://github.com/valentin-morice/laravel-billing-repository](https://github.com/valentin-morice/laravel-billing-repository)
Packagist: [https://packagist.org/packages/valentin-morice/laravel-billing-repository](https://packagist.org/packages/valentin-morice/laravel-billing-repository)
---
https://redd.it/1qklho3
@r_php
Life Timeline: Real-time multiplayer app built with Swoole + Mezzio
Demo: [https://timeline.zweiundeins.gmbh](https://timeline.zweiundeins.gmbh)
Github: [https://github.com/mbolli/php-timeline](https://github.com/mbolli/php-timeline)
I just put my Life Timeline app in production. It's a horizontal timeline app (think Google Sheets timeline view meets Adobe Premiere's track layout) with real-time multiplayer.
I was interested in Swoole's performance but found most examples are either single-file scripts or custom frameworks. I wanted to see if you could build a "proper" PHP application (PSR-15 middleware, dependency injection, structured architecture) while still benefiting from Swoole's persistent workers. Spoiler: you can, and Mezzio makes it pretty seamless.
**The real-time architecture:** The multiplayer sync uses a pattern I really like:
* **CQRS (Command Query Responsibility Segregation):** Write operations go through Command Handlers, reads through Query Handlers. Each command handler does its thing (update database) and then emits an event.
* **Event Bus:** When a command completes, it fires a `TimelineChangedEvent` to a Swoole-based event bus. This is just a simple pub/sub: The bus holds subscriber callbacks in memory (works because Swoole workers are persistent).
* **SSE (Server-Sent Events):** When clients connect to `/updates`, they subscribe to the event bus. The connection stays open (Swoole coroutines handle this efficiently). When any client makes a change, the event fires, all subscribers get notified, and we push a re-rendered HTML fragment to each client using [Datastar](https://data-star.dev/)'s `PatchElements` format.
The nice thing is there's no WebSocket complexity, no separate pub/sub server (Redis, etc.) — it's all in-process because Swoole workers persist. Obviously this only works for single-server deployments, but for many apps that's fine (or just replace the event bus with NATS).
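The in-process bus described above can be sketched in a few lines of plain PHP. This is a hedged illustration of the pattern, not the project's actual code; class and event names here are hypothetical:

```php
<?php
// Minimal in-process pub/sub event bus. With persistent workers (e.g.
// Swoole), subscribers stay in memory between requests, so no external
// broker is needed for single-server deployments.
final class EventBus
{
    /** @var array<string, list<callable>> */
    private array $subscribers = [];

    public function subscribe(string $event, callable $handler): void
    {
        $this->subscribers[$event][] = $handler;
    }

    public function publish(string $event, mixed $payload = null): void
    {
        foreach ($this->subscribers[$event] ?? [] as $handler) {
            $handler($payload);
        }
    }
}

// A command handler would publish after its DB write; each open SSE
// connection would hold a subscriber that pushes a fragment to its client.
$bus = new EventBus();
$bus->subscribe('timeline.changed', fn ($payload) => print("push: {$payload}\n"));
$bus->publish('timeline.changed', 'entry updated'); // prints "push: entry updated"
```

Swapping this for NATS or Redis pub/sub later only means reimplementing subscribe/publish against the broker's client.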
Feedback welcome. Have you already used this pattern?
https://redd.it/1qklpup
@r_php