Best practice for concurrent HTTP requests in Laravel when scraping multiple URLs
I need to fetch data from a list of URLs concurrently in Laravel (PHP). The use case is scraping results from multiple endpoints — one request per URL — and I want to do it as fast as possible without firing everything at once.
What I have so far
I'm using Laravel's `Http::pool()` with manual chunking to control concurrency.
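Simplified, it looks roughly like this. The chunk size of 10, the 15-second timeout, and keying each response by its URL via `as()` are just values I picked, nothing I've benchmarked:

```php
use Illuminate\Http\Client\Pool;
use Illuminate\Http\Client\Response;
use Illuminate\Support\Facades\Http;

$results = [];

// Send the URLs in batches so only one chunk's worth of requests is in flight at a time.
foreach (collect($urls)->chunk(10) as $chunk) {
    $responses = Http::pool(fn (Pool $pool) => $chunk
        ->map(fn (string $url) => $pool->as($url)->timeout(15)->get($url))
        ->all()
    );

    foreach ($responses as $url => $response) {
        // Connection failures come back as exception instances instead of responses,
        // so check before reading the body.
        if ($response instanceof Response && $response->successful()) {
            $results[$url] = $response->json();
        }
    }
}
```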
My question is
- Is `Http::pool()` the recommended approach for this in Laravel, or should I drop down to raw Guzzle's `Pool` with a `concurrency` option? (A sketch of the Guzzle version I mean is below.)
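For reference, this is the kind of Guzzle-level alternative I have in mind (again just a sketch; the concurrency of 10 and the body handling are placeholders, and `$urls` is assumed to be a plain zero-indexed array):

```php
use GuzzleHttp\Client;
use GuzzleHttp\Pool;
use GuzzleHttp\Psr7\Request;
use Psr\Http\Message\ResponseInterface;

$client = new Client(['timeout' => 15]);

// Generator that yields one GET request per URL.
$requests = function (array $urls) {
    foreach ($urls as $url) {
        yield new Request('GET', $url);
    }
};

$results = [];

$pool = new Pool($client, $requests($urls), [
    'concurrency' => 10, // keep at most 10 requests in flight at any time
    'fulfilled' => function (ResponseInterface $response, int $index) use (&$results, $urls) {
        $results[$urls[$index]] = (string) $response->getBody();
    },
    'rejected' => function ($reason, int $index) use ($urls) {
        // $reason is typically a RequestException; log or queue $urls[$index] for retry here.
    },
]);

// Start the transfers and block until every request has settled.
$pool->promise()->wait();
```

The appeal of the Guzzle version is that it keeps the pool topped up to the concurrency limit as individual requests finish, whereas my chunked `Http::pool()` approach waits for an entire chunk to complete before starting the next one.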