Introduction

If you’re trying to parse large JSON files in Laravel, especially files over 5GB, you’ll quickly run into issues with memory limits and performance bottlenecks. A common mistake is using json_decode() on the entire file — which might work for small datasets, but will crash your server when the file grows large.

In this guide, you’ll learn how to parse large JSON files in Laravel without hitting memory limits, using memory-efficient streaming techniques. These solutions are ideal for real-world projects where scalability and performance matter.


Why Standard JSON Parsing Fails

Here’s a common way developers try to load JSON files:

$data = json_decode(file_get_contents(storage_path('app/large_file.json')), true);

This may seem fine, but it can result in:

  • “Allowed memory size exhausted” errors
  • Script timeouts
  • Server crashes

Why? Because file_get_contents() reads the entire file into memory, and json_decode() then builds the full decoded structure on top of it. That doesn’t work for large datasets.

If you need to read JSON in Laravel without hitting memory limits, you’ll need to stream the data instead of loading it all at once.

Efficient Strategies to Parse Large JSON Files in Laravel

Strategy 1: Line-Delimited JSON (NDJSON or JSONL)

If your JSON file contains one object per line, you’re in luck. You can stream and process each line without using excessive memory:

$handle = fopen(storage_path('app/large_file.json'), 'r');

while (($line = fgets($handle)) !== false) {
    $data = json_decode($line, true);
    // Process each $data object
}

fclose($handle);

Why this works:

  • Easily fits into Laravel jobs, commands, or controllers (see the command sketch after this list)
  • Efficient memory usage
  • Ideal for files of 5GB+
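
As mentioned above, this loop drops straight into an Artisan command. Here’s a minimal sketch, assuming a hypothetical json:import command name and the same storage/app/large_file.json path used earlier:

use Illuminate\Console\Command;

class ImportLargeJson extends Command {
    // Hypothetical command name, for illustration only
    protected $signature = 'json:import';
    protected $description = 'Stream a large NDJSON file line by line';

    public function handle(): int {
        $handle = fopen(storage_path('app/large_file.json'), 'r');
        $count = 0;

        while (($line = fgets($handle)) !== false) {
            $data = json_decode($line, true);

            if ($data === null) {
                continue; // skip blank or malformed lines
            }

            // Process each $data object here
            $count++;
        }

        fclose($handle);
        $this->info("Processed {$count} records.");

        return self::SUCCESS;
    }
}

Drop the class into app/Console/Commands and run it with php artisan json:import. Because only one line is held in memory at a time, the command behaves the same for a 50MB file and a 50GB file.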

Strategy 2: Streaming a Large JSON Array

If the file is one huge JSON array, use a streaming parser.

Step 1: Install the parser

composer require salsify/jsonstreamingparser

Step 2: Create a Listener

use JsonStreamingParser\Listener\IdleListener;

class MyJsonListener extends IdleListener {
    private $key;
    private $value;

    public function key($key) {
        $this->key = $key;
    }

    public function value($value) {
        echo "Key: {$this->key}, Value: {$value}\n";
    }

    public function endObject() {
        // Save to DB or process data
    }
}

Step 3: Use the Parser

use JsonStreamingParser\Parser;

$stream = fopen(storage_path('app/large_file.json'), 'r');
$parser = new Parser($stream, new MyJsonListener());
$parser->parse();
fclose($stream);

This approach lets you parse huge JSON arrays in Laravel efficiently, because the parser walks the file token by token and never loads the whole document into memory.

Laravel: Read JSON Without Memory Limit

If you’re wondering how to read JSON in Laravel without memory limit issues, the answer is always streaming.

Whether you’re processing NDJSON files or large arrays, never use json_decode() on the whole file. Use fgets() or a JSON streaming parser to handle one item at a time — that’s how you avoid memory-related crashes in Laravel.
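
As an example of “one item at a time”, a common pattern is to buffer a few hundred decoded lines and flush them to the database in batches, so memory stays flat and you avoid one query per row. A rough sketch, assuming a hypothetical records table whose columns match the JSON keys:

use Illuminate\Support\Facades\DB;

$handle = fopen(storage_path('app/large_file.json'), 'r');
$batch = [];

while (($line = fgets($handle)) !== false) {
    $row = json_decode($line, true);

    if ($row === null) {
        continue; // skip blank or malformed lines
    }

    $batch[] = $row;

    // Flush every 500 rows so the buffer never grows unbounded
    if (count($batch) >= 500) {
        DB::table('records')->insert($batch);
        $batch = [];
    }
}

// Insert whatever is left in the final partial batch
if ($batch !== []) {
    DB::table('records')->insert($batch);
}

fclose($handle);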

Common Mistakes to Avoid

  • Don’t use json_decode() on large files
  • Don’t raise memory_limit or max_execution_time as a first step
  • Avoid parsing entire arrays in one go

Bonus Tips

  • For .gz files, use gzopen() and gzgets() to stream lines (see the sketch after this list)
  • Break large files into smaller JSON chunks if you can
  • Use Laravel queues to offload long-running JSON parsing tasks (a job sketch follows the .gz example below)
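
Here’s what the .gz tip looks like in practice: gzopen() and gzgets() behave like fopen() and fgets() but decompress on the fly. A minimal sketch, assuming the compressed file sits at storage/app/large_file.json.gz:

$handle = gzopen(storage_path('app/large_file.json.gz'), 'r');

while (($line = gzgets($handle)) !== false) {
    $data = json_decode($line, true);
    // Process each decompressed $data object
}

gzclose($handle);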
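
And for the queue tip, a bare-bones queued job lets the parse run on a worker instead of blocking an HTTP request. The class name and timeout below are illustrative:

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;

class ParseLargeJsonFile implements ShouldQueue {
    use Dispatchable, InteractsWithQueue, Queueable;

    // A long-running parse needs a generous timeout (value is illustrative)
    public $timeout = 3600;

    public function __construct(private string $path) {}

    public function handle(): void {
        $handle = fopen($this->path, 'r');

        while (($line = fgets($handle)) !== false) {
            $data = json_decode($line, true);
            // Process each $data object
        }

        fclose($handle);
    }
}

// Dispatch it from a controller or command:
ParseLargeJsonFile::dispatch(storage_path('app/large_file.json'));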

Final Thoughts

You don’t need to crash your server to parse large JSON files in Laravel. Whether your file is 5GB or more, using streaming JSON parsing (NDJSON or arrays) allows for safe and scalable processing.

If you’re looking to read large JSON files without memory limits in Laravel, or simply need to process 5GB+ JSON files in PHP, streaming is your best friend.
