Import CSV rows one at a time (#3450)
Credit @mcharytoniuk. While importing over 250k records, `ImportModel` kept running into problems. One of them was excessive memory usage: `ImportModel` loaded the complete file upfront (`$reader->fetchAll()`). A simple one-line change to `$reader->fetch()` makes `ImportModel` read the CSV file row by row, since `fetch()` returns an iterator instead of an array. This keeps memory usage bounded and makes importing larger files much simpler.
This commit is contained in:
parent
7a6f1d3c85
commit
318e9d7e76
@@ -142,7 +142,7 @@ abstract class ImportModel extends Model
         }

         $result = [];
-        $contents = $reader->fetchAll();
+        $contents = $reader->fetch();
         foreach ($contents as $row) {
             $result[] = $this->processImportRow($row, $matches);
         }
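For context, a minimal sketch of the difference between the two calls, assuming a league/csv 8.x-style `Reader`. The `createFromPath()` helper, the file path, and the standalone `processImportRow()` function are illustrative placeholders, not taken from this codebase:

```php
<?php

use League\Csv\Reader;

// Placeholder for whatever per-row work the import does.
function processImportRow(array $row): void
{
    // e.g. validate and persist $row
}

// Sketch only: assumes a league/csv 8.x-style Reader and an example file path.
$reader = Reader::createFromPath('/path/to/import.csv', 'r');

// fetchAll() materialises every row into one array before the loop starts,
// so peak memory grows with the size of the CSV file:
// $rows = $reader->fetchAll();

// fetch() returns an Iterator that yields one row at a time, so only the
// current row is held in memory while the loop runs:
$rows = $reader->fetch();

foreach ($rows as $row) {
    // Handle each row as it is read instead of after the whole file is loaded.
    processImportRow($row);
}
```

The trade-off is that an iterator can only be traversed once and cannot be counted or indexed up front, which is acceptable here because the import loop only needs a single forward pass over the rows.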