Import data

There is a mechanism for importing data into the database.
A CSV file with the product id and the new price is uploaded to the server.
A for loop walks through all the lines of the CSV file.
If the id is not empty, the price is updated by that known id with an UPDATE query.

The problem is that this only works for small files, up to about 100 rows. Larger files never run to completion, yet nothing shows up in the error_log, such as exceeding a resource limit, etc.

Do I understand correctly that the script runs out of its 30 seconds and is killed automatically? Or is there some kind of limit on the number of queries to the database?

What is the best way to redesign the import system?

Code

$p = file($_SERVER["DOCUMENT_ROOT"]."/i.csv", FILE_IGNORE_NEW_LINES); // strip trailing newlines so the price is clean
for($i=0; $i<sizeof($p); $i++)
{
    $e = explode(';', $p[$i]);   // each line is "id;price"
    $id = $e[0];
    $price = $e[1];
    if ($id != '') {
        // one UPDATE per CSV row: the query count grows with the file
        mysql_query("UPDATE products SET price='$price' WHERE id = '$id'");
    }
}

The main question is how to avoid firing a separate UPDATE query for every row and update in batches instead.
One option is to do it step by step, i.e. read the file in chunks and update, say, 50 rows at a time, but that does not solve the problem of the limit on queries to the database.
Using mysqli_multi_query is not an option.


Answer 1, authority 100%

Don’t read line by line, read into an array. Arrays are processed much faster.

$fp = fopen("i.csv", "r") or die("Cannot open file");
/* read the file record by record */
while ($data = fgetcsv($fp, 4000, "^", "~")) { /* process $data */ }
fclose($fp);

Answer 2, authority 100%

Try the same thing, only with the standard fgetcsv(); I think it will work 😉
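
For reference, a minimal sketch of what that could look like for the semicolon-separated file from the question (the mysql_* calls and the products table are taken from the question's code, not verified here):

$fp = fopen($_SERVER["DOCUMENT_ROOT"]."/i.csv", "r");
if ($fp) {
    // fgetcsv() reads one record per call, so memory use stays flat
    // no matter how big the file is; ';' matches the question's delimiter
    while (($row = fgetcsv($fp, 1000, ";")) !== false) {
        $id    = trim($row[0]);
        $price = isset($row[1]) ? trim($row[1]) : '';
        if ($id !== '' && $price !== '') {
            mysql_query("UPDATE products SET price='$price' WHERE id='$id'");
        }
    }
    fclose($fp);
}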


Answer 3, authority 100%

Alternatively, don’t access the database 1000 times. Build one query like

insert into products (id, price)
values
    ($id1, $price1),
    ($id2, $price2), ...
on duplicate key update price = values(price)

(assuming id is a unique key)

or, if the query gets too big, group at least 20-50 products per query…
Most likely you are losing time on the sheer number of round trips to the database…
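
A sketch of that batching, assuming the mysql_* API from the question and that id is the unique key (note that ids not yet in the table would be inserted as new rows, so this fits best when every id in the file already exists or the remaining columns have defaults):

$lines  = file($_SERVER["DOCUMENT_ROOT"]."/i.csv", FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
$values = array();
foreach ($lines as $line) {
    $e = explode(';', $line);
    if (count($e) < 2 || $e[0] === '') continue;   // skip malformed rows
    $values[] = "('".mysql_real_escape_string($e[0])."','".mysql_real_escape_string($e[1])."')";
    if (count($values) == 50) {                    // flush every 50 rows
        mysql_query("INSERT INTO products (id, price) VALUES ".implode(',', $values)
                   ." ON DUPLICATE KEY UPDATE price = VALUES(price)");
        $values = array();
    }
}
if ($values) {                                     // flush the remainder
    mysql_query("INSERT INTO products (id, price) VALUES ".implode(',', $values)
               ." ON DUPLICATE KEY UPDATE price = VALUES(price)");
}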

Two more solutions:

-1- Use LOAD DATA INFILE and, if necessary, this construct:

CREATE TEMPORARY TABLE TempTable ( ID int, column1 text, column2 text ) ENGINE=MEMORY;
LOAD DATA INFILE 'file_name' INTO TABLE TempTable;
INSERT INTO ResultTable SELECT column1, column2 FROM TempTable;
DROP TABLE TempTable;

with the insert replaced by an update with a join, or with where exists()… e.g.:

update products 
inner join temp_new_prices on products.id=temp_new_prices.id 
set products.price=temp_new_prices.price;
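
Assembled for this particular task, the whole import can be a handful of queries fired from PHP (the column types and the temp table name here are illustrative; depending on the server configuration, LOAD DATA INFILE may require the LOCAL keyword and the FILE privilege):

mysql_query("CREATE TEMPORARY TABLE temp_new_prices (id INT, price DECIMAL(10,2)) ENGINE=MEMORY");
// load the whole CSV server-side in one statement
mysql_query("LOAD DATA INFILE '".$_SERVER["DOCUMENT_ROOT"]."/i.csv'
             INTO TABLE temp_new_prices
             FIELDS TERMINATED BY ';'
             LINES TERMINATED BY '\\n'");
// one set-based UPDATE instead of a query per row
mysql_query("UPDATE products
             INNER JOIN temp_new_prices ON products.id = temp_new_prices.id
             SET products.price = temp_new_prices.price");
mysql_query("DROP TEMPORARY TABLE temp_new_prices");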

-2- Pass one huge string containing n products and prices to a MySQL stored procedure, and then parse it in a loop inside the procedure. I won't give the solution here – it is cumbersome. But if you need it (and cannot find it on Google) – write.


Answer 4, authority 50%

How do you read the data from the file?
If with file_get_contents or readfile – then there won't be enough RAM for large files.

Put set_time_limit(0) at the beginning of the script.

And read the file like this:

$handle = fopen($fileName, 'r');
if ($handle) {
    // compare against false explicitly: a line containing "0"
    // would otherwise end the loop early
    while (($string = fgets($handle)) !== false) {
        /* ...work... */
    }
    fclose($handle);
}
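
Filling in the ...work... part with the task from the question and combining it with set_time_limit(0) from above, a sketch (the escaping helper belongs to the old mysql_* API used in the question):

set_time_limit(0);   // lift the 30-second execution limit
$handle = fopen($_SERVER["DOCUMENT_ROOT"]."/i.csv", 'r');
if ($handle) {
    while (($string = fgets($handle)) !== false) {
        $e = explode(';', trim($string));
        if (count($e) >= 2 && $e[0] !== '') {
            $id    = mysql_real_escape_string($e[0]);
            $price = mysql_real_escape_string($e[1]);
            mysql_query("UPDATE products SET price='$price' WHERE id='$id'");
        }
    }
    fclose($handle);
}

This reads the file with constant memory; to cut the query count as well, the rows can be buffered and flushed in batches as in answer 3.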