• Hi,

    Have a $wpdb->get_results call like this:

    $currentProductsArray = array();
    $currentProducts = $wpdb->get_results("SELECT id, feedid, size, price FROM products WHERE shopid = $shopid");

    Everything is fine if a shop just has a few thousand products, but some shops have 100,000+ products, so the array gets huge and causes the script to crash.

    Anyone got an idea for saving the results of $wpdb->get_results in a smaller data structure than the standard PHP array?

    Thanks,
    Mads
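
    [Editor's note: a minimal sketch of one way to shrink the memory footprint here, assuming the same products table and a numeric $shopid: fetch the rows in fixed-size batches rather than as one giant result set, so only one batch lives in memory at a time. The batch size is just a number to tune, not anything from the thread.]

    $batchSize = 5000;
    $offset    = 0;

    do {
        // ARRAY_A rows are a bit lighter than the default stdClass objects
        $rows = $wpdb->get_results(
            $wpdb->prepare(
                "SELECT id, feedid, size, price FROM products
                 WHERE shopid = %d ORDER BY id LIMIT %d OFFSET %d",
                $shopid, $batchSize, $offset
            ),
            ARRAY_A
        );

        foreach ($rows as $row) {
            // ... process one product here ...
        }

        $offset += $batchSize;
    } while (count($rows) === $batchSize);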

  • Why do you need to retrieve all of the product data at the same time?

    I ask that because there are normally ways to work around things like this that will work just as well, and will not put quite as much strain on your server's memory resources.

    Thread Starter Mads Phikamphon

    (@madsphi)

    Agree. This is often the case.

    Here is what is going on:

    I need to compare the contents of an XML products file with the existing products in the database:

    – check if each product in the XML file exists in the database y/n.
    – if it exists, mix data from the existing product with data from the updated product in the XML file.
    – update the database with the updated product.

    Both the XML file and the existing products database contain 100,000+ products.

    Checking the database for each product in the XML file could be an option, but running that many queries would be terribly slow. That's why I'm trying to get all the existing products from the database into an array, so I can check against that instead of the database.

    Thanks again.

    // Mads
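
    [Editor's note: one way to do that comparison without holding all 100,000 rows at once, sketched under the assumption that $xmlProducts is an array keyed by the same id the products table uses (if the shared key is really feedid, swap it in): take the XML products in chunks, pull just those rows with one IN query per chunk, and key the result by id for fast lookup. $xmlProducts and update_product() are placeholders, not code from this thread.]

    foreach (array_chunk($xmlProducts, 1000, true) as $chunk) {
        $ids          = array_keys($chunk);
        $placeholders = implode(',', array_fill(0, count($ids), '%d'));

        // OBJECT_K keys the result array by the first selected column (id)
        $existing = $wpdb->get_results(
            $wpdb->prepare(
                "SELECT id, feedid, size, price FROM products WHERE id IN ($placeholders)",
                $ids
            ),
            OBJECT_K
        );

        foreach ($chunk as $id => $xmlProduct) {
            if (isset($existing[$id])) {
                // merge the stored row with the XML data, then write it back
                update_product($id, array_merge((array) $existing[$id], $xmlProduct));
            }
        }
    }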

    Processing 100,000 records is going to take time no matter what you do, so it would be better to look at different options. I'd also think that any XML file with 100,000 entries is going to be huge as well, so you'll probably run into the same memory issues just parsing that.

    If it was me, I would do it individually, one product at a time. You can always extend the allowed run time, and I don't think the extra queries are going to be any real issue, seeing as you're already doing just as many, if not many times more, for the updates.
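
    [Editor's note: the one-query-per-product version of that, as a sketch: with an index on (shopid, feedid) each lookup is cheap, and only one row is ever in memory. Treating feedid as the key the XML shares with the table is an assumption.]

    $existing = $wpdb->get_row(
        $wpdb->prepare(
            "SELECT id, feedid, size, price FROM products
             WHERE shopid = %d AND feedid = %s",
            $shopid, $feedid  // $feedid: assumed shared key from the XML
        ),
        ARRAY_A
    );

    if ($existing !== null) {
        // product already exists: merge the XML data in and update the row
    } else {
        // new product: insert it
    }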

    If you want to try it, you can raise your memory limit using a php.ini file or your .htaccess file (on most hosts at least). That will let you run the script most of the time, but you'll need to do some tweaking to find out just how much RAM you need to allocate.
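
    [Editor's note: for reference, the usual places to set that limit; the 512M figure is only an example value to tune, and some hosts ignore one or more of these.]

    php.ini:    memory_limit = 512M
    .htaccess:  php_value memory_limit 512M

    Or from inside the script itself:

    ini_set('memory_limit', '512M');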

    Thread Starter Mads Phikamphon

    (@madsphi)

    The XML files are up to around 200 MB.

    Problem is that there are 50 shops with an XML file each, i.e. a total of 50 × 100,000+ products, or more than 5 million products.

    Everything needs to be updated daily…

    Tried increasing the memory limit to 128 MB, but that didn’t help.

    Of course it didn’t work.

    Parsing an XML file turns it into a big mix of PHP arrays and objects, and a 200 MB XML file will normally need quite a bit more memory than that just to parse, so a 128 MB limit was never going to be enough.
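
    [Editor's note: a streaming parse avoids that problem: with XMLReader only one product element is inflated at a time, no matter how big the file is. The file name and the product element name are made up for the sketch.]

    $reader = new XMLReader();
    $reader->open('products.xml');

    while ($reader->read()) {
        if ($reader->nodeType === XMLReader::ELEMENT && $reader->localName === 'product') {
            // inflate just this one element for easy field access
            $product = simplexml_load_string($reader->readOuterXml());
            // ... compare/update this single product, then let it go out of scope ...
        }
    }

    $reader->close();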

    If it was me, I would not do this as a web-accessible script. I'd run it on the server's command line using a CRON job. That way you're not limited by the memory and timeout limits that are set for Apache (or whatever your server software is).
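
    [Editor's note: a sketch of a CLI-only entry point for that, with the cron line shown as a comment. The script path and the wp-load.php location are assumptions about this particular setup.]

    // crontab, e.g. run nightly at 03:00:
    //   0 3 * * * php /path/to/update-products.php

    if (php_sapi_name() !== 'cli') {
        exit('Run this from the command line.');
    }

    set_time_limit(0);             // CLI has no time limit by default, but be explicit
    ini_set('memory_limit', '-1'); // or a fixed cap you trust

    // load WordPress so $wpdb is available outside a normal page request;
    // the path is an assumption about where wp-load.php lives
    require __DIR__ . '/wp-load.php';

    // ... run the import here ...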

  • The topic ‘Handling a huge amount of $wpdb->get_results data’ is closed to new replies.