Forum Replies Created

Viewing 5 replies - 31 through 35 (of 35 total)
  • Plugin Contributor kaspar

    (@kaspar)

    Here’s how to fix the problem with the first line in the CJ datafeeds:

    In two functions:
    function cj_process_datafeed($cj_preview,$cj_time_interval,$cj_time_factor )
    AND
    function cj_datafeed()

    Change this code:
    if ( $cj_previewdone == "ok" ) {
        break;
    }
    if(empty($v)) {
        unset($cj_data[$k]);
    }
    else {
        // Read file row

    to this:
    if ( $cj_previewdone == "ok" ) {
        break;
    }
    if(empty($v)) {
        unset($cj_data[$k]);
    }
    else if(!preg_match("/[0-9]/", $v)) { continue; }
    else {
        // Read file row

    This line:
    else if(!preg_match("/[0-9]/", $v)) { continue; }

    makes the code skip the first line of the file, because that line contains no digits. Every other line in the feed has at least one digit, since every product row contains a price.

    Clearly, the first line, which looks like:

    PROGRAMNAME|PROGRAMURL|CATALOGNAME|LASTUPDATED|NAME|KEYWORDS|DESCRIPTION|SKU|MANUFACTURER|MANUFACTURERID|UPC|ISBN|CURRENCY|SALEPRICE|PRICE|RETAILPRICE|FROMPRICE|BUYURL|IMPRESSIONURL|IMAGEURL|ADVERTISERCATEGORY|THIRDPARTYID|THIRDPARTYCATEGORY|AUTHOR|ARTIST|TITLE|PUBLISHER|LABEL|FORMAT|SPECIAL|GIFT|PROMOTIONALTEXT|STARTDATE|ENDDATE|OFFLINE|ONLINE|INSTOCK|CONDITION|WARRANTY|STANDARDSHIPPINGCOST

    has no digits.
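
    For illustration, here’s a minimal standalone sketch of that digit check against a couple of sample lines (the $sample_lines array and the loop are made up here, just to show which rows get skipped):

    <?php
    $sample_lines = array(
        'PROGRAMNAME|PROGRAMURL|CATALOGNAME|KEYWORDS|PRICE|BUYURL',   // header row: no digits
        'Acme Widget|Widget Co|widgets, tools|Sturdy widget|19.99',   // product row: the price contains digits
    );
    foreach ($sample_lines as $k => $v) {
        if (empty($v)) {
            unset($sample_lines[$k]);                  // drop blank rows
        }
        else if (!preg_match("/[0-9]/", $v)) {
            continue;                                  // no digits: header row, skip it
        }
        else {
            echo "Processing row $k\n";                // only the product row reaches here
        }
    }
    ?>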

    Enjoy!

    Plugin Contributor kaspar

    (@kaspar)

    Thanks! I’m thinking the answer may be to break large files into smaller chunks and process them separately. Perhaps the delay mechanism can be used to control it. The problem occurs while reading and processing the feed file, so maybe I can run it as several separate passes with a delay between chunks.
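
    As a rough sketch of that chunking idea (the function and parameter names below are hypothetical, not from the plugin), each pass could handle one slice of the file and hand back an offset for the next pass:

    <?php
    // Hypothetical sketch: process one chunk of feed rows per call, so that
    // separate passes (with a delay in between) can each handle a slice.
    function cj_process_chunk($file, $offset, $chunk_size = 200) {
        $handle = fopen($file, 'r');
        if (!$handle) {
            return false;
        }
        $row_num   = 0;
        $processed = 0;
        while (($row = fgets($handle)) !== false) {
            $row_num++;
            if ($row_num <= $offset) {
                continue;                    // already handled by an earlier pass
            }
            if (!preg_match("/[0-9]/", $row)) {
                continue;                    // header row (no digits), skip it
            }
            // ... insert/update this item as the plugin normally would ...
            $processed++;
            if ($processed >= $chunk_size) {
                break;                       // stop here; the next pass resumes at $row_num
            }
        }
        fclose($handle);
        return $row_num;                     // offset for the next pass
    }
    ?>

    The delay setting could then be applied between calls, so that no single pass runs long enough to hit the server’s limits.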

    I know about the controls on phpMyAdmin, but thanks for mentioning
    that. I’ve been writing PHP and MySQL code for several years. It’s
    true that shared hosts do cause problems like this.

    Plugin Contributor kaspar

    (@kaspar)

    I see. Those server issues can usually be resolved only by editing the php.ini file. This is not allowed unless you’re on a dedicated server, so many people will have this problem. I’ll check into possible “directives” that can be invoked from within a script to relieve the problems.

    php.ini defines, for the whole hosting server, the maximum amount of memory a script can consume while it’s running (the memory_limit directive). The default value is 8MB, which is very often too small.
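
    For example, PHP does let a script try to raise the limit at runtime with ini_set(); whether the call actually takes effect depends on how the host has things locked down:

    <?php
    // Try to raise the per-script limits from inside the script. Many shared
    // hosts ignore or disallow these calls, so check the result before relying on it.
    @ini_set('memory_limit', '64M');
    @set_time_limit(300);
    echo 'memory_limit is now ' . ini_get('memory_limit');
    ?>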

    If I can find any answers that will work on shared hosts, I will pass them on to you.

    Plugin Contributor kaspar

    (@kaspar)

    I’m wondering why there is a processing limit of around 200 items per feed. I’ve worked on code that processed hundreds of thousands of items from CJ feeds, but that wasn’t in the WordPress environment. I’ve tried PHP’s set_time_limit(0) function, but it made no noticeable difference. It does seem to be a timeout issue, though: if you upload a large feed, the script stops after a certain number of items and never displays the upload count.

    Since many CJ feeds have hundreds or thousands of items, it would be good to solve the 200-item limit somehow. I’d be willing to help if I knew where the problem is.
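
    One way to narrow it down (this is only a diagnostic sketch, and 'feed.txt' is a placeholder path) would be to log the item count and elapsed time while the feed is read, and compare the point where it stops against the server’s max_execution_time:

    <?php
    // Hypothetical diagnostic: log progress every 50 rows so the cut-off point
    // can be compared with the server's execution limits.
    $handle = fopen('feed.txt', 'r');
    $start  = microtime(true);
    $count  = 0;
    while ($handle && ($row = fgets($handle)) !== false) {
        // ... process the row as the plugin normally would ...
        $count++;
        if ($count % 50 === 0) {
            error_log(sprintf('CJ feed: %d items in %.1f seconds', $count, microtime(true) - $start));
        }
    }
    if ($handle) {
        fclose($handle);
    }
    ?>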

    Hi Matt,

    I was wondering if you’d gotten my email about setting up to
    ping https://www.freshpodcasts.com via pingomatic? We really do
    want to work with you on that. We (Rodney R. and I) love
    WordPress and Pingomatic. Thanks for bringing such great
    stuff to the web community.

    Steve
