• I was attempting to import a largish collection of quotes (a little over 3K quotes, with a JSON size of about 1MB), and the bulk import was failing. I took a look at the code for the import and noticed that it attempts to do everything in one big SQL statement. That’s certainly not a bad idea for performance purposes, but it does make a situation like this a little fraught, since there’s no way of telling what the actual problem is. Could it be related to hitting the MariaDB/MySQL max_allowed_packet limit? Or is it complaining about one particular quote within those 3K, perhaps?

    Anyway, I ended up just rewriting put_quotes (in inc/class-quotes-collection-db.php) on my local side to run a separate SQL statement for each quote in the JSON, like so:

    
        public function put_quotes($quotes_data = array()) {
            if(!$quotes_data) return 0;

            $success_count = 0;
            $insert_sql = "INSERT INTO " . $this->table_name .
                " (`quote`, `author`, `source`, `tags`, `public`, `time_added`)" .
                " VALUES (%s, %s, %s, %s, %s, NOW())";

            foreach($quotes_data as $quote_data) {

                // Process the import data
                if( is_object($quote_data) ) {
                    $quote_data = (array) $quote_data;
                }
                $quote_data = $this->validate_data($quote_data);
                extract($quote_data);

                // Insert into DB, one row at a time
                $insert = $this->db->prepare($insert_sql, $quote, $author, $source, $tags, $public);
                if ($this->db->query($insert)) {
                    $success_count++;
                }
            }

            return $success_count;
        }

    … which worked well enough for me, though I did notice that three out of the 3K+ quotes didn’t get imported, so my initial problem might’ve just been down to some bad data in the bunch. One drawback to this method, as written, is that it doesn’t provide any feedback about which quotes caused problems; that would be handy information to report back to the user.
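    To sketch what that feedback might look like (this is just my own illustration, not the plugin’s code — the function name put_quotes_with_feedback, the $failed array, and the returned shape are all made up; I’m assuming the same wpdb-style prepare()/query() interface the plugin uses):

    ```php
    <?php
    // Hypothetical sketch: insert quotes one at a time, recording which
    // ones failed so the caller can report them back to the user.
    // $db is assumed to expose wpdb-style prepare() and query() methods.
    function put_quotes_with_feedback($db, $table_name, array $quotes_data) {
        $success_count = 0;
        $failed = array();
        $insert_sql = "INSERT INTO {$table_name}"
            . " (`quote`, `author`, `source`, `tags`, `public`, `time_added`)"
            . " VALUES (%s, %s, %s, %s, %s, NOW())";

        foreach ($quotes_data as $index => $quote_data) {
            $quote_data = (array) $quote_data;

            $insert = $db->prepare(
                $insert_sql,
                $quote_data['quote'],
                $quote_data['author'],
                $quote_data['source'],
                $quote_data['tags'],
                $quote_data['public']
            );

            if ($db->query($insert)) {
                $success_count++;
            } else {
                // Remember enough to tell the user which quote was skipped.
                $failed[] = array(
                    'index' => $index,
                    'quote' => $quote_data['quote'],
                );
            }
        }

        return array('imported' => $success_count, 'failed' => $failed);
    }
    ```

    The caller could then show something like “imported 3042 of 3045; rows 17, 902, and 2210 were skipped” instead of a silent partial import.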

    Anyway, something to think about, and maybe tweak at some point?
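    A possible middle ground between one giant statement and one statement per quote would be inserting in fixed-size chunks, so a huge import never approaches the packet limit and a bad row only aborts its own chunk. Again purely my own sketch (the function name, chunk size, and return value are invented; each statement would still need to go through prepare() with that chunk’s values):

    ```php
    <?php
    // Hypothetical sketch: build one multi-row INSERT per chunk of quotes,
    // with wpdb-style %s placeholders, instead of one statement for all 3K+.
    function build_chunked_inserts($table_name, array $quotes_data, $chunk_size = 100) {
        $statements = array();
        foreach (array_chunk($quotes_data, $chunk_size) as $chunk) {
            $rows = array();
            foreach ($chunk as $quote_data) {
                // One placeholder group per quote in this chunk.
                $rows[] = "(%s, %s, %s, %s, %s, NOW())";
            }
            $statements[] = "INSERT INTO {$table_name}"
                . " (`quote`, `author`, `source`, `tags`, `public`, `time_added`)"
                . " VALUES " . implode(', ', $rows);
        }
        return $statements;
    }
    ```

    With 3K quotes and a chunk size of 100, a failure would narrow the problem down to a window of 100 quotes rather than hiding it somewhere in the whole batch.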

    • This topic was modified 3 years, 9 months ago by xolotl.
  • The topic ‘Alternate implementation for bulk import?’ is closed to new replies.