Viewing 13 replies - 1 through 13 (of 13 total)
  • Plugin Author Michael Simpson

    (@msimpson)

    I don’t understand the question at all.

    Thread Starter nittslkoalsok

    (@nittslkoalsok)

    I’m sorry, that was a poor description.

    I’ll explain briefly.

    We want to import CSV data for 11,000 reviews, but only 300 rows get imported no matter how many times we try. What can we do?

    Plugin Author Michael Simpson

    (@msimpson)

    It is probably running out of memory. Try increasing WP_MEMORY_LIMIT in wp-config.php

    Thread Starter nittslkoalsok

    (@nittslkoalsok)

    Thank you for your reply.

    But it didn’t help.

    I set PHP’s memory_limit to 200MB, and added define('WP_MEMORY_LIMIT', '200M'); to wp-config.php.

    Plugin Author Michael Simpson

    (@msimpson)

    I tried a test on my own site. I was able to import 13410 records (each with 9 fields) before it died. 300 is a very small number.

    I have in my php.ini file located at the top directory of my WordPress installation:

    memory_limit = 128M
    max_execution_time = 50000

    I see that I don’t have WP_MEMORY_LIMIT defined in wp-config.php.

    Thread Starter nittslkoalsok

    (@nittslkoalsok)

    Thank you for your reply.

    To see how many rows would actually import, I reduced the data from 11,000 reviews to 2,256. However, it still stops at 345 rows, the same as with the full 11,000.

    The message shown after the import finishes is this:

    —–
    2256 rows processed into form CFDB
    Back
    —–

    However, if you look in CFDB, there are only 345 rows.

    I can edit php.ini through my server’s control panel.
    I set it like this:
    max_execution_time 50000
    memory_limit 128M

    I added this to wp-config.php:
    define('WP_MEMORY_LIMIT', '128M');
    set_time_limit(50000);

    Thread Starter nittslkoalsok

    (@nittslkoalsok)

    I think I found the cause.

    It seems the import performs a duplicate check. There are 2,256 rows of data, but 1,911 of them share the same submit_time.

    Is there a way to skip the duplicate check without modifying the data to be imported?
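    [Editor’s note] A quick way to confirm this kind of collision before importing is to count how many rows share a submit_time. This is a minimal sketch, assuming a CSV with a submit_time column; the sample data here is hypothetical, standing in for the client’s export:

    ```python
    import csv
    import io
    from collections import Counter

    # Hypothetical stand-in for the client's CSV file.
    sample = io.StringIO(
        "submit_time,review\n"
        "1411387725.2603,Great\n"
        "1411387725.2603,Good\n"
        "1411387726.0001,Okay\n"
    )

    rows = list(csv.DictReader(sample))
    counts = Counter(row["submit_time"] for row in rows)
    # Rows beyond the first occurrence of each submit_time would be
    # skipped by a duplicate check keyed on that field.
    duplicates = sum(n - 1 for n in counts.values() if n > 1)
    print(f"{len(rows)} rows, {duplicates} duplicate submit_time values")
    ```

    For a real file, replace the io.StringIO sample with open("reviews.csv", newline="").
    
    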

    Plugin Author Michael Simpson

    (@msimpson)

    The code should be trying to avoid duplication. Are you supplying a submit time or are you letting it be auto-generated?

    Thread Starter nittslkoalsok

    (@nittslkoalsok)

    Thank you for your reply.

    I don’t have the authority to modify the data to be imported, because it is raw data I received from a client. I would like to consider a workaround.

    Thank you for reading this long thread.

    Plugin Author Michael Simpson

    (@msimpson)

    Please understand that if there is a “submit_time” field in your CSV file, then it must be a unix timestamp number such as 1411387725.2603. Other formats will not work. If there is no “submit_time” field then it will be auto-generated. Does that help?
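    [Editor’s note] To sanity-check whether a CSV value is in the expected format, the example timestamp from the reply above can be converted to a readable date; anything that does not parse as a plain number is not a valid submit_time:

    ```python
    from datetime import datetime, timezone

    # Example value quoted in the reply above.
    ts = 1411387725.2603

    # A valid submit_time is just seconds since the Unix epoch,
    # optionally with a fractional part.
    print(datetime.fromtimestamp(ts, tz=timezone.utc).isoformat())
    # → 2014-09-22T12:08:45.260300+00:00
    ```
    
    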

    Thread Starter nittslkoalsok

    (@nittslkoalsok)

    Thank you for your reply.

    The submit_time field is a UNIX timestamp.
    Removing submit_time from the data to be imported is not an option, because it is a very important field.

    I have one more question, but since the topic is different I will open another thread for it.

    Plugin Author Michael Simpson

    (@msimpson)

    If there is a different submit time field that the customer needs to preserve, simply give that field a different name.

    Thread Starter nittslkoalsok

    (@nittslkoalsok)

    So I can rename the client’s field and create a temporary submit_time field just for the import, filled with random or sequential numbers.

    I finally understand!!
    Thank you!!
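    [Editor’s note] The workaround agreed on above can be sketched as a small preprocessing script: keep the client’s original timestamps under a new column name and fill submit_time with unique sequential values so the duplicate check never sees two identical submit times. The column name original_submit_time and the base timestamp are hypothetical choices, not part of the plugin:

    ```python
    import csv
    import io

    # Hypothetical stand-in for the client's CSV; use open(...) for a real file.
    src = io.StringIO(
        "submit_time,review\n"
        "1411387725.2603,Great\n"
        "1411387725.2603,Good\n"
    )
    out = io.StringIO()

    reader = csv.DictReader(src)
    writer = csv.DictWriter(
        out, fieldnames=["submit_time", "original_submit_time", "review"]
    )
    writer.writeheader()

    base = 1400000000.0  # arbitrary starting point for the temporary timestamps
    for i, row in enumerate(reader):
        writer.writerow({
            "submit_time": f"{base + i:.4f}",            # unique per row
            "original_submit_time": row["submit_time"],  # client data preserved
            "review": row["review"],
        })

    print(out.getvalue())
    ```
    
    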

  • The topic ‘About CSV Import’ is closed to new replies.