• Resolved hectorsuper

    (@hectorsuper)


    Hello,

    I have a WordPress site with over 50k posts. If I run this plugin, will it log gazillions of GB on each run? Will it show a gazillion URLs in the admin log?

    Many thanks!

  • Plugin Author ramon fincken

    (@ramon-fincken)

    It is built for “bigger” sites, meaning there is a limit on the number of pages it fetches per run.

    However, it will log its actions by default.

    Therefore I will update the software right now to exclude logging for you. Keep an eye on the new version.

    Plugin Author ramon fincken

    (@ramon-fincken)

    Update: v2.2 is out. See the FAQ.

    Thread Starter hectorsuper

    (@hectorsuper)

    Thanks a lot @ramon-fincken!

    Last question: does it attempt to do it all in a single PHP operation? In other words, for a large site, the max execution time in my php.ini (say, 180 seconds) will not be enough. What happens then? Will the next run start again from the beginning, or will it work incrementally?

    Cheers

    Plugin Author ramon fincken

    (@ramon-fincken)

    By default it will take 20 pages and process (crawl) each of them in a single PHP call.

    It will, however, scan the XML until it reaches the LAST crawled URL.

    In the case that PHP times out (after 180 seconds in your example), it will indeed loop. I might alter that later on.
    If that is your case -> decrease the number of pages per run using “How to override the 20 pages crawl limit” in the FAQ section (see the sketch below).
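
    As an illustration only (this is not the plugin’s actual code), a minimal sketch of an incremental crawl that remembers the last crawled URL and respects a per-run limit could look like the following; the function name, the option name and the “example_crawl_limit” filter are all hypothetical:

    <?php
    // Illustrative sketch only: incremental sitemap crawl with a per-run limit
    // and a "last crawled URL" marker. All names below are hypothetical.
    function example_incremental_crawl() {
        $limit = apply_filters( 'example_crawl_limit', 20 );   // pages per run
        $last  = get_option( 'example_last_crawled_url', '' ); // where the previous run stopped

        // Read the sitemap and collect its URLs (simplified, no error handling).
        $xml  = simplexml_load_file( home_url( '/sitemap.xml' ) );
        $urls = array();
        foreach ( $xml->url as $entry ) {
            $urls[] = (string) $entry->loc;
        }

        // Skip everything up to and including the last crawled URL.
        $start = 0;
        if ( '' !== $last ) {
            $pos = array_search( $last, $urls, true );
            if ( false !== $pos ) {
                $start = $pos + 1;
            }
        }

        // Crawl at most $limit pages this run, remembering where we stopped.
        foreach ( array_slice( $urls, $start, $limit ) as $url ) {
            wp_remote_get( $url, array( 'timeout' => 10 ) );   // fetch / warm the page
            update_option( 'example_last_crawled_url', $url );
        }
    }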

    That being said -> I try to set the time limit to “unlimited” with this line of code:

    @set_time_limit(0);
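
    For completeness, a defensive variant of that call (not necessarily what the plugin ships) would first check that the host allows it; if set_time_limit() is disabled, the @ only hides the warning and the php.ini max_execution_time stays in effect:

    // Sketch only: guard the call, since some hosts disable set_time_limit().
    if ( function_exists( 'set_time_limit' )
        && false === strpos( (string) ini_get( 'disable_functions' ), 'set_time_limit' ) ) {
        @set_time_limit( 0 ); // 0 = no execution time limit for this request
    }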

  • The topic ‘Will work for large WP?’ is closed to new replies.