• Hello. I was perusing a DreamHost message board and came across a PHP script that might make a wonderful plugin. The message board post (written by someone called Tez) is here: https://discussion.dreamhost.com/showflat.pl?Cat=&Board=3rdparty&Number=48907&page=0&view=collapsed&sb=5&o=365&part=&vc=1
    The author says the script records the number of page hits from an IP, and if it goes over a certain number of hits in 10 seconds, it temporarily bans that IP. This sounds like a fantastic potential spam plugin. Unfortunately, I'm PHP stupid, so I don't know whether something like this would work with WordPress or make a useful plugin. So I thought I'd post it here and see what you guys think about it… if it sounds workable, then maybe someone will make a plugin from it.

    Here’s the php file and Tez’s instructions (there’s more to the post, linked above):


    There are two parts to this: a PHP script, which you can insert into your index.php, into a common functions file, or pull in with include_once('filename.php'); and a world-writable (777) directory holding an iplog.dat file, used to store the tracked IPs. The script will eventually create up to 256 empty (0 KB) files in that folder, each named with the last two hex digits of an IP address's MD5 hash, to track each IP.

    The PHP: I created a floodstopper.php file, and near the start of any script, such as the index.php for my forum or site, I put include_once('floodstopper.php'); so it executes every time someone hits that index.php file. If you want to be really nasty, you could make a .htaccess file which redirects all 404 File Not Found errors to this script as well, just in case, or maybe redirect all image leechers to it, etc.
    <?php
    session_start(); // needed for the $_SESSION flood-report flag used below

    $cookie = $_COOKIE['yourcookie']; /* if you have a known cookie on your site,
    you can use this, otherwise just ignore it; it sets a different limit
    for people with this cookie */

    /* I use yourothercookie as the cookie ID for the forum; my forum uses an ID
    greater than 0 for all members and -1 for guests and members who have logged out,
    so matching greater than zero means members get better access and
    guests with or without cookies won't */
    $othercookie = $_COOKIE['yourothercookie'];

    if ($cookie && $othercookie > 0) $itime = 20; // Minimum number of seconds between visits
    else $itime = 10; // Minimum number of seconds between visits
    $ipenalty = 60; // Seconds before visitor is allowed back
    if ($cookie && $othercookie > 0) $imaxvisit = 30; // Maximum visits per $itime segment
    else $imaxvisit = 9; // Maximum visits per $itime segment
    $iplogdir = "./iplog/";
    $iplogfile = "iplog.dat";

    $ipfile = substr(md5($_SERVER["REMOTE_ADDR"]), -2);
    $oldtime = 0;
    if (file_exists($iplogdir.$ipfile)) $oldtime = filemtime($iplogdir.$ipfile);

    $time = time();
    if ($oldtime < $time) $oldtime = $time;
    $newtime = $oldtime + $itime;

    if ($newtime >= $time + $itime*$imaxvisit)
    {
        touch($iplogdir.$ipfile, $time + $itime*($imaxvisit-1) + $ipenalty);
        $oldref = $_SERVER['HTTP_REFERER'];
        header("HTTP/1.0 503 Service Temporarily Unavailable");
        header("Connection: close");
        header("Content-Type: text/html");
        echo "<html><body bgcolor=#000000 text=#00FF00 link=#ffff00>
        <font face='Verdana, Arial'><b>
        Too many page views (more than ".$imaxvisit." visits within ".$itime." seconds)
        by your IP address. Unregistered visitor hackers get less privileges.
        </b>";
        echo "Please wait ".$ipenalty." seconds and try again.
        Or go to google.com </font></body></html>";
        $fp = fopen($iplogdir.$iplogfile, "a");
        if ($fp)
        {
            $useragent = "<unknown user agent>";
            if (isset($_SERVER["HTTP_USER_AGENT"])) $useragent = $_SERVER["HTTP_USER_AGENT"];
            fputs($fp, $_SERVER["REMOTE_ADDR"]." ".date("d/m/Y H:i:s")." ".$useragent."\n");
            fclose($fp);
            $yourdomain = $_SERVER['HTTP_HOST'];

            // the @ before mail() means 'suppress errors', so you won't see errors on the page if email fails
            if ($_SESSION['reportedflood'] < 1 && ($newtime < $time + $itime + $itime*$imaxvisit))
                @mail('webmaster@'.$yourdomain, $yourdomain.' site flood by '.$cookie.' '
                    .$_SERVER['REMOTE_ADDR'], 'flood occurred and ban for '.$_SERVER['REMOTE_ADDR'].' at '
                    .$_SERVER['REQUEST_URI'].' from '.$oldref.' agent '.$_SERVER['HTTP_USER_AGENT'].' '
                    .$cookie.' '.$othercookie);
            $_SESSION['reportedflood'] = 1;
        }
        exit();
    }
    else $_SESSION['reportedflood'] = 0;

    touch($iplogdir.$ipfile, $newtime);
    ?>
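    The core trick above is worth spelling out: the script keeps one timestamp per IP bucket (stored as a file's mtime) and pushes it forward $itime seconds on every hit; once that timestamp runs more than $itime * $imaxvisit seconds ahead of the clock, the visitor is flood-banned for $ipenalty seconds. Here is a minimal re-implementation of that arithmetic in Python — this is an illustrative sketch with made-up names, not the author's code:

    ```python
    # Sketch of the flood-stopper idea: one stored timestamp per visitor,
    # pushed forward INTERVAL seconds per hit, banned once it outruns the clock.

    INTERVAL = 10    # $itime:     seconds "charged" per page view
    MAX_VISITS = 9   # $imaxvisit: hits allowed per INTERVAL window
    PENALTY = 60     # $ipenalty:  extra seconds to wait once banned

    stamps = {}      # ip -> stored timestamp (the PHP uses file mtimes for this)

    def check_hit(ip, now):
        """Return True if this hit is allowed, False if the IP is flood-banned."""
        old = max(stamps.get(ip, 0), now)   # never let the stamp lag behind "now"
        new = old + INTERVAL
        if new >= now + INTERVAL * MAX_VISITS:
            # Banned: park the stamp so the IP stays blocked for ~PENALTY seconds.
            stamps[ip] = now + INTERVAL * (MAX_VISITS - 1) + PENALTY
            return False
        stamps[ip] = new                    # allowed: charge INTERVAL seconds
        return True
    ```

    With these numbers, an IP gets eight instantaneous hits through before the ninth is refused, and stays refused for roughly PENALTY seconds afterward; visitors who pause between pages never build up a backlog.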

    Then create a directory called /iplog/ and make it writable with chmod 777 (log into webFTP from the control panel to do this), or 755 if that works on your server, but I doubt it!
    Create a file in /iplog/ called iplog.dat and chmod it 666 (again via webFTP from the control panel).
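    If you have shell access, those setup steps boil down to the following (run from the directory containing floodstopper.php; 777/666 are what the post specifies, though tighter permissions are safer if your server's PHP user allows them):

    ```shell
    # Create the tracking directory and the ban log next to the script.
    mkdir -p iplog
    chmod 777 iplog            # world-writable so PHP can create the per-IP files
    touch iplog/iplog.dat
    chmod 666 iplog/iplog.dat  # PHP appends banned-hit records here
    ```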

    What do you guys think? Something worth a plugin? (And thanks to Tez on the dreamhost for posting this!)

Viewing 4 replies - 1 through 4 (of 4 total)
  • I know WP internally already has some flood protection (not sure if it's 30s or 60s). And Spam Karma 2 (as well as my CG-AntiSpam) also has flood protection mechanisms in it. SK2 might have a 'temporary ban' system as well (CG-AntiSpam doesn't at the moment).

    -d

    Thread Starter michellealaska

    (@michellealaska)

    Thanks, David. I wasn’t aware of the ‘flood protection’ already available! These look to have comment flood protection – you can’t post multiple comments in a short time – whereas this looks to limit the number of page hits a single IP can make in a period of time. This will stop, I think, various bots from running through one’s entire site at once and still, hopefully, stop spammers from hitting tons of comments at the same time. So this would (I think!) be more comprehensive than the comment flood protection available now. I think. Hmm. I should do some more searching around, though. (Thanks!)

    Yes, what you posted is designed to limit pageviews/requests per X seconds. If you have robots.txt set up properly, bots shouldn’t crawl your site ‘too fast’ anyway. And, I’d be concerned about sending out 503’s — at least without knowing how bots respond to that versus either a non-response or positive response.

    And, at least in my experience with spammers, they don’t hit enough times in that short a period to trigger that IP block — not to mention, they typically ‘hit’ via a lot of IPs, so even if they did hit you rapidly, you wouldn’t have a single IP repeatedly in a short period.

    Interesting approach though.

    -d

    Hey David, what's the proper approach to robots.txt? From the advice I got, it's just used to make a few of my more private folders uncrawlable… I've got nothing else in there to instruct bots on speed/frequency; I just have my Google sitemap for some of that action. Just wondering if I'm missing anything in robots.txt.
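    For context on the robots.txt question: the core convention only supports per-user-agent Allow/Disallow rules, so blocking private folders is all most files do. The Crawl-delay directive is a non-standard extension that some crawlers (Bing and Yandex, historically) have honored and that Googlebot ignores; Google's crawl rate is managed through Search Console instead. A hypothetical example (the /private/ path is made up):

    ```
    User-agent: *
    Disallow: /private/

    # Non-standard directive; honored by some bots only
    Crawl-delay: 10
    ```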

  • The topic ‘Could this be a plugin? (Spam protection)’ is closed to new replies.