I have run some extensive tests to see whether the numbers set in Rate Limiting (with Block) are actually accurate, and how accurate.
No caching of any kind, page or object.
The tests I ran covered 404s and also humans/bots, each set to 15 per minute.
The results are extremely unstable. Sometimes the block happened at 16 visits, usually between 24 and 30, and I also saw no block at all after 40 or 50 visits per minute.
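One pattern that would produce exactly this spread is a fixed-window counter that resets on clock-minute boundaries: a burst straddling the boundary can rack up almost two windows' worth of hits before the threshold trips. A minimal Python sketch of that assumption (I have not read the plugin's code, so this is a hypothesis, not its actual algorithm):

```python
LIMIT = 15  # the configured "15 per minute"

class FixedWindowLimiter:
    def __init__(self, limit):
        self.limit = limit
        self.window = None  # current clock-aligned minute bucket
        self.count = 0

    def hit(self, now):
        """Record one request; return True if it should be blocked."""
        window = int(now // 60)
        if window != self.window:   # new minute: the counter resets
            self.window, self.count = window, 0
        self.count += 1
        return self.count > self.limit

# Bursts of 2 requests/second, starting at different offsets into a minute:
for start in (50.0, 53.0, 56.0):
    limiter = FixedWindowLimiter(LIMIT)
    blocked_at = None
    for i in range(60):
        if limiter.hit(start + i * 0.5):
            blocked_at = i + 1
            break
    print(f"burst starting {start}s in: first block at request {blocked_at}")
```

Depending on where the burst lands in the minute, the first block arrives anywhere between request 16 and request 30, which matches what I measured; if the counting is also done asynchronously, a fast burst could slip through entirely.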
Any explanation on that?
Thanks
I realize these settings could potentially block legitimate scrapers. I also realize that, apart from checking the IP addresses of well-known scrapers like Google, attackers could spoof their user agent to look like a scraping bot instead of a user.
The sites I run are for small businesses, so they might have at most 10 or 15 visitors on them at a time. Can anyone recommend settings for the Rate Limiting feature that will prevent these application-layer denial-of-service attacks on WordPress without prematurely blocking legitimate scraping traffic?
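For the spoofing concern, the usual answer is not to trust the user agent at all but to verify claimed crawlers with a reverse-then-forward DNS check, which Google documents for Googlebot. A rough Python sketch (the sample IP is one of Google's published crawler addresses; extend the suffix list for other bots you want to allow):

```python
import socket

GOOGLE_SUFFIXES = (".googlebot.com", ".google.com")

def is_real_googlebot(ip: str) -> bool:
    try:
        host, _, _ = socket.gethostbyaddr(ip)   # reverse DNS lookup
    except OSError:
        return False
    if not host.endswith(GOOGLE_SUFFIXES):      # must be a Google hostname
        return False
    try:
        forward = socket.gethostbyname(host)    # forward-confirm the name
    except OSError:
        return False
    return forward == ip

print(is_real_googlebot("66.249.66.1"))  # a published Googlebot address
```

Anything that claims a crawler user agent but fails this check can be rate-limited as an ordinary visitor without risking the real bots.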
A few feature requests:
1. Rate limiting, something like what Wordfence Security offers (screenshot: https://pasteboard.co/6n2vp69vdqMk.jpg)
2. Allow us to block at the ASN level (https://ipinfo.io/countries/gb); a rough sketch of the idea follows this list
3. User agent blocking (https://www.whatismybrowser.com/detect/what-is-my-user-agent/)
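For point 2, the lookup side is straightforward. Here is a rough Python sketch against ipinfo.io's public JSON endpoint (unauthenticated and uncached purely for illustration; the blocked ASN is a documentation-reserved example, not a recommendation):

```python
# ASN-level blocking sketch: resolve an IP's ASN via ipinfo.io, then check
# it against a block list. AS64496 is reserved for documentation (RFC 5398).
import json
import urllib.request

BLOCKED_ASNS = {"AS64496"}

def asn_for(ip: str) -> str:
    with urllib.request.urlopen(f"https://ipinfo.io/{ip}/json") as resp:
        org = json.load(resp).get("org", "")   # e.g. "AS15169 Google LLC"
    return org.split()[0] if org else ""

def should_block(ip: str) -> bool:
    return asn_for(ip) in BLOCKED_ASNS

print(should_block("8.8.8.8"))  # AS15169 (Google) -> False
```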
Greetings of the day!!
I have a query regarding apply-coupon requests. Why is there no rate limiting implemented on the apply-coupon request? It seems a store could fall victim to a DoS or dictionary attack.
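What I would expect to see is something like a per-IP token bucket in front of the coupon check, so a shopper can retry a mistyped code but a script cannot hammer the endpoint. A minimal Python sketch (the handler and the limits are hypothetical placeholders, not the store's actual code):

```python
import time
from collections import defaultdict

RATE = 0.1   # refill: one attempt every 10 seconds
BURST = 5    # allow up to 5 quick attempts

buckets = defaultdict(lambda: {"tokens": BURST, "ts": time.monotonic()})

def allow_attempt(ip: str) -> bool:
    b = buckets[ip]
    now = time.monotonic()
    b["tokens"] = min(BURST, b["tokens"] + (now - b["ts"]) * RATE)
    b["ts"] = now
    if b["tokens"] < 1:
        return False            # sustained guessing is throttled here
    b["tokens"] -= 1
    return True

def handle_apply_coupon(ip: str, code: str) -> str:
    if not allow_attempt(ip):
        return "429 Too Many Requests"
    return f"validating coupon {code!r}"  # placeholder for store logic
```

A dictionary attack that tries thousands of codes would exhaust the bucket after the first burst, while a genuine customer never notices the limit.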
I just logged in to a site I haven’t checked in a while to find that I have a huge backlog of tasks in the queue:
Cron is running fine for other tasks, so that’s not the culprit. When I try to force cron manually I get a huge list of ‘d/usage’ errors returned:
Cloud Error: Please try after 2m 29s for service d/usage
Logging into QUIC.cloud confirms that my API usage is at zero across all services this month for this domain. My website evidently hasn’t been using the API, so why is it being rate-limited?
Image optimisation requests are also being limited/refused, again in spite of the fact that my API usage is currently at zero.
How do you identify hosts when applying rate limiting? Is it only by the referring IP, or by domain? I can imagine that if you are only using the server IP, this is going to rate limit users like me unfairly – greedy neighbours on the same server are going to cause the IP to get limited all the time, through no fault of mine.
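A toy model of that worry, assuming (and it is only an assumption, since I don't know your actual scheme) that the quota key is the shared server IP rather than the domain:

```python
from collections import Counter

QUOTA = 100
usage_by_key = Counter()

def request_allowed(key: str) -> bool:
    if usage_by_key[key] >= QUOTA:
        return False
    usage_by_key[key] += 1
    return True

SERVER_IP = "203.0.113.7"  # documentation address; both domains share it

for _ in range(QUOTA):                    # a neighbour burns the quota...
    request_allowed(SERVER_IP)            # ...keyed by the shared IP
print(request_allowed(SERVER_IP))         # my site is now refused: False

usage_by_key.clear()
for _ in range(QUOTA):
    request_allowed("neighbour.example")  # keyed per-domain instead
print(request_allowed("my-site.example")) # my site is unaffected: True
```

If the key really is the shared IP, keying per domain would remove exactly the unfairness I am describing.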
Can we just go back to how things used to be? I much preferred using my own server to generate CCSS etc. It just worked, whereas this new system that externalises these processes onto your servers via API has consistently failed to work ever since it was introduced.
Best wishes