This should keep the site more responsive under high load if there's a scraper bot running or something
- may need to tweak the "download x pages at once" setting so normal use stays under the rate limit?
- may need some sort of script on the db servers to watch the number of queries coming from a single IP address (don't bother with logins - too easy to get multiple logins) - see the rough sketch after this list
- would also let the admins flag scraper accounts easily
- would need some sort of system where, if you get flagged and rate limited or banned, you can protest if it's incorrect - but that would then be "person" vs "database logs of activity", so not really an arguable case
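To show what I mean by "watch the number of queries per IP" - here's a rough sketch only, assuming a simple fixed time window kept in memory; the names (is_rate_limited, WINDOW_SECONDS, MAX_QUERIES) and the limits are made up for illustration, and in practice you'd probably do this at the web server or reverse proxy rather than in the site's own code:

```python
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 60   # look at the last minute of traffic per IP (assumed value)
MAX_QUERIES = 120     # anything above ~2 queries/second gets flagged (assumed value)

_recent = defaultdict(deque)  # ip -> timestamps of its recent queries

def is_rate_limited(ip, now=None):
    """Record one query from `ip` and report whether it is over the limit."""
    now = time.time() if now is None else now
    q = _recent[ip]
    q.append(now)
    # drop timestamps that have fallen outside the window
    while q and now - q[0] > WINDOW_SECONDS:
        q.popleft()
    return len(q) > MAX_QUERIES
```

You'd call something like is_rate_limited(request_ip) on each page or db request and, if it comes back True, return a "slow down" response and log the IP so the admins can flag the account.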
just mah 2c worth (well maybe .5c, cause it could well be a p.i.t.a. to implement depending on what servers and back-end code are there already)
could end up saving a %*$#ton of bandwidth if the site is being seriously scraped and that's what is contributing to the very long "dead time" responses that happen in peak hours?