Once you have XTools up and running, depending on how much traffic you receive, you might want to implement measures to ensure stability.
7.1. Adding rate limiting
Rate limiting can safeguard against spider crawls and bots that overload the application.
To configure rate limiting, set the following variables in your configuration file:

- app.rate_limit_time (10 in this example) is the number of minutes during which app.rate_limit_count requests from the same user to the same URI are allowed.
- app.rate_limit_count (5 in this example) is the number of requests from the same user that are allowed during the time frame specified by app.rate_limit_time.
Using the above example, if you try to load the same page more than 5 times within 10 minutes, the request will be denied and you will have to wait 10 minutes before you can make the same request. This only applies to result pages and the API, and not index pages. Additionally, no rate limitations are imposed if the user is authenticated.
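As a sketch, the example values above could be expressed in a Symfony-style parameters file; the file name and nesting here are assumptions about a standard install, so adjust them to match your setup:

```yaml
# Hypothetical configuration fragment; key names are taken from the text above.
parameters:
    app.rate_limit_time: 10   # window length in minutes
    app.rate_limit_count: 5   # max requests per user per URI within the window
```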
Any requests that are denied are logged.
You can blacklist user agents and URIs using the request_blacklist.yml file.
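The schema of request_blacklist.yml is not reproduced here; as a purely illustrative sketch (the user_agent and uri keys are assumptions, not the real key names, so consult the file shipped with XTools for the actual structure):

```yaml
# Hypothetical request_blacklist.yml sketch.
parameters:
    request_blacklist:
        user_agent:
            - 'BadBot/1.0'
        uri:
            - '/some/expensive/path'
```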
7.2. Offloading API requests
XTools features a rich public API. If you expect your XTools installation will receive a lot of traffic, you can send your API consumers (which may include bots, for instance) to a dedicated server so that resources on the main app server are not hogged.
@TODO document how to forward API requests to another server via Apache config.
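Until that is documented, one possible approach is a reverse proxy using Apache's mod_proxy. This is only a sketch, not the official setup: the /api path prefix and the backend hostname api-backend.example.org are placeholders for your own values.

```apache
# In the main server's VirtualHost; requires mod_proxy and mod_proxy_http.
# Forward all API traffic to a dedicated backend server.
ProxyPass        "/api" "http://api-backend.example.org/api"
ProxyPassReverse "/api" "http://api-backend.example.org/api"
```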
7.3. Killing slow queries
Some queries for users with a high edit count may take a very long time to finish, or even time out. You may wish to add a query killer to ensure stability.
If you are running on a Linux environment, consider using pt-kill. A query killer daemon could be configured like so:
```shell
pt-kill --user=xxxx --password=xxxx --host=xxxx \
    --busy-time=90 \
    --log /var/www/web/killed_slow_queries.txt \
    --match-info "^(select|SELECT|Select)" \
    --kill --print --daemonize --verbose
```
This will kill any SELECT query that takes over 90 seconds to finish, and log the killed query to the file given by --log (/var/www/web/killed_slow_queries.txt in the example above).
Note that pt-kill is part of the Percona Toolkit and requires libdbi-perl and libdbd-mysql-perl.
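To keep the query killer running across reboots, it could be wrapped in a service manager. The following is a hypothetical systemd unit, not an official one: the unit name, credentials, and paths are placeholders, and --daemonize is dropped because systemd supervises the process itself.

```ini
# /etc/systemd/system/pt-kill.service (hypothetical)
[Unit]
Description=pt-kill slow query killer for XTools
After=mysql.service

[Service]
ExecStart=/usr/bin/pt-kill --user=xxxx --password=xxxx --host=xxxx \
    --busy-time=90 --log /var/www/web/killed_slow_queries.txt \
    --match-info "^(select|SELECT|Select)" --kill --print --verbose
Restart=on-failure

[Install]
WantedBy=multi-user.target
```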