Optimize DBI connect to avoid max_user_connections

by Alex-WisdomSeeker (Initiate)
on May 29, 2014 at 14:20 UTC
Alex-WisdomSeeker has asked for the wisdom of the Perl Monks concerning the following question:

Every month or two a swarm of robots visits my site on a managed server and opens connections so fast that my current max_user_connections value of 25 (I will increase it to 75) is reached. Currently I restart the server and it works fine again until the next swarm comes. It is a webshop programmed in Perl which fetches its data using DBI connect, like this:
    use DBI;
    my $db = DBI->connect('dbi:mysql:******') or die $DBI::errstr;
    # placeholder instead of interpolating $link directly into the SQL
    my $query = $db->prepare("SELECT * FROM `Tablename` WHERE `artnr` = ? ORDER BY id DESC LIMIT 0, 25");
    $query->execute($link);
    while (my @row = $query->fetchrow_array()) {
        # (name and the other fields) = @row;
        print "Shoppage";
    }
    $query->finish;
    $db->disconnect();
There are some more connections in the same script looking for a cart and a whitelist name. So I have some questions: Will the problem solve itself after some time, or will the open processes run until reset and keep trying to get info from the locked DB? Is it possible to do a small query to check the number of connections on the DB and exit if it is too high? Any other ideas for protection from DoS attacks or bot swarms (I thought about restricting Asian IPs in .htaccess or using HTTP::BrowserDetect)? Thanks for your time and maybe help!
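Something like this is what I had in mind for the "small query" check, as a rough sketch only (MySQL's Threads_connected counts all connections to the server, not only mine, and the threshold of 20 is just an arbitrary example):

    use DBI;
    # If max_user_connections is already reached, connect() itself fails, so don't die here
    my $db = DBI->connect('dbi:mysql:******') or exit;
    # Ask MySQL how many client connections are currently open
    my ($name, $threads) = $db->selectrow_array("SHOW STATUS LIKE 'Threads_connected'");
    if ($threads > 20) {    # arbitrary threshold, kept below max_user_connections
        print "Content-type: text/plain\n\nServer busy, please try again shortly.\n";
        $db->disconnect;
        exit;
    }

I realise that once the limit is actually reached, DBI->connect itself fails, so checking its return value instead of dying is probably what lets the script exit cleanly in the first place.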

Replies are listed 'Best First'.
Re: Optimize DBI connect to avoid max_user_connections
by InfiniteSilence (Curate) on May 29, 2014 at 14:29 UTC

    Randal Schwartz wrote an article on this subject titled Throttling Your Web Server, which might be useful.

    Otherwise I would think that settings in your robots.txt might be sufficient to tell the spider to either slow down (there's a Crawl-delay directive) or to simply stop spidering your site.
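    A short robots.txt along these lines (the 10-second delay and the bot name are only illustrative, and not every crawler honours Crawl-delay):

        User-agent: *
        Crawl-delay: 10

        # Or shut out one specific crawler entirely (name is an example)
        User-agent: SomeBot
        Disallow: /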

    If you are using Apache (or any modern HTTP server I think) you can, of course, simply deny certain IP addresses.
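    For example, in an .htaccess file (Apache 2.2 syntax; the address range below is a documentation example, and Apache 2.4 uses Require directives instead):

        Order Allow,Deny
        Allow from all
        Deny from 203.0.113.0/24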

    Celebrate Intellectual Diversity

Re: Optimize DBI connect to avoid max_user_connections
by marto (Bishop) on May 29, 2014 at 14:32 UTC

    "Any other idea to get protection from DOS attacks or bot swarms (thought about rectriciting Asian IPs in htaccess or using HTTP::BrowserDetect) ?"

    Read your web server's documentation regarding DoS mitigation. Recently I had to add rules to block Indian/Chinese bots from spamming a wiki I help manage; the ones I had to block don't respect robots.txt.
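    The same idea works keyed on the User-Agent rather than the IP, roughly like this in .htaccess (Apache 2.2 style; the bot names are placeholders):

        SetEnvIfNoCase User-Agent "BadBot|SomeCrawler" bad_bot
        Order Allow,Deny
        Allow from all
        Deny from env=bad_bot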

Re: Optimize DBI connect to avoid max_user_connections
by Anonymous Monk on May 29, 2014 at 14:33 UTC
    If you're using Apache, take a look at the module Apache::DBI.
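    Roughly, assuming mod_perl (the startup.pl path is just an example), Apache::DBI has to be loaded before DBI so it can take over connect() and cache the handles:

        # startup.pl, pulled in from httpd.conf via: PerlRequire /path/to/startup.pl
        use Apache::DBI;   # load before DBI and any DBI-using modules
        use DBI;
        1;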
      I will do so; maybe there is another problem, too. The DB went down again, this time with lots of "Waiting for table lock" errors. It really is a small site with around 500 visitors per day. I did not have these problems with my old host, so maybe it is a managed-server problem. I will go through the scripts and look for unclosed DB connections again. Thanks a lot so far!

        Sounds like you're using MyISAM. Alter the tables to use the InnoDB engine instead of MyISAM and most if not all of those table lock warnings should go away. InnoDB uses row locks instead of table locks, though it can/will still take full table locks when the operation warrants it.
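        If you go that route, a one-off conversion could look like this with DBI (a sketch only, reusing your placeholder table name and DSN; back up first, and the ALTER may lock the table while it runs):

            use DBI;
            my $dbh = DBI->connect('dbi:mysql:******') or die $DBI::errstr;
            # Switch the storage engine so MySQL uses row-level instead of table-level locks
            $dbh->do("ALTER TABLE `Tablename` ENGINE = InnoDB");
            $dbh->disconnect;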
