Archive for December, 2007

Analyzing MySQL usage; finding and fixing bottlenecks

MyTop and Mtop are nice tools for analyzing MySQL load and queries, but on long-running, multi-user databases you also want statistics gathered over a longer period, so here is a simple script that keeps per-user totals to track down abusers and bottlenecks. The code is very easy to alter to retrieve other stats; otherwise you can always mail me to adapt it to what you need.


#!/usr/bin/perl
# Polls "show full processlist" and keeps a running total per user
# (number of queries seen and their accumulated running time) in ./sql.log.

while (1) {
    # load all previous data, if any
    %users = ();
    if (-f "./sql.log") {
        open(F, "./sql.log");
        while (<F>) {
            chomp;
            # format: user no_queries total_time
            /(.*?) (.*?) (.*)/;
            $users{$1} = "$2 $3";
        }
        close F;
    }

    @p = `mysql -u yyy --password=xxx --execute='show full processlist'`;
    # the first line is the column header, the rest are tab separated rows
    for ($i = 1; $i < scalar(@p); $i++) {
        $r = $p[$i];
        chomp($r);
        @cols = split(/\t/, $r);

        next if $cols[4] eq "Sleep";    # skip idle connections

        # the query (Info column) is everything from column 7 onwards
        $q = "";
        for ($j = 7; $j < scalar(@cols); $j++) {
            $q .= $cols[$j] . " ";
        }
        $q =~ s/\s+$//;
        next if $q =~ /show full processlist/;   # skip our own query
        next if $q eq "NULL";

        $u = $cols[1];    # user
        $l = $cols[5];    # time this query has been running

        $res = "";
        if (!$users{$u}) {
            $res = "1 $l";
        } else {
            $res = $users{$u};
            $res =~ /(.*?) (.*)/;
            $l += $2;      # add to the accumulated time
            $t = $1 + 1;   # bump the query counter
            $res = "$t $l";
        }
        $users{$u} = $res;
    }

    open(F, ">./sql.log");
    foreach (keys(%users)) {
        $res = $users{$_};
        print F "$_ $res\n";
    }
    close F;
}
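
Once sql.log has gathered some data, spotting the heaviest users is just a matter of sorting it. Something like this (a minimal sketch, assuming the user / query-count / total-time format written above) prints the users ordered by accumulated query time, worst offenders first:

perl -e 'print sort { (split " ", $b)[2] <=> (split " ", $a)[2] } <>' sql.log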

Automatically kill all processes that do not belong on your system

A client of mine asked me for a program that would destroy ‘illegal’ processes. After a brief search I found the existing apps too limited or not configurable enough, so I threw one together that learns by itself which processes are and are not allowed.

First, simply run:

./process.pl learn

and do everything on your computer/server that is normal, so the system can learn which processes are allowed.

After that, press Ctrl-C and rerun it as:

./process.pl check

That’s it!


#!/usr/bin/perl
# In "learn" mode this records every command line it sees in procs.log;
# in "check" mode it kills every process whose command line was not learned.

$cmd = $ARGV[0];

if (!$cmd) {
    $cmd = "check";
}

# load the allowed command lines collected during learning
%allowproc = ();
if ($cmd eq "check") {
    open(F, "procs.log");
    while (<F>) {
        chomp;
        $allowproc{$_} = 1;
    }
    close F;
}

if ($cmd eq "learn") {
    open(F, ">procs.log");
}

while (1) {
    @procs = `ps auxwww`;
    foreach (@procs) {
        chomp;
        next if /^$/;
        next if /^USER\s/;        # skip the ps header line
        next if /defunct/;
        next if /process\.pl/;    # never touch ourselves
        # capture the PID ($1) and the full command line ($2)
        next unless /.*?\s+(.*?)\s+.*?\s+.*?\s+.*?\s+.*?\s+.*?\s+.*?\s+.*?\s+.*?\s+(.*)/;
        if (!$allowproc{$2}) {
            if ($cmd eq "learn") {
                $allowproc{$2} = 1;
                print F $2 . "\n";
            } else {
                `kill -9 $1`;     # unknown command line: kill it
            }
        }
    }
    sleep 1;
}
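
While learning, procs.log simply collects one full command line per line. On a typical server it could end up containing (hypothetical) entries like:

/usr/sbin/sshd
/usr/sbin/apache2 -k start
sshd: root@pts/0
-bash
ps auxwww

Note that the key is the complete command line including its arguments, so the same binary started with different options has to be learned separately.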


Removing the Linux ext files-per-directory limit for scripts that do not work around that limit themselves (like the Mihalism multi host image hosting script)

With a few changes this will work on any script / system of course, but it was done for Mihalism.

First you need to spread all existing files over nested directories keyed on the first five characters of the file name, so that for example picture.jpg ends up in p/i/c/t/u/picture.jpg. Go to the ‘files’ directory and run:

ls -la|awk '{print $8}'|perl -e 'while(<>){chomp; @a=split(//, $_); $d=""; for($i=0;$i<5;$i++){ $d .= $a[$i]."/"; print `mkdir $d`; } print `mv $_ $d`; }'

Then the code needs a bit of fixing; open the file source/global_functions.php and add a function:

function calc_directory($file) {
    $dir = "";
    for ($i = 0; $i < 5; $i++) $dir .= $file[$i] . "/";
    return $dir;
}
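
So if the (hypothetical) original code built a path along the lines of

$url = $path.$file;

it should end up as

$url = $path.calc_directory($file).$file;

which is exactly what the search-and-replace below takes care of.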

Then you need to fix all files that use $path and $file so they reflect the new layout.

In the main directory of the script, run:

grep -R '$path.$file' *|awk '{print $1}'|uniq|perl -e 'while(<>){chomp; /^(.*?):/; $f=$1; $s=`cat $f`; $s=~s/\$path\.\$filename/\$path.calc_directory(\$filename).\$filename/isgm; $s=~s/\$path\.\$file/\$path.calc_directory(\$file).\$file/isgm; open(F, ">$f"); print F $s; close F;}'

And then test; I have had no time to test this yet, but the idea should be clear enough.