Archive for January, 2009

A simple, non-bloated script for Amazon S3 backups (Linux, easily portable)

While searching for scripts to back up files from Linux servers to S3, I was surprised how difficult it was to find any nice, trim ones. There are huge Java jars filled with crap to do it, of course:


dbserv01:~# ls -lah js3tream.jar
-rw-r--r-- 1 root root 3.2M 2007-12-19 15:07 js3tream.jar

Come on. 3.2M… What does that include? Windows Vista?

Sure, js3tream is actually nice software. Portable, and it works. It is also incredibly, mind-bogglingly slow on my quad Xeon servers (but of course, Java is not slow these days, as many people tell me…). But yeah, it has the features you would want. Except encryption of the files I upload; you have to arrange that in a different way. Come to think of it, other features are missing as well.

There is S3Sync in Ruby, which is actually nice and works well, but still too much hassle to get going as a simple script. No complaints about the speed or size of that one, though.

Then there is some rsync thing in Python (I forgot the name, and the HELLLLLLLLLLLLLLLLLL I went through to install it will not make me remember it any time soon).

So, as with almost all things in life, if you want it done right, you have to do it yourself. I don’t find it pretty at all, but it weighs in, including comments, at 36 KB, which makes it by far the smallest script I could find for the purpose I needed.

I’m not counting rar or gpg, as they are not included with the other tools either, but even counting them it wouldn’t go over 1 MB.

Download the Amazon S3 PHP class from http://undesigned.org.za/2007/10/22/amazon-s3-php-class

And install:

– rar (apt-get install rar)
– gpg (apt-get install gnupg)
– php (apt-get install php5-cli php5-curl)

Then edit the Amazon S3.php and put this on top of it:


#!/usr/bin/php
<?php

if (sizeof($_SERVER["argv"]) < 4) {
	echo "Usage: ./s3backup bucket ident dir dir dir\n";
	exit;
}

$bucket = $_SERVER["argv"][1];
$ident  = $_SERVER["argv"][2];

// Fill these in: your AWS keys, a rar password and the GPG recipient
define('PHPARTIALS_FILE_AWS_S3_ACCESSKEY', '');
define('PHPARTIALS_FILE_AWS_S3_SECRETKEY', '');
define('PHPARTIALS_FILE_RAR_ENCRYPT', '');
define('PHPARTIALS_FILE_GPG_ENCRYPT', '');

$s3 = new S3(PHPARTIALS_FILE_AWS_S3_ACCESSKEY, PHPARTIALS_FILE_AWS_S3_SECRETKEY);

#print_r($s3->getBucket($bucket));exit;

// Create the bucket if it does not exist yet
$l = $s3->listBuckets();

if (!in_array($bucket, $l)) {
	$s3->putBucket($bucket, S3::ACL_PRIVATE);
}

// Archive name: ident plus a timestamp
$fn = $ident."-".strftime('%d%m%y%H%M').".rar";

// Drop the script name, bucket and ident; what is left are the directories
array_shift($_SERVER["argv"]);
array_shift($_SERVER["argv"]);
array_shift($_SERVER["argv"]);

$f = implode(' ', $_SERVER["argv"]);

// rar the directories into a password protected archive (-hp also encrypts the headers)
$cmd = "rar a -hp".PHPARTIALS_FILE_RAR_ENCRYPT." $fn $f";

echo `$cmd`;

// GPG encrypt the rar to the configured recipient
$fn1 = $fn.".enc";

$cmd = "gpg --batch --yes --trust-model always --encrypt --recipient '".PHPARTIALS_FILE_GPG_ENCRYPT."' -o $fn1 $fn";

echo `$cmd`;

// Upload the encrypted archive, then clean up the local files
$s3->putObjectFile($fn1, $bucket, basename($fn1));

print "Backup created!\nSize: ".filesize($fn1)." bytes\n";

unlink($fn);
unlink($fn1);

exit;
?>

Rename the S3.php to something like s3backup and chmod 700 s3backup to make it executable.

Put your two Amazon S3 keys in the AWS defines, a password for the rar archive in the RAR define, and the name (or e-mail address) of the user whose public key is on that system in the GPG define, so GPG can encrypt the rar for that recipient.
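For example, a filled-in configuration could look like this (every value here is just a placeholder; use your own keys, password and recipient):


define('PHPARTIALS_FILE_AWS_S3_ACCESSKEY', 'AKIAEXAMPLEACCESSKEY');
define('PHPARTIALS_FILE_AWS_S3_SECRETKEY', 'exampleSecretKeyChangeMe1234567890');
define('PHPARTIALS_FILE_RAR_ENCRYPT', 'some-long-rar-password');
define('PHPARTIALS_FILE_GPG_ENCRYPT', 'backups@example.com');


After that, a run looks like ./s3backup my-backup-bucket dbserv01 /etc /var/www (the bucket name and directories are examples too).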

To add a public key for GPG, just put your GPG public key in a file, say /root/mypub.key, and run:

gpg -a --import /root/mypub.key

And all will be fine.
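If you want to check that the backups actually arrive, the same S3 class can list the bucket for you. A minimal sketch (it assumes an unmodified copy of S3.php next to it; the keys and bucket name are placeholders):


#!/usr/bin/php
<?php
require_once 'S3.php';

$s3 = new S3('AKIAEXAMPLEACCESSKEY', 'exampleSecretKeyChangeMe1234567890');

// getBucket() returns the objects in the bucket with their name, time, size and hash
print_r($s3->getBucket('my-backup-bucket'));
?>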