Tuesday, July 31, 2007

Website Backup

We manage our website through FTP. I wrote a Perl script that backs up the site every day, which lets me "undo" changes by restoring an older version of a file.
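To make the daily run happen automatically, a root crontab entry along these lines does the job (the script path here is just an example, not the actual location):

```shell
# run the backup every night at 2:00 AM (path is hypothetical)
0 2 * * * /root/webbackup.pl
```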

I realize that this could be written in pure Perl, without relying on wget, but why reinvent the wheel?
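For the record, the pure-Perl route would go through Net::FTP, which ships with Perl. A minimal sketch (host, directory, and credentials are placeholders) that fetches the files in a single remote directory might look like this; recursing into subdirectories is exactly the wheel that wget already provides:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Net::FTP;

# placeholders - substitute your own server and credentials
my $ftp = Net::FTP->new('ftp.website.com', Passive => 1)
    or die "can't connect: $@";
$ftp->login('username', 'password') or die "login failed: ", $ftp->message;
$ftp->cwd('Web')                    or die "cwd failed: ",   $ftp->message;
$ftp->binary;

# fetch everything in this one directory; get() fails on entries that
# are directories, and a real version would have to detect those and
# recurse into them itself
foreach my $file ($ftp->ls) {
    $ftp->get($file);
}
$ftp->quit;
```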


#!/usr/bin/perl
use strict;
use warnings;

my $datedir = `date +%F`;           # e.g. 2007-07-31
chomp $datedir;
my $rootdir = "/var/backup/web/" . $datedir;
my $ftpsite = "ftp.website.com";
my $credentials = "/root/ftp.txt";  # text file, one line: username:password

unless (-d $rootdir) {
    mkdir($rootdir, 0755) or die "can't create $rootdir: $!";
}

open (UPASS, '<', $credentials) or die "can't open $credentials: $!";
my $creds = <UPASS>;
close (UPASS);

my ($username, $password) = split (/:/, $creds);

$username =~ s/\s+//g;  # strip all whitespace characters - just in case
$password =~ s/\s+//g;

chdir($rootdir) or die "can't chdir to $rootdir: $!";

# -m            mirror: recursive retrieval with timestamping
# -nH           don't create a directory named after the host
# --cut-dirs=1  drop the leading "Web" component from saved paths
# wget writes its progress log to stderr, so redirect with 2>&1
# to actually capture it; note the password is visible in `ps`
# while wget runs
my $result = `wget ftp://$ftpsite/Web --user=$username --password=$password -nH -m --cut-dirs=1 2>&1`;

open (LOGFILE, '>', 'backup.log') or die "can't open backup.log: $!";
print LOGFILE $result;
close (LOGFILE);
