Sep 23, 2000, 7:09 PM
Post #2 of 2
There are lots of things you could do, but using Perl for this is probably overkill.
A very simple backup script (runnable from a CGI as well as the command line) would be ...
<BLOCKQUOTE><font size="1" face="Arial,Helvetica,sans serif">code:</font><HR><pre>
#!/bin/sh
TAR=/bin/tar                        # path to tar; adjust for your system
ORIGINAL=/path/to/websites          # parent directory of the site(s)
BACKUP=/where/to/save/backup.tar.gz # where the archive is written

# CGI headers must be terminated by a blank line
echo "Content-Type: text/plain"
echo ""

$TAR --create --verbose --gzip \
     --directory "$ORIGINAL" \
     --file "$BACKUP" . 2>&1</pre><HR></BLOCKQUOTE>
/path/to/websites should be the parent directory of the site or sites you want to archive.
/where/to/save/backup.tar.gz is the name of the backup file (.tar.gz is the more common UNIX alternative to .zip on PCs or .sit on Macs).
Also note the single period . after $BACKUP: it tells tar to archive the contents of the current directory, which --directory has just changed to $ORIGINAL.
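If you want to check or restore a backup made this way, the same long options work in reverse. Here's a quick self-contained sketch (the mktemp paths are throwaway stand-ins for the real site and backup locations):

```shell
#!/bin/sh
set -e
ORIGINAL=$(mktemp -d)       # stands in for /path/to/websites
BACKUP=$(mktemp -u).tar.gz  # stands in for /where/to/save/backup.tar.gz
echo "hello" > "$ORIGINAL/index.html"

# Create the archive, exactly as in the script above
tar --create --gzip --directory "$ORIGINAL" --file "$BACKUP" .

# --list shows what's inside without extracting anything
tar --list --gzip --file "$BACKUP"

# --extract restores into whatever directory you point it at
RESTORE=$(mktemp -d)
tar --extract --gzip --directory "$RESTORE" --file "$BACKUP"
cat "$RESTORE/index.html"   # prints "hello"
```

The --list step is a cheap sanity check worth running right after each backup, since a truncated .tar.gz will fail it immediately.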