
files/server copy backup

 



propfast

Sep 22, 2000, 4:37 PM

Post #1 of 2 (267 views)

files/server copy backup

I am useless at any Unix/Linux command-line instruction or telnetting.

I really and desperately need to find a script (a no-brainer) which would allow me to copy or back up entire sites, or even most of a hard drive, from hard drive 1 to a second hard drive on a remote RAQ3 or Linux server.

Any ideas, help, or suggestions would be *really* welcomed.

Many thanks, Propfast.


Kanji
User / Moderator

Sep 23, 2000, 7:09 PM

Post #2 of 2 (267 views)
Re: files/server copy backup [In reply to]

There are lots of things you could do, but using Perl for this is probably overkill.

A very simple backup script (runnable from a CGI as well as the command line) would be ...

code:
----------------------------------------------------------------------

#!/bin/sh

TAR=/bin/tar
ORIGINAL="/path/to/website(s)"           # parent directory of the site(s) to back up
BACKUP=/where/to/save/backup.tar.gz      # where the compressed archive gets written

# CGI header, so the script can be run from the web as well
echo "Content-Type: text/plain"
echo ""

# Archive everything under $ORIGINAL into a gzipped tarball, sending
# tar's progress (and any errors) to the browser or terminal
$TAR --create --verbose --gzip \
    --directory "$ORIGINAL" \
    --file "$BACKUP" . 2>&1
----------------------------------------------------------------------

/path/to/website(s) should be the parent directory of the site or sites you want to archive.

/where/to/save/backup.tar.gz is the name of the backup file (.tar.gz is the usual UNIX equivalent of .zip on PCs or .sit on Macs).

Also note the single period ( . ) after $BACKUP: that's the path tar actually archives, and since the --directory switch has already moved tar into $ORIGINAL, the period means "everything under that directory".
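
Since you asked about getting things onto a second hard drive or another box: once the tarball exists, it's just a file you can move around. Here's a rough sketch, assuming the second drive is mounted at /mnt/drive2 and the remote machine accepts ssh logins; the user, host, and paths below are all made-up placeholders, not anything on your actual RAQ3:

code:
----------------------------------------------------------------------
#!/bin/sh

BACKUP=/where/to/save/backup.tar.gz

# Copy the archive onto a second drive
# (assumes it's mounted at /mnt/drive2)
cp "$BACKUP" /mnt/drive2/

# ... or push it to another machine over ssh, if scp is installed
# (user and host here are placeholders)
scp "$BACKUP" admin@remote.example.com:/home/backups/

# To restore later, reverse the tar options: --extract instead of
# --create, pointed at whichever directory you want to unpack into
tar --extract --verbose --gzip \
    --directory /path/to/restore \
    --file "$BACKUP"
----------------------------------------------------------------------

And since you'd rather avoid the command line altogether, a crontab line like 0 3 * * * /path/to/backup.sh would run the backup script automatically at 3am every night, no telnetting required.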

 
 

