A question for linux gurus

Greebo

As you may know, I'm busily working on setting up a new server over which we will have more control than in our current shared hosting environment.

In preparation for the move, I'm running through all the steps needed to migrate the forums, including moving the attachments.

So far my only efficient technique (efficient meaning minimal involvement by me to make it happen) has been a two-stage transfer: FTP down to my machine and back up to the new machine. I *could* do a direct transfer from old to new, but copying 100 MB of files in a multilayered nested directory structure (the root level has 9 dirs, and each level under that has 9 dirs) is a real PITA in command-line FTP.

I am looking for a Linux-based FTP utility that will let me say "Get this entire directory including all subdirectories and their contents and duplicate the structure exactly."

Anyone know of one?
 
You could tar the directory and then just FTP the tar file. Alternatively, use rsync, and you can keep it in sync automatically. Let me know if you want the syntax.
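Something like this, roughly (directory names are made up, and it assumes shell access on both servers):

Code:
# On the old server: bundle the whole attachment tree into one archive
tar czf attachments.tar.gz attachments/
# FTP the single .tar.gz across, then on the new server:
tar xzf attachments.tar.gz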
 
What is rsync?

And - DUH - stupid me - of course I could tar the files - that would make the overall transfer faster too. *smack self in forehead*

I'm out of my depth enough in Linux to not even think of these common-sense ideas.

Still - what is rsync? Keeping the attachments dir in sync automatically ahead of time would be good...
 
Oh - wait - never mind the tar idea. I have no command-line access to the hosted server. Only to the new one. :(
 
What's your access? FTP only? Bummer.

rsync is a utility to keep directory trees in sync between servers. It is optimized for fast transfers. Very cool. I use it to sync my main servers with disaster-recovery ones at another location. It runs from cron and works very well. You would need shell access, though...
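The basic invocation looks something like this (host and paths are hypothetical, and it assumes SSH access on both machines):

Code:
# Mirror the attachments tree; -a preserves permissions and timestamps,
# -z compresses over the wire, and the trailing slash on the source
# copies the directory's contents rather than the directory itself
rsync -avz user@oldhost:/var/www/forums/attachments/ /var/www/forums/attachments/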
 
OOH!! I bet you could use wget! Something like:

wget -r -l99 ftp://user:password@host/path


That's an L in the "l99", specifying the number of levels to recurse.

I _think_ that would work!
 
Aha! wget sounds perfect (Googled it) - downloading the RPM now. Thanks!
 
What the hell! LOL, I have an emoticon in my example!?!?! That's supposed to be a ":". User assword?

How can you prevent automatic parsing?
 
Greebo said:
I am looking for a Linux-based FTP utility that will let me say "Get this entire directory including all subdirectories and their contents and duplicate the structure exactly."

Anyone know of one?

Not offhand, but why can't you do an NFS mount from one server to the other and just copy the whole thing? That way the timestamps etc. will transfer, something that doesn't happen with FTP, AFAIK.
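Roughly along these lines (export and mount point invented for illustration, assuming the old server exports the directory):

Code:
# Mount the old server's export, then copy with cp -a,
# which preserves timestamps, ownership, and permissions
mount -t nfs oldhost:/var/www/forums /mnt/oldserver
cp -a /mnt/oldserver/attachments /var/www/forums/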
 
Doubtful NFS is on, or the ports open, on the "shared" server. I sure don't have mine exposed to the world.
 
Excellent - once I figured out how it worked, it's doing the job perfectly!
 
sshekels said:
OOH!! I bet you could use wget! Something like:

wget -r -l99 ftp://user:password@host/path

(use Code tags to turn off parsing)

I used:
Code:
wget ftp://nunya:business@host/path -r --passive-ftp -nH -q &

Seems to be doing just what I want. :)
 
wget is also a great way to grab HTTP files from a Unix box. I always hated downloading to my PC and then moving it over. Just put the URL in, and away you go!

BTW, you could set it up as a cron job to keep it all in sync.
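For instance, a crontab entry along these lines (schedule and paths made up):

Code:
# Re-mirror the attachment tree every night at 2 AM
0 2 * * * wget -r -l99 -nH -q --passive-ftp -P /var/www/forums ftp://user:password@host/path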
 
sshekels said:
Doubtful nfs is on or the ports open on the "shared" server. I sure don't have mine exposed to the world.

Ah, most of my limited experience is behind the firewall where things are a bit looser.
 
lancefisher said:
Ah, most of my limited experience is behind the firewall where things are a bit looser.
You're loose behind the firewall? :hairraise:

[mental note: take the backseat when flying with Lance] ;)
 
I don't really NEED to keep it in sync. When we switch over, there will be downtime anyway for restoring the most recent posts to the new server, so I may as well sync the attachments then. :)
 