Let's say you have a nice Linux server with a fast connection, and you want to grab a whole bunch of files from an FTP server using multiple connections to use all of the available bandwidth. Graphical utilities are the work of the devil, so we'll use wget.
First, change to the directory you want to download all of the files into.
Here is a nice wget command that you can spawn multiple instances of to get the job done:
wget -r -nc -b ftp://example.com/somedirectory/full/of/recursive/files/ --ftp-user=user --ftp-password=password
The -r tells wget to recursively download so it grabs everything in that directory and beneath it.
The -nc tells wget to skip a file if it already exists locally, which keeps your instances from downloading the same file over and over.
The -b tells wget to work in the background and send output to a log file. This is necessary to run 20 instances at once.
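Each backgrounded instance writes to its own log in the current directory (wget-log, wget-log.1, wget-log.2, and so on), so you can peek at them to check progress:

```shell
# Each -b instance logs to wget-log, wget-log.1, wget-log.2, … in the
# current directory; show the last few lines of every log.
# (2>/dev/null and || true keep this quiet if no logs exist yet.)
tail -n 5 wget-log* 2>/dev/null || true
```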
The --ftp-user= and --ftp-password= options are only required if the FTP server doesn't allow anonymous access.
Just run the command as many times as you want for the desired number of simultaneous downloads.
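The spawning itself can be scripted. Here's a minimal sketch that echoes the command once per instance as a dry run; drop the echo to actually launch the downloads, and substitute your own URL and credentials:

```shell
# Dry-run sketch: print the wget command once per desired instance.
# Remove "echo" to actually spawn the background downloads.
N=5
for i in $(seq 1 "$N"); do
  echo wget -r -nc -b ftp://example.com/somedirectory/full/of/recursive/files/ \
    --ftp-user=user --ftp-password=password
done
```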
A quick killall wget will bring all of the action to a halt if the fun gets out of control.