Bash script: mass download from links in a text file

Mutikasa
This is a simple one.
First, put all your links into one text file: one link per line, no spaces in the links.
I'm using curl together with wget here. It's probably possible to use only curl or only wget, but I didn't try it.
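Since the script silently skips nothing and downloads whatever it reads, it can help to sanity-check the links file first. A minimal sketch (the file path and example.com URLs are made up for illustration):

```shell
## hypothetical helper: flag blank lines or lines containing whitespace,
## since the loop below expects exactly one clean link per line
links=/tmp/links.txt
printf 'http://example.com/a.rar\nhttp://example.com/b.rar\n' > "$links"

if grep -qE '[[:space:]]|^$' "$links"; then
    echo "links file has bad lines"
else
    echo "links file looks OK"
fi
```

If it prints a warning, fix the file before running the download loop.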

Here's the code:
Code:
## loop: read the file one link per line

while read -r line ; do

## use curl to log in and save a fresh cookie; this is for filesonic
## --cookie-jar is where you want to save the cookie

curl --cookie-jar /your/path/cookies.txt \
--data "email=your@email.com" \
--data "password=yourpassword" \
http://www.filesonic.com/user/login

## now download the current link
## --continue is to resume downloads
## --output-document=/your/path/file.rar is optional, but note that a fixed
## name would overwrite the file on every pass, so it is left out here

wget --continue --load-cookies=/your/path/cookies.txt "$line"

done < /your/path/links.txt
I'm getting the cookie before every download because if you get the cookie only at the start of the script, the session might expire and you would download nothing after that.
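On the "only wget" idea: wget can send the login form itself with `--post-data` and store the session with `--save-cookies`/`--keep-session-cookies`, so curl isn't strictly needed. An untested sketch, wrapped in a function; the filesonic URL and the `email`/`password` form fields are copied from the curl version and assumed unchanged:

```shell
## untested sketch: wget-only variant of the loop, as a function
## taking the path to the links file as its argument
download_all() {
    links_file=$1
    while read -r line ; do
        ## wget performs the login POST and saves the session cookie;
        ## the login page itself is thrown away via --output-document
        wget --save-cookies /tmp/cookies.txt --keep-session-cookies \
             --post-data "email=your@email.com&password=yourpassword" \
             --output-document=/dev/null \
             http://www.filesonic.com/user/login
        ## then download the current link with the saved cookie
        wget --continue --load-cookies=/tmp/cookies.txt "$line"
    done < "$links_file"
}
```

Same structure as above: one login plus one download per link, so the session stays fresh.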