I’m constantly finding myself needing to download a bunch of files from a site I’m browsing. Recently, I wanted to download a set of PDF lecture notes from my course’s online module page, and I hate doing the same thing repetitively when I know I could automate it somehow. So out comes a simple Python command-line script, written with argparse.
Give it a list of input URLs and an optional list of file names; it downloads each file from its URL and renames it to the corresponding name. There is also an option to specify the output destination.
Sample usage:
./download_files.py -i downloads.txt -n names.txt -o lecturenotes --extension pdf
Where the files would be something like this:
downloads.txt:
https://www.something.com/res/filea.pdf
https://www.something.com/res/fileb.pdf
https://www.something.com/res/filec.pdf
names.txt:
01 - Groups and Vector Spaces
02 - Linear Operators
03 - Applications (simple Q systems)
In this example, downloads.txt and names.txt contain one URL and one name per line, respectively. The downloaded files in this instance would all have a .pdf extension appended.
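For reference, here is a minimal sketch of the kind of script described above. The original isn’t shown here, so the structure and helper names (output_name, read_lines) are my own, and it uses the standard library’s urllib rather than any particular download library.

```python
#!/usr/bin/env python3
"""Download each URL in a list, optionally renaming the files."""
import argparse
import os
import urllib.request


def output_name(url, name=None, extension=None):
    # Use the given name, or fall back to the URL's last path segment;
    # optionally append an extension such as "pdf".
    if name is None:
        name = url.rsplit("/", 1)[-1]
    if extension:
        name = f"{name}.{extension}"
    return name


def read_lines(path):
    # Return the non-empty, stripped lines of a text file.
    with open(path) as f:
        return [line.strip() for line in f if line.strip()]


def main():
    parser = argparse.ArgumentParser(
        description="Download each URL in a list, optionally renaming the files.")
    parser.add_argument("-i", "--input", help="file with one URL per line")
    parser.add_argument("-n", "--names", help="optional file with one output name per line")
    parser.add_argument("-o", "--output", default=".", help="destination directory")
    parser.add_argument("--extension", help="extension to append to each output name")
    args = parser.parse_args()

    if not args.input:
        parser.print_usage()
        return

    urls = read_lines(args.input)
    # Without a names file, each file keeps the name it has in its URL.
    names = read_lines(args.names) if args.names else [None] * len(urls)

    os.makedirs(args.output, exist_ok=True)
    for url, name in zip(urls, names):
        dest = os.path.join(args.output, output_name(url, name, args.extension))
        print(f"Downloading {url} -> {dest}")
        urllib.request.urlretrieve(url, dest)


if __name__ == "__main__":
    main()
```

With the sample files above, "01 - Groups and Vector Spaces" would be paired with filea.pdf’s URL and saved as "01 - Groups and Vector Spaces.pdf" inside the lecturenotes directory.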