At work I needed to quickly compare our download servers over HTTP versus HTTPS, which meant downloading a large number of files. I was given a file list to work with, like so:
/download/integrations/file1.exe
/download/us/update/patch/file1.exe
/download/us/cab/file1.exe
So I decided to use the bash shell to read each line and run wget against it, quick and dirty. Here's the one-liner:
while read -r line; do wget "URL/${line}"; done < c:\\test.txt
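For a dry run of the same idea, the sketch below substitutes printf for wget so you can see which URLs the loop would fetch before hammering a server. BASE is a placeholder for your real server address, and the here-document stands in for the file list; swap the printf line for wget "${BASE}${line}" when you're ready:

```shell
# Dry run of the download loop: print each URL instead of fetching it.
# BASE is a placeholder; replace printf with: wget "${BASE}${line}"
BASE="http://example.com"
while IFS= read -r line; do
  printf '%s\n' "${BASE}${line}"
done <<'EOF'
/download/integrations/file1.exe
/download/us/update/patch/file1.exe
/download/us/cab/file1.exe
EOF
```

The IFS= and read -r guard against bash trimming whitespace or interpreting backslashes in the paths, which matters if the list ever contains Windows-style separators.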
While it ran, I just used the Windows clock to get an approximate time difference between the HTTP and HTTPS runs.
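If you want something more precise than watching the clock, bash's built-in time keyword reports wall-clock ("real"), user, and system time for the whole loop. A sketch, with sleep standing in for the wget calls so it's safe to run anywhere:

```shell
# Time the whole loop; 'real' in the output is the wall-clock duration.
# sleep is a stand-in for: wget "URL/${line}"
time while read -r line; do
  sleep 0.1   # placeholder for the actual download
done <<'EOF'
/download/integrations/file1.exe
/download/us/cab/file1.exe
EOF
```

Note that time's report goes to stderr, so it won't mix into any output you redirect to a log file.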
That’s it! Hope this helps.