The bandwidth of my home office internet connection has been severely degraded for the past week due to the undersea cable cuts in the Mediterranean (news1, news2, news3). Browsing the web and downloading documents at the same time has become a pain since then, so I had to fall back on some of my older tricks to keep things going.
Instead of downloading any file or binary larger than a few MB right away, I now append its URL to a get-them list. A few cron jobs, scheduled for human off-peak hours (and launchable manually when I am heading out), scan the get-them list and grab the files; a sketch of the script follows. The grabbing is done with wget, launched in parallel for each file in the list. Once the list is exhausted, the URLs are cleared. I do keep a backup of the URLs so I can manually verify the downloads.
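For the curious, here is a minimal sketch of what such a grabber script could look like. The file names (~/get-them.txt, ~/downloads, ~/get-them.done) and the script name are illustrative placeholders of my own, not necessarily what I actually use:

    #!/bin/bash
    # grab.sh -- fetch everything queued in the get-them list.
    # Paths below are illustrative placeholders, not the real ones.

    LIST="$HOME/get-them.txt"       # the get-them list, one URL per line
    DEST="$HOME/downloads"          # where fetched files land
    BACKUP="$HOME/get-them.done"    # URLs kept for manual verification

    [ -s "$LIST" ] || exit 0        # nothing queued, nothing to do
    mkdir -p "$DEST"

    # One wget per URL, all in the background; -c resumes partial files
    # if an earlier run was interrupted.
    while read -r url; do
        [ -n "$url" ] && wget -c -P "$DEST" "$url" &
    done < "$LIST"
    wait                            # block until every download finishes

    # Back up the URLs for (manual) verification, then clear the list.
    cat "$LIST" >> "$BACKUP"
    > "$LIST"

A crontab entry along these lines would run it at, say, 3 AM, well outside peak hours (cron hands the command to the shell, so $HOME expands as usual):

    # m  h  dom mon dow  command
    0    3  *   *   *    $HOME/bin/grab.sh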
This scheme is in place now and has been a great help in these crippled-bandwidth conditions.