
Is there a method on Windows or Linux for downloading all files* within an online directory, specifically:

http://download.opensuse.org/tumbleweed/

so I have a local repository I can work on? Any browser or command-line solution would be fine, and I'm willing to install a new browser if needed.

*recursively

1 Answer


wget has a feature designed for exactly this need. The command you want is:

wget --no-parent -r http://download.opensuse.org/tumbleweed/

-r means recursive.
--no-parent skips the "Parent directory" link; without it, the siblings of tumbleweed would also be downloaded.
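The command above works as-is. If you want a tidier local mirror, wget has a few additional options worth knowing; the sketch below assembles a fuller command and prints it for review before running (the extra flags beyond -r and --no-parent are my suggested additions, not part of the original answer):

```shell
URL="http://download.opensuse.org/tumbleweed/"

# -r --no-parent        : from the answer above
# -nH                   : don't create a download.opensuse.org/ prefix locally
# --reject "index.html*": skip the server's auto-generated listing pages
# -l 5                  : cap recursion depth (5 is also wget's default)
# -c                    : resume partially downloaded files on a re-run
CMD="wget -r --no-parent -nH --reject \"index.html*\" -l 5 -c $URL"

# Print the command for review; remove the echo to actually run it.
echo "$CMD"
```

Note that a full Tumbleweed mirror is very large, so -c (resume) is particularly useful if the transfer is interrupted.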

  • Weirdly, I keep getting a '404' error for some of the files but I don't understand how that's possible. Surely wget should only be attempting to get the files that are actually in the directory? It seems to be reattempting, but something doesn't feel quite right... – Peter David Carter Apr 15 '16 at 20:47