Questions tagged [wget]

GNU Wget is a free software package for retrieving files using HTTP, HTTPS and FTP, the most widely-used Internet protocols. It is a non-interactive command line tool, so it may easily be called from scripts, Cron jobs, terminals without X Window System (X11) support, etc.

Tag usage

Use this tag for questions related to usage of the command line tool. This includes questions about parameters, return codes, error messages, etc.

If your question only concerns the specific protocol being used (HTTP, HTTPS, FTP), please use that tag instead. In most cases, it will be appropriate to use this tag along with a specific protocol tag. It is always a good idea to do some triage with different software programs to determine if your question is related to only GNU Wget, or if your question is relevant to different software programs as well.

901 questions
50 votes, 1 answer

How to use wget to download HTTP error pages?

wget normally stops when it gets an HTTP error, e.g. 404. Is there an option to make wget download the page content regardless of the HTTP status code?
lilydjwg • 916
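For the question above, a minimal sketch of one approach: newer releases of GNU Wget provide the --content-on-error option, which saves the body of the response even when the server returns an HTTP error (the URL below is a placeholder):

    # Save the error page body instead of discarding it on 404/500 etc.
    wget --content-on-error http://example.com/missing-page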
45 votes, 1 answer

Make wget convert HTML links to relative after download if -k wasn't specified

The -k option (or --convert-links) will convert links in your web pages to relative after the download finishes, as the man page says: After the download is complete, convert the links in the document to make them suitable for local…
Nathaniel • 4,356
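If re-fetching the mirror is acceptable, one hedged sketch is simply to repeat the download with link conversion turned on; rewriting an already-downloaded tree without re-running wget generally needs a separate post-processing step (the URL is a placeholder):

    # -r recurse, -p fetch page requisites, -k rewrite links for local viewing
    wget -r -p -k http://example.com/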
34 votes, 3 answers

Download ALL Folders, SubFolders, and Files using Wget

I have been using Wget, and I have run across an issue. I have a site that has several folders and subfolders. I need to download all of the contents within each folder and subfolder. I have tried several methods using Wget, and when…
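A common recursive-mirroring sketch for this kind of task (the URL and the --cut-dirs depth are placeholders to adapt):

    # -r recurse, -np do not ascend to the parent directory,
    # -nH drop the hostname directory, --cut-dirs strips leading path components,
    # -R skips the auto-generated directory listings
    wget -r -np -nH --cut-dirs=1 -R "index.html*" http://example.com/files/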
25 votes, 3 answers

Is there a shorter version of the wget --no-check-certificate option?

When I try to use wget on an HTTPS site, I need to add: wget --no-check-certificate https://... This is rather long, so does a shortcut exist?
juanpablo • 7,144
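There is no short single-letter form, but the same behaviour can be made the default through the corresponding wgetrc setting, for example (a sketch; place it in ~/.wgetrc):

    # ~/.wgetrc
    check_certificate = off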
24 votes, 4 answers

wget -o writes empty files on failure

If I write wget "no such address" -o "test.html", it first creates test.html and, in case of failure, leaves it empty. However, when not using -o, it waits to see if the download succeeds and only after that writes the file. I'd like…
akurtser • 765
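Note that lower-case -o names the log file; the output-document flag is capital -O, and it does create the file immediately. A hedged shell sketch that cleans up after a failed download (URL and file name are placeholders):

    # Remove the (possibly empty) output file if wget exits with an error
    wget -O test.html "http://no.such.address/" || rm -f test.html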
16 votes, 3 answers

Is it possible to do a wget dry-run?

I know you can download webpages recursively using wget, but is it possible to do a dry-run? So that you could sort of do a test-run to see how much would be downloaded if you actually did it? Thinking about pages that have a lot of links to media…
Svish • 39,580
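One sketch is spider mode, which walks the links without keeping the downloaded bodies; the resulting log can then be inspected to estimate what a real run would fetch (URL and log name are placeholders):

    # --spider checks pages instead of saving them; -o writes the log to a file
    wget --spider -r -l 2 -o spider.log http://example.com/
    # inspect spider.log afterwards for the URLs that would be fetched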
12 votes, 5 answers

Wget is silent, but it displays error messages

I want to download a file with Wget, but per the usual UNIX philosophy, I don't want it to output anything if the download succeeds. However, if the download fails, I want an error message. The -q option suppresses all output, including error…
phihag • 2,777
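A hedged shell sketch: keep wget quiet and let its exit status drive the error reporting, so nothing is printed unless the download actually fails (URL is a placeholder):

    # -q suppresses normal output; report only when wget exits non-zero
    wget -q http://example.com/file || echo "wget failed (exit code $?)" >&2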
9 votes, 2 answers

Exclude list of specific files in wget

I am trying to download a lot of pages from a website on dial-up, and it can be brutally slow. I have almost got the perfect wget command, but because I'm downloading pages from the same site, wget wastes time downloading the same standard images for…
nanker • 203
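A sketch using wget's reject list, which skips matching files during a recursive crawl (the file names and URL are placeholders):

    # -R / --reject takes a comma-separated list of names, suffixes or patterns
    wget -r -np -R "logo.png,banner*.gif" http://example.com/docs/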
9 votes, 2 answers

Downloading multiple files, and specifying output filenames with wget

I'm using wget with the -i option to download a list of files from a URL. However, I want to specify the names that these files will be saved with as well. I see you can do that with a single file using -O, and can specify a directory with -P; is…
leecbaker • 443
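The -i list format has no column for a target name, so one workaround sketch is a small shell loop over a two-column file (the file name and its layout are assumptions):

    # urls.txt: each line is "<url> <output-name>"
    while read -r url name; do
        wget -O "$name" "$url"
    done < urls.txt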
9 votes, 1 answer

How to resume a wget process?

I accidentally closed the terminal while running a wget process to download a website. It has been 2 days since I started the wget process, so I don't know the process' status. Yesterday the traffic was high and today the traffic is low. It looks…
keling menua
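A hedged sketch for picking the crawl back up: re-run the same recursive command with resumption enabled, so partially downloaded files are continued rather than restarted (URL is a placeholder):

    # -c resumes partial files; -r -np repeats the recursive crawl without ascending
    wget -c -r -np http://example.com/site/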
7 votes, 4 answers

wget - I can't download files with "?"

I wanted to download some aircrack-ng tutorials with wget and failed: wget -r 2 http://www.aircrack-ng.org/doku.php\?id=tutorial\&DokuWiki=78e8249415a9ce232228ed8f9f02b9dd --2011-10-06 14:16:11-- http://2/ Resolving 2... 0.0.0.2 Connecting to…
oneat • 3,321
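The excerpt points to two separate problems: -r takes no argument, so the bare 2 is parsed as a URL (hence "Resolving 2..."), and query strings are easier to pass by quoting the whole URL than by escaping characters individually. A sketch:

    # -l 2 limits recursion depth; quoting keeps ? and & away from the shell
    wget -r -l 2 "http://www.aircrack-ng.org/doku.php?id=tutorial&DokuWiki=78e8249415a9ce232228ed8f9f02b9dd"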
7 votes, 1 answer

How can I make wget show file upload progress?

I'm using wget to upload files using the POST method. Sometimes the file is quite big. Is there a way to show the progress, like it does with download?
Wang Weijun
6 votes, 4 answers

Problems with Wget to a CloudFlare hosted site: 503 Service Unavailable

I have seen other instances of 503 errors using Wget, but to no avail; I cannot solve this. When I try to download a certain website, I get a 503 Service Unavailable error. This does not happen with any website except the one in question. This…
Zac Webb • 161
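A common, hedged workaround sketch: some CloudFlare-fronted sites reject wget's default User-Agent, so presenting a browser-like one may help (the UA string and URL are placeholders, not a guaranteed fix):

    # -U / --user-agent replaces the default "Wget/<version>" identification
    wget -U "Mozilla/5.0 (X11; Linux x86_64)" http://example.com/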
6 votes, 1 answer

Why does `wget` download index.html instead of a direct file?

I'm just trying to download this, but it always redirects to the main page and in the end just downloads the index.html file, not the file I'm trying to download: http://tweaking.com/files/setups/tweaking.com_windows_repair_aio.zip Do you guys know…
Jaheaga • 61
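When a direct file link bounces back to the front page, the server is often keying on missing request headers; a hedged sketch that supplies a Referer and a browser-like User-Agent (the header values are assumptions):

    # --referer and --user-agent set the corresponding request headers
    wget --referer="http://tweaking.com/" \
         --user-agent="Mozilla/5.0 (X11; Linux x86_64)" \
         "http://tweaking.com/files/setups/tweaking.com_windows_repair_aio.zip"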
5 votes, 1 answer

What units does wget use for bandwidth?

When downloading, wget reports speed in "K/s". K...what? kilobits? kilobytes? 1024 or 1000? Update: wget -O /dev/null http://newark1.linode.com/100MB-newark.bin produces "348 K/s". Meanwhile: nethogs says "343 KB/sec" for the entire Wi-Fi…
endolith • 7,637
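For reference (hedged): wget's rate display is generally understood to use binary kilobytes (KiB, 1024 bytes), so 348 K/s = 348 × 1024 = 356,352 bytes per second, roughly 356 kB/s in decimal units.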