
I want to download a file with Wget, but per the usual UNIX philosophy, I don't want it to output anything if the download succeeds. However, if the download fails, I want an error message.

The -q option suppresses all output, including error messages. If I use the -nv option instead, Wget still prints (on stderr):

2012-05-03 16:17:05 URL:http://example.net/ [2966] -> "index.html" [1]

How can I remove even that output, but still get error messages?

phihag

5 Answers


Try curl instead:

curl -fsS $url -o $file

Long version:

curl --fail --silent --show-error $url --output $file

GNOME users may try Gvfs:

gvfs-cp $url $file
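A quick way to see the curl behavior without hitting the network is a file:// URL; this is a sketch with made-up paths, not part of the original answer:

```shell
#!/bin/sh
# Sketch: -f fail on server errors, -s be silent, -S still show errors on stderr.
printf 'hello\n' > /tmp/curl_demo_src.txt

# Success case: downloads silently, prints nothing.
curl -fsS "file:///tmp/curl_demo_src.txt" -o /tmp/curl_demo_dst.txt

# Failure case: curl exits nonzero and writes its error to stderr only.
curl -fsS "file:///tmp/no_such_file_here" -o /dev/null || echo "download failed" >&2
```

With HTTP URLs the behavior is the same: silence on success, a one-line error on stderr on failure.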
u1686_grawity

Lame hack if you can't get a better answer:

wget {url} 2>/tmp/err.log || cat /tmp/err.log; rm /tmp/err.log

(The 2>/tmp/err.log redirects stderr to a temp file; if wget exits 0 [success], the || short-circuits and nothing is printed; otherwise cat shows the captured error log. The rm runs either way.)

Foon
  • +1 I missed that all output was going to stderr; I've deleted my answer of just redirecting stdout to /dev/null. – chepner May 03 '12 at 14:29
  • 5
    That works, but it's lame. error_log=$(wget -nv example.net 2>&1) || echo $error_log is a more elegant solution, but still clumsy. – phihag May 03 '12 at 14:29
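phihag's capture-and-print pattern generalizes to any command. A minimal sketch, with an assumed wrapper name and a stand-in failing command in place of wget:

```shell
#!/bin/sh
# Run a command; show its stderr only if it fails (wrapper name is my own).
run_quiet() {
    _out=$("$@" 2>&1) || printf '%s\n' "$_out" >&2
}

run_quiet true                              # success: prints nothing
run_quiet sh -c 'echo "boom" >&2; exit 1'   # failure: prints "boom" to stderr
```

Quoting "$_out" when printing avoids the word-splitting that the unquoted $error_log in the one-liner is subject to.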

Since all wget output currently goes to stderr, it seems that solving this 'the elegant way' would require patching the wget source.

The wget source distinguishes messages by verbosity level, rather than cleanly separating error messages from non-error messages.

There is an open bug about this (http://savannah.gnu.org/bugs/?33839), along with some older discussion: a suggested patch (http://www.mail-archive.com/wget%40sunsite.dk/msg03289.html) and a reply from Hrvoje Niksic about it (http://www.mail-archive.com/wget%40sunsite.dk/msg03330.html).

Other than that, there is of course the good solution you proposed in a comment to Foon's less elegant solution.

amotzg

The simplest syntax that achieves what you're looking for is

wget <url> 2>&1 | grep ERROR

Examples

$ wget https://google.com/ 2>&1 | grep ERROR  # this gives no output
$ wget https://google.com/randomstring 2>&1 | grep ERROR
2020-11-20 11:36:30 ERROR 404: Not Found.

Explanation

  • 2>&1 redirects stderr to stdout to capture wget's output
  • grep ERROR suppresses all output lines except where the string "ERROR" occurs
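One caveat worth noting (my addition, not part of the original answer): the pipeline's exit status is grep's, not wget's, so a script can't branch on download failure. In bash, pipefail restores the failing status; a sketch with a stand-in command in place of wget:

```shell
#!/bin/bash
set -o pipefail                  # pipeline now fails if any stage fails

# Stand-in for `wget ... | grep ERROR`: the left side exits 8, as wget
# does on a server error, while grep itself succeeds.
sh -c 'echo some output; exit 8' | grep -v ERROR
echo "pipeline status: $?"       # 8 under pipefail; would be 0 without it
```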

You could also pipe the output to grep and filter out the success message.

This should work:

wget ... -nv 2>&1 | grep -Pv '^\d{4}-\d\d-\d\d \d\d:\d\d:\d\d URL:.* \[\d+\] -> ".*" \[\d+\]$'

(The pattern must be single-quoted: the success line itself contains double quotes around the filename, so double-quoting the pattern would terminate the shell string early.)
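You can check the filter offline by feeding it the sample success line from the question; this test harness is my own sketch, and -P requires GNU grep:

```shell
#!/bin/sh
# Pipe the sample -nv success line through the filter; it should be suppressed.
printf '%s\n' '2012-05-03 16:17:05 URL:http://example.net/ [2966] -> "index.html" [1]' |
  grep -Pv '^\d{4}-\d\d-\d\d \d\d:\d\d:\d\d URL:.* \[\d+\] -> ".*" \[\d+\]$'
# (no output; grep exits 1 because every line was filtered out)
```

Error messages don't match the pattern, so they pass through untouched.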
Dennis