curl -IL "URL"
This command sends a HEAD request (-I), follows all redirects (-L), and displays the response headers at the end. Most of the time, it's ideal:
curl -IL "http://www.google.com"
HTTP/1.1 200 OK
Date: Fri, 11 Jun 2010 03:58:55 GMT
Expires: -1
Cache-Control: private, max-age=0
Content-Type: text/html; charset=ISO-8859-1
Server: gws
X-XSS-Protection: 1; mode=block
Transfer-Encoding: chunked
However, the server I was curling didn't support HEAD requests. Additionally, I was only interested in the HTTP status codes, not the rest of the output. This meant I had to change my strategy and issue GET requests, discarding the HTML output completely.
Curl manual to the rescue. A few minutes later, I came up with the following, which served my needs perfectly:
curl -sL -w "%{http_code} %{url_effective}\n" "URL" -o /dev/null
Here is a sample of what comes out:
curl -sL -w "%{http_code} %{url_effective}\n" "http://www.amazon.com/Kindle-Wireless-Reading-Display-Generation/dp/B0015T963C?tag=androidpolice-20" -o /dev/null
200 http://www.amazon.com/Kindle-Wireless-Reading-Display-Generation/dp/B0015T963C
Here, -s silences curl's progress output, -L follows all redirects as before, -w prints a report using the custom format, and -o discards the downloaded HTML by writing it to /dev/null.
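If you need to check a whole batch of URLs, the same command drops neatly into a shell loop. Here's a minimal sketch, assuming the URLs sit in a file called urls.txt, one per line (the filename is just an example):

#!/bin/bash
# Print the final status code and effective URL for every URL in urls.txt
while read -r url; do
  curl -sL -w "%{http_code} %{url_effective}\n" "$url" -o /dev/null
done < urls.txt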
Here are the other special variables available in case you want to customize the output some more:
- url_effective
- http_code
- http_connect
- time_total
- time_namelookup
- time_connect
- time_pretransfer
- time_redirect
- time_starttransfer
- size_download
- size_upload
- size_header
- size_request
- speed_download
- speed_upload
- content_type
- num_connects
- num_redirects
- ftp_entry_path
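For example, to get a rough breakdown of where a request spends its time, you can combine several of the timing variables in one format string (the URL below is just a placeholder):

curl -sL -w "dns: %{time_namelookup}s  connect: %{time_connect}s  total: %{time_total}s\n" "http://www.example.com" -o /dev/null

Each value is reported in seconds, so the output reads like a miniature timing report.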
Is there a better way to do this with curl? Perhaps, but this way offers the most flexibility, as I am in control of all the formatting.

Artem Russakovskii