The examples are divided into three sections loosely based on their complexity.
7.1 Simple Usage          Simple, basic usage of the program.
7.2 Advanced Usage        Advanced tips.
7.3 Very Advanced Usage   The hairy stuff.
7.1 Simple Usage
wget http://fly.srk.fer.hr/
wget --tries=45 http://fly.srk.fer.hr/jpg/flyweb.jpg
wget -t 45 -o log http://fly.srk.fer.hr/jpg/flyweb.jpg &
The ampersand at the end of the line makes sure that Wget works in the background. To unlimit the number of retries, use `-t inf'.
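For instance, the two can be combined; the following is only a sketch, reusing the example URL from above, that retries indefinitely, logs to `log', and runs in the background:

# unlimited retries, output logged to `log', job placed in the background
wget -t inf -o log http://fly.srk.fer.hr/jpg/flyweb.jpg &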
wget ftp://gnjilux.srk.fer.hr/welcome.msg
wget ftp://ftp.gnu.org/pub/gnu/
links index.html
7.2 Advanced Usage
wget -i file
If you specify `-' as file name, the URLs will be read from standard input.
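As a sketch, a list of URLs kept in a file could be fed to Wget on standard input like this (the file name `urls.txt' is only an example):

# urls.txt is a hypothetical file containing one URL per line
cat urls.txt | wget -i -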
wget -r http://www.gnu.org/ -o gnulog
wget --convert-links -r http://www.gnu.org/ -o gnulog
wget -p --convert-links http://www.server.com/dir/page.html
The HTML page will be saved to `www.server.com/dir/page.html', and the images, stylesheets, etc., somewhere under `www.server.com/', depending on where they were on the remote server.
wget -p --convert-links -nH -nd -Pdownload \
     http://www.server.com/dir/page.html
wget -S http://www.lycos.com/
wget -s http://www.lycos.com/
more index.html
wget -r -l2 -P/tmp ftp://wuarchive.wustl.edu/
wget -r -l1 --no-parent -A.gif http://www.server.com/dir/
More verbose, but the effect is the same. `-r -l1' means to retrieve recursively (see section 3. Recursive Retrieval), with maximum depth of 1. `--no-parent' means that references to the parent directory are ignored (see section 4.3 Directory-Based Limits), and `-A.gif' means to download only the GIF files. `-A "*.gif"' would have worked too.
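Spelled out with the wildcard pattern mentioned above (the quotes keep the shell from expanding `*.gif' before Wget sees it):

wget -r -l1 --no-parent -A "*.gif" http://www.server.com/dir/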
wget -nc -r http://www.gnu.org/
wget ftp://hniksic:mypassword@unix.server.com/.emacs
Note, however, that this usage is not advisable on multi-user systems because it reveals your password to anyone who looks at the output of `ps'.
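One way to keep the password off the command line, assuming your Wget honors the standard `~/.netrc' file for FTP credentials, is a sketch like the following (the machine, login and password are the example values from above):

# in ~/.netrc -- keep this file private, e.g. chmod 600 ~/.netrc
machine unix.server.com login hniksic password mypassword

wget ftp://unix.server.com/.emacs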
wget -O - http://jagor.srce.hr/ http://www.srce.hr/
You can also combine the two options and make pipelines to retrieve the documents from remote hotlists:
wget -O - http://cool.list.com/ | wget --force-html -i -
7.3 Very Advanced Usage
crontab
0 0 * * 0 wget --mirror http://www.gnu.org/ -o /home/me/weeklog
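If you prefer keeping the entry in a file, a sketch of installing it would be (the file name `mirror.cron' is only an example):

# mirror.cron -- hypothetical file holding the weekly entry
0 0 * * 0 wget --mirror http://www.gnu.org/ -o /home/me/weeklog

# installs mirror.cron as the crontab (replacing any existing entries)
crontab mirror.cron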
wget --mirror --convert-links --backup-converted \
     http://www.gnu.org/ -o /home/me/weeklog
wget --mirror --convert-links --backup-converted \
     --html-extension -o /home/me/weeklog \
     http://www.gnu.org/
Or, with less typing:
wget -m -k -K -E http://www.gnu.org/ -o /home/me/weeklog