* docs/wget2.md: Wrap long options with backtick

Copyright-paperwork-exempt: Yes
Carles Pina i Estany
2023-08-04 00:33:32 +02:00
committed by Tim Rühsen
parent f4bf9a2037
commit f827dbaaae


@@ -162,7 +162,7 @@ Go to background immediately after startup. If no output file is specified via t
 If this function is used, no URLs need be present on the command line. If there are URLs both on the command line and in an
 input file, those on the command lines will be the first ones to be retrieved. `file` is expected to contain one URL per line,
-except one of the --force- options specifies a different format.
+except one of the `--force-` options specifies a different format.
 If you specify `--force-html`, the document will be regarded as HTML. In that case you may have problems with relative
 links, which you can solve either by adding `<base href="url">` to the documents or by specifying `--base=url` on the command
@@ -183,37 +183,37 @@ Go to background immediately after startup. If no output file is specified via t
 ### `-F`, `--force-html`
 When input is read from a file, force it to be treated as an HTML file. This enables you to retrieve relative links from
-existing HTML files on your local disk, by adding "<base href="url">" to HTML, or using the --base command-line option.
+existing HTML files on your local disk, by adding "<base href="url">" to HTML, or using the `--base` command-line option.
 ### `--force-css`
 Read and parse the input file as CSS. This enables you to retrieve links from existing CSS files on your local disk.
-You will need --base to handle relative links correctly.
+You will need `--base` to handle relative links correctly.
 ### `--force-sitemap`
 Read and parse the input file as sitemap XML. This enables you to retrieve links from existing sitemap files on your local disk.
-You will need --base to handle relative links correctly.
+You will need `--base` to handle relative links correctly.
 ### `--force-atom`
 Read and parse the input file as Atom Feed XML. This enables you to retrieve links from existing sitemap files on your local disk.
-You will need --base to handle relative links correctly.
+You will need `--base` to handle relative links correctly.
 ### `--force-rss`
 Read and parse the input file as RSS Feed XML. This enables you to retrieve links from existing sitemap files on your local disk.
-You will need --base to handle relative links correctly.
+You will need `--base` to handle relative links correctly.
 ### `--force-metalink`
 Read and parse the input file as Metalink. This enables you to retrieve links from existing Metalink files on your local disk.
-You will need --base to handle relative links correctly.
+You will need `--base` to handle relative links correctly.
 ### `-B`, `--base=URL`
 Resolves relative links using URL as the point of reference, when reading links from an HTML file specified via the
--i/--input-file option (together with a --force... option, or when the input file was fetched remotely from a server describing it as
+`-i`/`--input-file` option (together with a `--force`... option, or when the input file was fetched remotely from a server describing it as
 HTML, CSS, Atom or RSS). This is equivalent to the presence of a "BASE" tag in the HTML input file, with URL as the value for the "href"
 attribute.
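The resolution that `--base` performs follows standard relative-URL semantics. A minimal sketch with Python's `urllib.parse` (the base URL and links here are made-up examples, not wget2's code):

```python
from urllib.parse import urljoin

# Resolve links read from an input file against the --base URL,
# the way the option describes (illustrative; URLs are made up).
base = "https://example.com/docs/index.html"

print(urljoin(base, "images/logo.png"))  # https://example.com/docs/images/logo.png
print(urljoin(base, "/about.html"))      # https://example.com/about.html
```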
@@ -228,7 +228,7 @@ Go to background immediately after startup. If no output file is specified via t
 config file(s). If set, `$WGET2RC` replaces `~/.wget2rc`.
 `--no-config` empties the internal list of config files.
-So if you want to prevent reading any config files, give --no-config on the command line.
+So if you want to prevent reading any config files, give `--no-config` on the command line.
 `--no-config` followed by `--config=file` just reads `file` and skips reading the default config files.
@@ -500,7 +500,7 @@ Go to background immediately after startup. If no output file is specified via t
 `--start-pos` has higher precedence over `--continue`. When `--start-pos` and `--continue` are both specified, Wget2 will
 emit a warning then proceed as if `--continue` was absent.
-Server support for continued download is required, otherwise --start-pos cannot help. See -c for details.
+Server support for continued download is required, otherwise --start-pos cannot help. See `-c` for details.
 ### `--progress=type`
@@ -520,7 +520,7 @@ Go to background immediately after startup. If no output file is specified via t
 Force Wget2 to display the progress bar in any verbosity.
 By default, Wget2 only displays the progress bar in verbose mode. One may however, want Wget2 to display the
-progress bar on screen in conjunction with any other verbosity modes like --no-verbose or --quiet. This is often
+progress bar on screen in conjunction with any other verbosity modes like `--no-verbose` or `--quiet`. This is often
 a desired a property when invoking Wget2 to download several small/large files. In such a case, Wget2 could simply
 be invoked with this parameter to get a much cleaner output on the screen.
@@ -540,8 +540,8 @@ Go to background immediately after startup. If no output file is specified via t
 Don't set the local file's timestamp by the one on the server.
 By default, when a file is downloaded, its timestamps are set to match those from the remote file. This allows
-the use of --timestamping on subsequent invocations of Wget2. However, it is sometimes useful to base the local
-file's timestamp on when it was actually downloaded; for that purpose, the --no-use-server-timestamps option has
+the use of `--timestamping` on subsequent invocations of Wget2. However, it is sometimes useful to base the local
+file's timestamp on when it was actually downloaded; for that purpose, the `--no-use-server-timestamps` option has
 been provided.
 ### `-S`, `--server-response`
@@ -593,11 +593,11 @@ Go to background immediately after startup. If no output file is specified via t
 ### `--limit-rate=amount`
 Limit the download speed to amount bytes per second. Amount may be expressed in bytes, kilobytes with the k
-suffix, or megabytes with the m suffix. For example, --limit-rate=20k will limit the retrieval rate to 20KB/s.
+suffix, or megabytes with the m suffix. For example, `--limit-rate=20k` will limit the retrieval rate to 20KB/s.
 This is useful when, for whatever reason, you don't want Wget2 to consume the entire available bandwidth.
 This option allows the use of decimal numbers, usually in conjunction with power suffixes; for example,
---limit-rate=2.5k is a legal value.
+`--limit-rate=2.5k` is a legal value.
 Note that Wget2 implements the limiting by sleeping the appropriate amount of time after a network read that took
 less time than specified by the rate. Eventually this strategy causes the TCP transfer to slow down to
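The sleep-after-read strategy described above can be sketched as follows; `throttle_delay` is a hypothetical helper illustrating the arithmetic, not wget2's implementation:

```python
def throttle_delay(nbytes, elapsed, rate_limit):
    """Seconds to sleep after reading nbytes in elapsed seconds, so the
    effective rate drops to rate_limit bytes/second (sketch of the
    strategy the text describes, not wget2's actual code)."""
    expected = nbytes / rate_limit       # how long the read "should" have taken
    return max(0.0, expected - elapsed)

# --limit-rate=20k: a 20 KiB read that finished in 0.1 s is followed
# by a ~0.9 s sleep, averaging out to roughly 20 KiB/s.
print(throttle_delay(20480, 0.1, 20480))  # 0.9
```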
@@ -612,7 +612,7 @@ Go to background immediately after startup. If no output file is specified via t
 Specifying a large value for this option is useful if the network or the destination host is down, so that Wget2
 can wait long enough to reasonably expect the network error to be fixed before the retry. The waiting interval
-specified by this function is influenced by "--random-wait", which see.
+specified by this function is influenced by `--random-wait`, which see.
 ### `--waitretry=seconds`
@@ -626,7 +626,7 @@ Go to background immediately after startup. If no output file is specified via t
 Some web sites may perform log analysis to identify retrieval programs such as Wget2 by looking for statistically
 significant similarities in the time between requests. This option causes the time between requests to vary
-between 0.5 and 1.5 * wait seconds, where wait was specified using the --wait option, in order to mask Wget2's
+between 0.5 and 1.5 * wait seconds, where wait was specified using the `--wait` option, in order to mask Wget2's
 presence from such analysis.
 A 2001 article in a publication devoted to development on a popular consumer platform provided code to perform
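The randomized interval can be sketched as follows; the uniform distribution is an assumption, since the text only specifies the 0.5–1.5 range:

```python
import random

def random_wait(wait):
    """Delay used between requests under --random-wait: somewhere between
    0.5*wait and 1.5*wait seconds (uniform choice is an assumption; the
    text only gives the range)."""
    return random.uniform(0.5 * wait, 1.5 * wait)

# With --wait=10 the actual pauses fall between 5 and 15 seconds.
delays = [random_wait(10) for _ in range(5)]
print(all(5.0 <= d <= 15.0 for d in delays))  # True
```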
@@ -700,16 +700,16 @@ Go to background immediately after startup. If no output file is specified via t
 ### `-4`, `--inet4-only`, `-6`, `--inet6-only`
-Force connecting to IPv4 or IPv6 addresses. With --inet4-only or -4, Wget2 will only connect to IPv4 hosts,
+Force connecting to IPv4 or IPv6 addresses. With `--inet4-only` or `-4`, Wget2 will only connect to IPv4 hosts,
 ignoring AAAA records in DNS, and refusing to connect to IPv6 addresses specified in URLs. Conversely, with
---inet6-only or -6, Wget2 will only connect to IPv6 hosts and ignore A records and IPv4 addresses.
+`--inet6-only` or `-6`, Wget2 will only connect to IPv6 hosts and ignore A records and IPv4 addresses.
 Neither options should be needed normally. By default, an IPv6-aware Wget2 will use the address family specified
 by the host's DNS record. If the DNS responds with both IPv4 and IPv6 addresses, Wget2 will try them in sequence
-until it finds one it can connect to. (Also see "--prefer-family" option described below.)
+until it finds one it can connect to. (Also see `--prefer-family` option described below.)
 These options can be used to deliberately force the use of IPv4 or IPv6 address families on dual family systems,
-usually to aid debugging or to deal with broken network configuration. Only one of --inet6-only and --inet4-only
+usually to aid debugging or to deal with broken network configuration. Only one of `--inet6-only` and `--inet4-only`
 may be specified at the same time. Neither option is available in Wget2 compiled without IPv6 support.
 ### `--prefer-family=none/IPv4/IPv6`
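The per-family restriction can be illustrated at the DNS-lookup level with a hypothetical helper built on `getaddrinfo`; wget2 does this inside its own resolver, so this is only a sketch of the effect:

```python
import socket

def resolve(host, ipv4_only=False, ipv6_only=False):
    """Return addresses for host, restricted to one address family the
    way -4/--inet4-only and -6/--inet6-only restrict connections
    (illustrative sketch, not wget2's resolver)."""
    family = socket.AF_UNSPEC
    if ipv4_only:
        family = socket.AF_INET     # A records only
    elif ipv6_only:
        family = socket.AF_INET6    # AAAA records only
    infos = socket.getaddrinfo(host, 80, family, socket.SOCK_STREAM)
    return [info[4][0] for info in infos]

print(resolve("localhost", ipv4_only=True))  # e.g. ['127.0.0.1']
```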
@@ -780,7 +780,7 @@ Go to background immediately after startup. If no output file is specified via t
 ### `--use-askpass=command`
-Prompt for a user and password using the specified command. Overrides the user and/or password set by --user/--password (if any).
+Prompt for a user and password using the specified command. Overrides the user and/or password set by `--user`/`--password` (if any).
 ### `--no-iri`
@@ -830,10 +830,10 @@ Go to background immediately after startup. If no output file is specified via t
 Be aware that this may have unintended side effects, for example "image.php?name=sun" will be changed
 to "image.php". The cutting happens when saving the file, after downloading.
-File names obtained from a "Content-Disposition" header are not affected by this setting (see --content-disposition),
+File names obtained from a "Content-Disposition" header are not affected by this setting (see `--content-disposition`),
 and can be a solution for this problem.
-When "--trust-server-names" is used, the redirection URL is affected by this setting.
+When `--trust-server-names` is used, the redirection URL is affected by this setting.
 ### `--chunk-size=size`
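Deriving the local name with the GET variables cut off, as in the "image.php?name=sun" example above, can be sketched with a hypothetical helper:

```python
from urllib.parse import urlsplit

def local_name_cut_query(url):
    """Local file name once the query part is cut: keep only the last
    path segment and drop everything from '?' on (sketch of the
    behaviour described in the text, not wget2's code)."""
    path = urlsplit(url).path
    return path.rsplit("/", 1)[-1]

print(local_name_cut_query("https://example.com/image.php?name=sun"))  # image.php
```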
@@ -1078,7 +1078,7 @@ Go to background immediately after startup. If no output file is specified via t
 so proves your identity.
 Mirroring such a site requires Wget2 to send the same cookies your browser sends when communicating with the site.
-This is achieved by --load-cookies: simply point Wget2 to the location of the cookies.txt file, and it will send
+This is achieved by `--load-cookies`: simply point Wget2 to the location of the cookies.txt file, and it will send
 the same cookies your browser would send in the same situation. Different browsers keep textual cookie files in
 different locations:
@@ -1095,7 +1095,7 @@ Go to background immediately after startup. If no output file is specified via t
 has been tested with Internet Explorer 5; it is not guaranteed to work with earlier versions.
 "Other browsers."
-If you are using a different browser to create your cookies, --load-cookies will only work if you can locate
+If you are using a different browser to create your cookies, `--load-cookies` will only work if you can locate
 or produce a cookie file in the Netscape format that Wget2 expects.
 If you cannot use `--load-cookies`, there might still be an alternative. If your browser supports a "cookie
@@ -1108,19 +1108,19 @@ Go to background immediately after startup. If no output file is specified via t
 ### `--save-cookies file`
 Save cookies to `file` before exiting. This will not save cookies that have expired or that have no expiry time
-(so-called "session cookies"), but also see --keep-session-cookies.
+(so-called "session cookies"), but also see `--keep-session-cookies`.
 ### `--keep-session-cookies`
-When specified, causes --save-cookies to also save session cookies. Session cookies are normally not saved
+When specified, causes `--save-cookies` to also save session cookies. Session cookies are normally not saved
 because they are meant to be kept in memory and forgotten when you exit the browser. Saving them is useful on
 sites that require you to log in or to visit the home page before you can access some pages. With this option,
 multiple Wget2 runs are considered a single browser session as far as the site is concerned.
 Since the cookie file format does not normally carry session cookies, Wget2 marks them with an expiry timestamp of
-0. Wget2's --load-cookies recognizes those as session cookies, but it might confuse other browsers. Also note
-that cookies so loaded will be treated as other session cookies, which means that if you want --save-cookies to
-preserve them again, you must use --keep-session-cookies again.
+0. Wget2's `--load-cookies` recognizes those as session cookies, but it might confuse other browsers. Also note
+that cookies so loaded will be treated as other session cookies, which means that if you want `--save-cookies` to
+preserve them again, you must use `--keep-session-cookies` again.
 ### `--cookie-suffixes=file`
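The expiry-0 convention can be illustrated against the Netscape cookies.txt layout, whose fields are TAB-separated: domain, subdomain flag, path, secure flag, expiry, name, value. A sketch with made-up entries (illustrative parser, not wget2's code):

```python
def is_session_cookie(line):
    """True for a session cookie in a Netscape-format cookies.txt line:
    wget2 marks session cookies with an expiry timestamp of 0.
    Fields: domain, subdomain flag, path, secure flag, expiry, name,
    value (TAB-separated; the entries below are made up)."""
    fields = line.rstrip("\n").split("\t")
    return int(fields[4]) == 0

print(is_session_cookie("example.com\tFALSE\t/\tFALSE\t0\tSID\tabc123"))         # True
print(is_session_cookie("example.com\tFALSE\t/\tTRUE\t1893456000\tTOKEN\txyz"))  # False
```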
@@ -1146,7 +1146,7 @@ Go to background immediately after startup. If no output file is specified via t
 Send header-line along with the rest of the headers in each HTTP request. The supplied header is sent as-is,
 which means it must contain name and value separated by colon, and must not contain newlines.
-You may define more than one additional header by specifying --header more than once.
+You may define more than one additional header by specifying `--header` more than once.
 wget2 --header='Accept-Charset: iso-8859-2' \
 --header='Accept-Language: hr' \
@@ -1197,19 +1197,19 @@ Go to background immediately after startup. If no output file is specified via t
 Explorer. This option allows you to change the "User-Agent" line issued by Wget. Use of this option is
 discouraged, unless you really know what you are doing.
-Specifying empty user agent with --user-agent="" instructs Wget2 not to send the "User-Agent" header in HTTP
+Specifying empty user agent with `--user-agent=""` instructs Wget2 not to send the "User-Agent" header in HTTP
 requests.
 ### `--post-data=string`, `--post-file=file`
 Use POST as the method for all HTTP requests and send the specified data in the request body. --post-data sends
-string as data, whereas --post-file sends the contents of file. Other than that, they work in exactly the same
+string as data, whereas `--post-file` sends the contents of file. Other than that, they work in exactly the same
 way. In particular, they both expect content of the form "key1=value1&key2=value2", with percent-encoding for
 special characters; the only difference is that one expects its content as a command-line parameter and the other
-accepts its content from a file. In particular, --post-file is not for transmitting files as form attachments:
+accepts its content from a file. In particular, `--post-file` is not for transmitting files as form attachments:
 those must appear as "key=value" data (with appropriate percent-coding) just like everything else. Wget2 does not
 currently support "multipart/form-data" for transmitting POST data; only "application/x-www-form-urlencoded".
-Only one of --post-data and --post-file should be specified.
+Only one of `--post-data` and `--post-file` should be specified.
 Please note that wget2 does not require the content to be of the form "key1=value1&key2=value2", and neither does
 it test for it. Wget2 will simply transmit whatever data is provided to it. Most servers however expect the POST
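Building a correctly percent-encoded "key1=value1&key2=value2" body for `--post-data` can be done with any URL-encoding routine; for example, with Python's stdlib (the form fields are made up):

```python
from urllib.parse import urlencode

# Produce the application/x-www-form-urlencoded body that
# --post-data/--post-file expect (illustrative; fields are made up).
body = urlencode({"user": "foo bar", "lang": "hr"})
print(body)  # user=foo+bar&lang=hr
```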
@@ -1220,7 +1220,7 @@ Go to background immediately after startup. If no output file is specified via t
 characters in the text will also be sent as-is in the POST request.
 Please be aware that Wget2 needs to know the size of the POST data in advance. Therefore the argument to
-"--post-file" must be a regular file; specifying a FIFO or something like /dev/stdin won't work. It's not quite
+`--post-file` must be a regular file; specifying a FIFO or something like /dev/stdin won't work. It's not quite
 clear how to work around this limitation inherent in HTTP/1.0. Although HTTP/1.1 introduces chunked transfer
 that doesn't require knowing the request length in advance, a client can't use chunked unless it knows it's
 talking to an HTTP/1.1 server. And it can't know that until it receives a response, which in turn requires the
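The regular-file requirement can be checked up front; a minimal sketch with a hypothetical helper, Unix paths assumed:

```python
import os
import stat

def usable_as_post_file(path):
    """--post-file needs the size in advance, so only regular files
    qualify; FIFOs and devices such as /dev/stdin do not
    (illustrative check, not wget2's code)."""
    return stat.S_ISREG(os.stat(path).st_mode)

print(usable_as_post_file("/dev/null"))  # False on a typical Unix system
```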
@@ -1245,26 +1245,26 @@ Go to background immediately after startup. If no output file is specified via t
 If the server is using session cookies to track user authentication, the above will not work because
 `--save-cookies` will not save them (and neither will browsers) and the cookies.txt file will be empty. In that
-case use --keep-session-cookies along with --save-cookies to force saving of session cookies.
+case use `--keep-session-cookies` along with `--save-cookies` to force saving of session cookies.
 ### `--method=HTTP-Method`
 For the purpose of RESTful scripting, Wget2 allows sending of other HTTP Methods without the need to explicitly
-set them using --header=Header-Line. Wget2 will use whatever string is passed to it after --method as the HTTP
+set them using `--header=Header-Line`. Wget2 will use whatever string is passed to it after `--method` as the HTTP
 Method to the server.
 ### `--body-data=Data-String`, `--body-file=Data-File`
 Must be set when additional data needs to be sent to the server along with the Method specified using `--method`.
-`--body-data` sends string as data, whereas --body-file sends the contents of file. Other than that, they work in
+`--body-data` sends string as data, whereas `--body-file` sends the contents of file. Other than that, they work in
 exactly the same way.
 Currently, `--body-file` is not for transmitting files as a whole. Wget2 does not currently support
 "multipart/form-data" for transmitting data; only "application/x-www-form-urlencoded". In the future, this may be
-changed so that wget2 sends the --body-file as a complete file instead of sending its contents to the server.
+changed so that wget2 sends the `--body-file` as a complete file instead of sending its contents to the server.
 Please be aware that Wget2 needs to know the contents of BODY Data in advance, and hence the argument to
---body-file should be a regular file. See --post-file for a more detailed explanation. Only one of --body-data
-and --body-file should be specified.
+`--body-file` should be a regular file. See `--post-file` for a more detailed explanation. Only one of `--body-data`
+and `--body-file` should be specified.
 If Wget2 is redirected after the request is completed, Wget2 will suspend the current method and send a GET request
 till the redirection is completed. This is true for all redirection response codes except 307 Temporary Redirect
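The redirect rule can be sketched with a hypothetical helper; only 307 is modeled here, because that is the exception the visible text names (the sentence is truncated at the hunk boundary):

```python
def method_after_redirect(method, status):
    """Method used for the follow-up request after a redirect: wget2
    falls back to GET, except for 307 Temporary Redirect (sketch of
    the rule stated in the text, not wget2's code)."""
    return method if status == 307 else "GET"

print(method_after_redirect("POST", 301))  # GET
print(method_after_redirect("POST", 307))  # POST
```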
@@ -1378,7 +1378,7 @@ Go to background immediately after startup. If no output file is specified via t
 certificate. It is almost always a bad idea not to check the certificates when transmitting confidential or
 important data. For self-signed/internal certificates, you should download the certificate and verify against
 that instead of forcing this insecure mode. If you are really sure of not desiring any certificate verification,
-you can specify --check-certificate=quiet to tell Wget2 to not print any warning about invalid certificates,
+you can specify `--check-certificate=quiet` to tell Wget2 to not print any warning about invalid certificates,
 albeit in most cases this is the wrong thing to do.
 ### `--certificate=file`
@@ -1490,7 +1490,7 @@ Go to background immediately after startup. If no output file is specified via t
 Once that time has passed, that HSTS policy will no longer be valid and will eventually be removed from the
 database.
-If you supply your own HSTS database via --hsts-file, be aware that Wget2 may modify the provided file if any
+If you supply your own HSTS database via `--hsts-file`, be aware that Wget2 may modify the provided file if any
 change occurs between the HSTS policies requested by the remote servers and those in the file. When Wget2 exits,
 it effectively updates the HSTS database by rewriting the database file with the new entries.
@@ -1656,7 +1656,7 @@ Go to background immediately after startup. If no output file is specified via t
 The -r option is to retrieve recursively, and -nd to not create directories.
-Note that when --delete-after is specified, --convert-links is ignored, so .orig files
+Note that when --delete-after is specified, `--convert-links` is ignored, so .orig files
 are simply not created in the first place.
 ### `-k`, `--convert-links`
@@ -1697,7 +1697,7 @@ Go to background immediately after startup. If no output file is specified via t
 It works particularly well in conjunction with `--adjust-extension`, although this coupling is not enforced. It
 proves useful to populate Internet caches with files downloaded from different hosts.
-Example: if some link points to //foo.com/bar.cgi?xyz with --adjust-extension asserted and its local destination
+Example: if some link points to //foo.com/bar.cgi?xyz with `--adjust-extension` asserted and its local destination
 is intended to be ./foo.com/bar.cgi?xyz.css, then the link would be converted to //foo.com/bar.cgi?xyz.css. Note
 that only the filename part has been modified. The rest of the URL has been left untouched, including the net
 path ("//") which would otherwise be processed by Wget2 and converted to the effective scheme (ie. "https://").
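The worked example (//foo.com/bar.cgi?xyz becoming //foo.com/bar.cgi?xyz.css) amounts to appending the local destination's extension to the link while leaving everything before it, including the net path, untouched. A hypothetical helper mirroring just this example:

```python
def convert_link(url, ext=".css"):
    """Rewrite a link the way the example describes: append the local
    file's extension, keeping the net path ("//") and the rest of the
    URL intact (illustrative sketch of the one example in the text)."""
    return url if url.endswith(ext) else url + ext

print(convert_link("//foo.com/bar.cgi?xyz"))  # //foo.com/bar.cgi?xyz.css
```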
@@ -1821,7 +1821,7 @@ Go to background immediately after startup. If no output file is specified via t
 ### `--ignore-tags=list`
-This is the opposite of the --follow-tags option. To skip certain HTML tags when recursively looking for
+This is the opposite of the `--follow-tags` option. To skip certain HTML tags when recursively looking for
 documents to download, specify them in a comma-separated list.
 In the past, this option was the best bet for downloading a single page and its requisites, using a command-line
@@ -1832,7 +1832,7 @@ Go to background immediately after startup. If no output file is specified via t
 However, the author of this option came across a page with tags like "<LINK REL="home" HREF="/">" and came to the
 realization that specifying tags to ignore was not enough. One can't just tell Wget2 to ignore "<LINK>", because
 then stylesheets will not be downloaded. Now the best bet for downloading a single page and its requisites is
-the dedicated --page-requisites option.
+the dedicated `--page-requisites` option.
 ### `--ignore-case`