Separate <code> and /

This commit is contained in:
Fabrice Mouhartem 2022-07-26 10:26:04 +02:00
parent a701bb221a
commit 94f9a25261


@@ -59,11 +59,11 @@ Sometimes, [robots.txt](https://en.wikipedia.org/wiki/Robots_exclusion_standard)
### Number of tries
Occasionally, when the server is busy answering you, `wget` will try again and again (20 times by default), which can slow your mirroring down quite a bit (especially if the timeout is large). You can lower this bound using the… `--tries/-t` option.
Occasionally, when the server is busy answering you, `wget` will try again and again (20 times by default), which can slow your mirroring down quite a bit (especially if the timeout is large). You can lower this bound using the… `--tries`/`-t` option.
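For instance, a minimal sketch (with `<url>` as a placeholder) that lowers the bound to three attempts while mirroring:

```sh
# retry each file at most 3 times instead of the default 20
wget --tries=3 -m <url>
```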
## Finding 404 on a website
With the `--spider` option, which does not actually download files, you can use `wget` as a debugger for your website, combined with `--output-file/-o` to log the results to a file.
With the `--spider` option, which does not actually download files, you can use `wget` as a debugger for your website, combined with `--output-file`/`-o` to log the results to a file.
```sh
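# -r: recurse through the site, -nd: do not recreate the directory tree, -o <logfile>: write the log to <logfile>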
wget --spider -r -nd -o <logfile> <url>
@@ -75,7 +75,7 @@ The list of broken links is then summarized at the end of the log file.
## Send a POST request
My most frequent use of `curl` is to send POST requests to different kinds of APIs; the syntax is quite simple using the `--form`/-F` option:
My most frequent use of `curl` is to send POST requests to different kinds of APIs; the syntax is quite simple using the `--form`/`-F` option:
```sh
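# each --form/-F pair adds one field to the multipart/form-data POST body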
curl -F <field1>=<content1> -F <field2>=<content2> <url>