Mirror of https://github.com/mikf/gallery-dl.git, synced 2024-11-25 12:12:34 +01:00
revert: options.md

This commit is contained in:
commit 6f3c3a5d1e

docs/options.md (106)
@@ -6,53 +6,72 @@

## General Options:
  -h, --help                Print this help message and exit
  --version                 Print program version and exit
  -f, --filename FORMAT     Filename format string for downloaded files
                            ('/O' for "original" filenames)
  -d, --destination PATH    Target location for file downloads
  -D, --directory PATH      Exact location for file downloads
  -X, --extractors PATH     Load external extractors from PATH
  --proxy URL               Use the specified proxy
  --source-address IP       Client-side IP address to bind to
  --user-agent UA           User-Agent request header
  --clear-cache MODULE      Delete cached login sessions, cookies, etc. for
                            MODULE (ALL to delete everything)

## Input Options:
  -i, --input-file FILE     Download URLs found in FILE ('-' for stdin).
                            More than one --input-file can be specified
  -I, --input-file-comment FILE
                            Download URLs found in FILE. Comment them out
                            after they were downloaded successfully.
  -x, --input-file-delete FILE
                            Download URLs found in FILE. Delete them after
                            they were downloaded successfully.

## Output Options:
  -q, --quiet               Activate quiet mode
  -w, --warning             Print only warnings and errors
  -v, --verbose             Print various debugging information
  -g, --get-urls            Print URLs instead of downloading
  -G, --resolve-urls        Print URLs instead of downloading; resolve
                            intermediary URLs
  -j, --dump-json           Print JSON information
  -s, --simulate            Simulate data extraction; do not download
                            anything
  -E, --extractor-info      Print extractor defaults and settings
  -K, --list-keywords       Print a list of available keywords and example
                            values for the given URLs
  -e, --error-file FILE     Add input URLs which returned an error to FILE
  --list-modules            Print a list of available extractor modules
  --list-extractors         Print a list of extractor classes with
                            description, (sub)category and example URL
  --write-log FILE          Write logging output to FILE
  --write-unsupported FILE  Write URLs, which get emitted by other
                            extractors but cannot be handled, to FILE
  --write-pages             Write downloaded intermediary pages to files in
                            the current directory to debug problems
  --no-colors               Do not emit ANSI color codes in output

## Downloader Options:
  -r, --limit-rate RATE     Maximum download rate (e.g. 500k or 2.5M)
  -R, --retries N           Maximum number of retries for failed HTTP
                            requests or -1 for infinite retries (default: 4)
  --http-timeout SECONDS    Timeout for HTTP connections (default: 30.0)
  --sleep SECONDS           Number of seconds to wait before each download.
                            This can be either a constant value or a range
                            (e.g. 2.7 or 2.0-3.5)
  --sleep-request SECONDS   Number of seconds to wait between HTTP requests
                            during data extraction
  --sleep-extractor SECONDS Number of seconds to wait before starting data
                            extraction for an input URL
  --filesize-min SIZE       Do not download files smaller than SIZE (e.g.
                            500k or 2.5M)
  --filesize-max SIZE       Do not download files larger than SIZE (e.g.
                            500k or 2.5M)
  --chunk-size SIZE         Size of in-memory data chunks (default: 32k)
  --no-part                 Do not use .part files
  --no-skip                 Do not skip downloads; overwrite existing files
  --no-mtime                Do not set file modification times according to
                            Last-Modified HTTP response headers
  --no-download             Do not download any files
  --no-postprocessors       Do not run any post processors
  --no-check-certificate    Disable HTTPS certificate validation
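Several downloader options accept human-readable values: RATE and SIZE take suffixed numbers such as `500k` or `2.5M`, while `--sleep` takes either a constant or a range such as `2.0-3.5`. A minimal Python sketch of how such values can be interpreted (the helper names and the binary 1024-based suffixes are assumptions for illustration, not gallery-dl's own parsing code):

```python
import random

# Binary multipliers assumed for illustration; gallery-dl's actual
# parser may differ (e.g. decimal multipliers or more suffixes).
SUFFIXES = {"k": 2**10, "m": 2**20, "g": 2**30}

def parse_size(value: str) -> int:
    """Convert a human-readable size like '500k' or '2.5M' to bytes."""
    value = value.strip()
    suffix = value[-1].lower()
    if suffix in SUFFIXES:
        return int(float(value[:-1]) * SUFFIXES[suffix])
    return int(float(value))

def parse_sleep(value: str) -> float:
    """Return a wait time: constant ('2.7') or uniform in a range ('2.0-3.5')."""
    if "-" in value:
        lo, hi = (float(part) for part in value.split("-", 1))
        return random.uniform(lo, hi)
    return float(value)

print(parse_size("500k"))   # 512000
print(parse_size("2.5M"))   # 2621440
```

With these assumptions, `--limit-rate 500k` would cap transfers at 512000 bytes per second, and `--sleep 2.0-3.5` would wait a random amount between 2.0 and 3.5 seconds before each download.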
@@ -74,18 +93,32 @@

  -C, --cookies FILE        File to load additional cookies from
  --cookies-export FILE     Export session cookies to FILE
  --cookies-from-browser BROWSER[/DOMAIN][+KEYRING][:PROFILE][::CONTAINER]
                            Name of the browser to load cookies from, with
                            optional domain prefixed with '/', keyring name
                            prefixed with '+', profile prefixed with ':',
                            and container prefixed with '::' ('none' for no
                            container)

## Selection Options:
  --download-archive FILE   Record all downloaded or skipped files in FILE
                            and skip downloading any file already in it
  -A, --abort N             Stop current extractor run after N consecutive
                            file downloads were skipped
  -T, --terminate N         Stop current and parent extractor runs after N
                            consecutive file downloads were skipped
  --range RANGE             Index range(s) specifying which files to
                            download. These can be either a constant value,
                            range, or slice (e.g. '5', '8-20', or '1:24:3')
  --chapter-range RANGE     Like '--range', but applies to manga chapters
                            and other delegated URLs
  --filter EXPR             Python expression controlling which files to
                            download. Files for which the expression
                            evaluates to False are ignored. Available keys
                            are the filename-specific ones listed by '-K'.
                            Example: --filter "image_width >= 1000 and
                            rating in ('s', 'q')"
  --chapter-filter EXPR     Like '--filter', but applies to manga chapters
                            and other delegated URLs

## Post-processing Options:
  -P, --postprocessor NAME  Activate the specified post processor
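The selection options above describe two mechanisms: index selection (`--range` accepts a constant, an inclusive range, or a slice) and predicate selection (`--filter` evaluates a Python expression over each file's keywords, skipping files where it is False). A rough sketch of both, assuming Python slice semantics for the `start:stop:step` form (whether the stop index is inclusive is an assumption here; this is illustrative, not gallery-dl's implementation):

```python
def parse_range(value: str) -> list[int]:
    """Expand '5', '8-20', or '1:24:3' into a list of 1-based file indices."""
    if ":" in value:                       # slice form 'start:stop:step'
        start, stop, step = (int(part) for part in value.split(":"))
        return list(range(start, stop, step))
    if "-" in value:                       # inclusive range '8-20'
        lo, hi = (int(part) for part in value.split("-"))
        return list(range(lo, hi + 1))
    return [int(value)]                    # single index '5'

def keep_file(expr: str, kwdict: dict) -> bool:
    """Evaluate a filter expression against one file's keyword dict.

    Builtins are stripped for the sketch; gallery-dl's real evaluation
    environment is richer than this.
    """
    return bool(eval(expr, {"__builtins__": {}}, kwdict))

print(parse_range("1:24:3"))   # [1, 4, 7, 10, 13, 16, 19, 22]
print(keep_file("image_width >= 1000 and rating in ('s', 'q')",
                {"image_width": 1200, "rating": "s"}))   # True
```

The keys available to the expression are the per-file ones that `-K, --list-keywords` prints for a given URL.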
@@ -96,7 +129,16 @@

  --write-tags              Write image tags to separate text files
  --zip                     Store downloaded files in a ZIP archive
  --cbz                     Store downloaded files in a CBZ archive
  --mtime NAME              Set file modification times according to
                            metadata selected by NAME. Examples: 'date' or
                            'status[date]'
  --ugoira FORMAT           Convert Pixiv Ugoira to FORMAT using FFmpeg.
                            Supported formats are 'webm', 'mp4', 'gif',
                            'vp8', 'vp9', 'vp9-lossless', 'copy'.
  --exec CMD                Execute CMD for each downloaded file. Supported
                            replacement fields are {} or {_path},
                            {_directory}, {_filename}. Example: --exec
                            "convert {} {}.png && rm {}"
  --exec-after CMD          Execute CMD after all files were downloaded.
                            Example: --exec-after "cd {_directory} &&
                            convert * ../doc.pdf"
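The `--exec` description lists the replacement fields ({}, {_path}, {_directory}, {_filename}) that get expanded per downloaded file. A small sketch of how that expansion could work (the function name and the expansion logic are assumptions for illustration; only the field names come from the option text above):

```python
import os

def expand_exec(cmd: str, path: str) -> str:
    """Expand --exec-style replacement fields for one downloaded file."""
    fields = {
        "_path": path,
        "_directory": os.path.dirname(path),
        "_filename": os.path.basename(path),
    }
    cmd = cmd.replace("{}", path)  # bare {} stands for the full file path
    for name, value in fields.items():
        cmd = cmd.replace("{%s}" % name, value)
    return cmd

print(expand_exec('convert {} {}.png && rm {}', "/tmp/img.webp"))
# convert /tmp/img.webp /tmp/img.webp.png && rm /tmp/img.webp
```

For `--exec-after`, the same fields would refer to the last downloaded file, which is why the documented example only uses `{_directory}`.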