author	Hiltjo Posthuma <hiltjo@codemadness.org>	2022-01-06 13:18:52 +0100
committer	Hiltjo Posthuma <hiltjo@codemadness.org>	2022-01-06 13:18:52 +0100
commit	790a941eb0c78867f744d0551ac20b421b6c75e2 (patch)
tree	fc275e863d343eca61c3f11d3e68b760f62db62a /README
parent	27a46121b3722e1933f0e40fedcf06675b2bca9d (diff)
README: sfeed_download small changes
Diffstat (limited to 'README')
-rw-r--r--	README	6
1 file changed, 3 insertions(+), 3 deletions(-)
diff --git a/README b/README
index ca08b0f..40037ab 100644
--- a/README
+++ b/README
@@ -756,14 +756,14 @@ Shellscript to handle URLs and enclosures in parallel using xargs -P.
 This can be used to download and process URLs for downloading podcasts,
 webcomics, download and convert webpages, mirror videos, etc. It uses a
 plain-text cache file for remembering processed URLs. The match patterns are
-defined in the fetch() function and in the awk script and can be modified to
-handle items differently depending on their context.
+defined in the shellscript fetch() function and in the awk script and can be
+modified to handle items differently depending on their context.
 
 The arguments for the script are files in the sfeed(5) format. If no file
 arguments are specified then the data is read from stdin.
 
 	#!/bin/sh
-	# sfeed_download: Downloader for URLs and enclosures in feed files.
+	# sfeed_download: downloader for URLs and enclosures in sfeed(5) files.
 	# Dependencies: awk, curl, flock, xargs (-P), youtube-dl.
 
 	cachefile="${SFEED_CACHEFILE:-$HOME/.sfeed/downloaded_urls}"
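
For readers without the full README at hand, the idea the changed text describes
can be sketched as follows. This is a minimal illustration of the plain-text
cache plus flock(1) serialization pattern, not the actual sfeed_download
script; the helper names isdone and markdone and the lockfile path are
hypothetical, added here only for the example.

	#!/bin/sh
	# Sketch only (not sfeed_download itself): remember processed URLs in a
	# plain-text cache file and serialize appends with flock(1), so parallel
	# xargs -P workers do not interleave writes to the cache.
	cachefile="${SFEED_CACHEFILE:-$HOME/.sfeed/downloaded_urls}"
	lockfile="${cachefile}.lock"	# hypothetical lock path

	# isdone: return success if the URL is already recorded in the cache.
	isdone() {
		grep -qF -- "$1" "$cachefile" 2>/dev/null
	}

	# markdone: append a URL to the cache under an exclusive lock.
	markdone() {
		flock "$lockfile" \
			sh -c 'printf "%s\n" "$1" >> "$2"' sh "$1" "$cachefile"
	}

	url="$1"
	if ! isdone "$url"; then
		curl -sSfL -O "$url" && markdone "$url"
	fi

Run once per URL (for example via xargs -P 4 -n 1), such a script skips items
already fetched in earlier runs. The real script, as the README text above
notes, takes sfeed(5) files as arguments (or reads from stdin) and dispatches
on URL patterns in its fetch() function and awk script.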