Diffstat (limited to 'README')
-rw-r--r--  README | 58
1 file changed, 2 insertions(+), 56 deletions(-)
diff --git a/README b/README
index 8ea13ed..47d4d76 100644
--- a/README
+++ b/README
@@ -117,6 +117,8 @@ Optional dependencies
used by sfeed_update(1). If the text in your RSS/Atom feeds is already UTF-8
encoded then you don't need this. For a minimal iconv implementation:
https://git.etalabs.net/cgit/noxcuse/tree/src/iconv.c
+- xargs with support for the -P and -0 options,
+ used by sfeed_update(1).
- mandoc for documentation: https://mdocml.bsd.lv/
- curses (typically ncurses), otherwise see minicurses.h,
used by sfeed_curses(1).
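
The dependency note added above relies on two widespread but non-POSIX xargs
extensions: -0 (NUL-delimited input) and -P (parallel jobs). A quick smoke
test, as a sketch; the command is illustrative and not part of sfeed:

	# should print the three words, possibly reordered by the parallelism.
	printf '%s\0' one two three | xargs -0 -P 2 -n 1 echo

If this fails, the xargs on the system cannot be used by sfeed_update(1).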
@@ -706,62 +708,6 @@ sfeedrc file and change the curl options "-L --max-redirs 0".
- - -
-Shellscript to update feeds in parallel more efficiently using xargs -P.
-
-It creates a queue of the feeds with their settings, then uses xargs to process
-them in parallel using the common but non-POSIX -P option. This is more
-efficient than the more portable solution in sfeed_update, which can stall a
-batch of $maxjobs in the queue if one item is slow.
-
-sfeed_update_xargs shellscript:
-
- #!/bin/sh
- # update feeds, merge with old feeds using xargs in parallel mode (non-POSIX).
-
- # include script and reuse its functions, but do not start main().
- SFEED_UPDATE_INCLUDE="1" . sfeed_update
- # load config file, sets $config.
- loadconfig "$1"
-
- # process a single feed.
- # args are: config, tmpdir, name, feedurl, basesiteurl, encoding
- if [ "${SFEED_UPDATE_CHILD}" = "1" ]; then
- sfeedtmpdir="$2"
- _feed "$3" "$4" "$5" "$6"
- exit $?
- fi
-
- # ...else parent mode:
-
- # feed(name, feedurl, basesiteurl, encoding)
- feed() {
- # workaround: *BSD xargs doesn't handle empty fields in the middle.
- name="${1:-$$}"
- feedurl="${2:-http://}"
- basesiteurl="${3:-${feedurl}}"
- encoding="$4"
-
- printf '%s\0%s\0%s\0%s\0%s\0%s\0' "${config}" "${sfeedtmpdir}" \
- "${name}" "${feedurl}" "${basesiteurl}" "${encoding}"
- }
-
- # fetch feeds and store in temporary directory.
- sfeedtmpdir="$(mktemp -d '/tmp/sfeed_XXXXXX')" || exit 1
- mkdir -p "${sfeedtmpdir}/feeds"
- touch "${sfeedtmpdir}/ok" || exit 1
- # make sure path exists.
- mkdir -p "${sfeedpath}"
- # print feeds for parallel processing with xargs.
- feeds | SFEED_UPDATE_CHILD="1" xargs -r -0 -P "${maxjobs}" -L 6 "$(readlink -f "$0")"
- statuscode=$?
- # check error exit status indicator for parallel jobs.
- test -f "${sfeedtmpdir}/ok" || statuscode=1
- # cleanup temporary files etc.
- cleanup
- exit ${statuscode}
-
-- - -
-
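The script removed above can still be used standalone: feed() writes six
NUL-terminated fields per feed and xargs -0 -L 6 regroups them into one child
invocation of the script itself. A usage sketch, assuming sfeed_update is in
$PATH (so the "." include resolves) and the default sfeed config path; both
are assumptions, adapt them to your setup:

	# save the script above as sfeed_update_xargs, then:
	chmod +x sfeed_update_xargs
	# run it with an sfeedrc config file (example path).
	./sfeed_update_xargs "$HOME/.sfeed/sfeedrc"
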
Shellscript to handle URLs and enclosures in parallel using xargs -P.
This can be used to download and process URLs for downloading podcasts,