author     Benjamin Chausse <benjamin@chausse.xyz>    2024-08-09 14:11:50 -0400
committer  Benjamin Chausse <benjamin@chausse.xyz>    2024-08-09 14:11:50 -0400
commit     5857d82e8e596d6fda406a0c4d8d68ca7a03c124 (patch)
tree       553916894dee907825360580c5d9a05c82c5af16 /sfeedrc.5
parent     3574e3cbf9d99546e868aeb995ce2c171cdc36a6 (diff)
parent     19957bc272e745af7b56b79fa648e8b6b77113b1 (diff)

Merge remote-tracking branch 'upstream/master' (HEAD -> master)
Diffstat (limited to 'sfeedrc.5')
-rw-r--r--  sfeedrc.5  |  42
1 file changed, 26 insertions(+), 16 deletions(-)
diff --git a/sfeedrc.5 b/sfeedrc.5
index aedbf59..7640a28 100644
--- a/sfeedrc.5
+++ b/sfeedrc.5
@@ -1,4 +1,4 @@
-.Dd August 5, 2021
+.Dd December 26, 2023
.Dt SFEEDRC 5
.Os
.Sh NAME
@@ -18,7 +18,7 @@ The default is
can be used to change the amount of concurrent
.Fn feed
jobs.
-The default is 8.
+The default is 16.
.El
.Sh FUNCTIONS
.Bl -tag -width Ds
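For reference, maxjobs is a plain shell variable assigned near the top of
sfeedrc; a minimal sketch that lowers it below the new default of 16:

    # process at most 4 concurrent feed() jobs instead of the default 16.
    maxjobs=4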
@@ -37,6 +37,9 @@ Name of the feed, this is also used as the filename for the TAB-separated
feed file.
The feed name cannot contain the '/' character because it is a path separator,
they will be replaced with '_'.
+Each
+.Fa name
+should be unique.
.It Fa feedurl
URL to fetch the RSS/Atom data from, usually a HTTP or HTTPS URL.
.It Op Fa basesiteurl
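A feeds() sketch illustrating the uniqueness rule and the '/' replacement
described above; the URLs here are placeholders, not real feeds:

    feeds() {
        # feed <name> <feedurl> [basesiteurl] [encoding]
        # names must be unique; a '/' in a name is stored as '_' on disk.
        feed "news/world" "https://example.org/world.xml"  # feed file: news_world
        feed "news/tech" "https://example.org/tech.xml"    # feed file: news_tech
    }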
@@ -59,12 +62,12 @@ is a shellscript each function can be overridden to change its behaviour,
notable functions are:
.Bl -tag -width Ds
.It Fn fetch "name" "url" "feedfile"
-Fetch feed from URL and writes data to stdout, its arguments are:
+Fetch feed from URL and write the data to stdout, its arguments are:
.Bl -tag -width Ds
.It Fa name
Specified name in configuration file (useful for logging).
.It Fa url
-Url to fetch.
+URL to fetch.
.It Fa feedfile
Used feedfile (useful for comparing modification times).
.El
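Since the name argument is documented as useful for logging, an override
sketch (not the project default) could report each fetch on stderr before
downloading with curl(1), using flags similar to the example at the bottom
of this page:

    # fetch(name, url, feedfile)
    fetch() {
        # log which feed is being fetched ($1), then fetch its URL ($2).
        printf 'fetching %s\n' "$1" >&2
        curl -L --max-redirs 1 -f -s -m 15 "$2" 2>/dev/null
    }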
@@ -73,8 +76,9 @@ By default the tool
.Xr curl 1
is used.
.It Fn convertencoding "name" "from" "to"
-Convert from text-encoding to another and writes it to stdout, its arguments
-are:
+Convert data from stdin from one text-encoding to another and write it to
+stdout,
+its arguments are:
.Bl -tag -width Ds
.It Fa name
Feed name.
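An override sketch for convertencoding() that skips the conversion when both
encodings match and otherwise defers to iconv(1), which, like the default,
reads stdin and writes stdout:

    # convertencoding(name, from, to)
    convertencoding() {
        if [ "$2" = "$3" ]; then
            # nothing to convert: pass the data through unchanged.
            cat
        else
            iconv -f "$2" -t "$3"
        fi
    }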
@@ -88,9 +92,9 @@ By default the tool
.Xr iconv 1
is used.
.It Fn parse "name" "feedurl" "basesiteurl"
-Parse and convert RSS/Atom XML to the
+Read RSS/Atom XML data from stdin, convert and write it as
.Xr sfeed 5
-TSV format.
+data to stdout.
.Bl -tag -width Ds
.It Fa name
Name of the feed.
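A minimal parse() sketch matching the contract described here (RSS/Atom XML
on stdin, sfeed(5) TSV on stdout), assuming sfeed(1) accepts the base URL as
its argument:

    # parse(name, feedurl, basesiteurl)
    parse() {
        # convert RSS/Atom XML on stdin to sfeed(5) TSV on stdout;
        # the base URL ($3) helps resolve relative item links.
        sfeed "$3"
    }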
@@ -100,16 +104,20 @@ URL of the feed.
Base URL of the feed links.
This argument allows to fix relative item links.
.El
-.It Fn filter "name"
+.It Fn filter "name" "url"
Filter
.Xr sfeed 5
-data from stdin, write to stdout, its arguments are:
+data from stdin and write it to stdout, its arguments are:
.Bl -tag -width Ds
.It Fa name
Feed name.
+.It Fa url
+URL of the feed.
.El
.It Fn merge "name" "oldfile" "newfile"
-Merge data of oldfile with newfile and writes it to stdout, its arguments are:
+Merge
+.Xr sfeed 5
+data of oldfile with newfile and write it to stdout, its arguments are:
.Bl -tag -width Ds
.It Fa name
Feed name.
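A filter() sketch that makes use of the new url argument; the matched URL is
taken from the feeds() example below and the rule itself (dropping items
without a link, the third sfeed(5) TSV field) is only illustrative:

    # filter(name, url)
    filter() {
        case "$2" in
        "http://feeds.feedburner.com/Explosm")
            # drop items that have no link.
            awk -F '\t' '$3 != ""' ;;
        *)
            cat ;;
        esac
    }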
@@ -118,13 +126,15 @@ Old file.
.It Fa newfile
New file.
.El
-.It Fn order "name"
+.It Fn order "name" "url"
Sort
.Xr sfeed 5
-data from stdin, write to stdout, its arguments are:
+data from stdin and write it to stdout, its arguments are:
.Bl -tag -width Ds
.It Fa name
Feed name.
+.It Fa url
+URL of the feed.
.El
.El
.Sh EXAMPLES
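An order() sketch that sorts reverse-chronologically on the first sfeed(5)
field (the UNIX timestamp); the new url argument is accepted but unused here:

    # order(name, url)
    order() {
        # newest items first: sort numerically, descending, on field 1.
        sort -t "$(printf '\t')" -k1rn,1
    }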
@@ -136,7 +146,7 @@ shown below:
# list of feeds to fetch:
feeds() {
# feed <name> <feedurl> [basesiteurl] [encoding]
- feed "codemadness" "https://www.codemadness.nl/atom.xml"
+ feed "codemadness" "https://www.codemadness.org/atom_content.xml"
feed "explosm" "http://feeds.feedburner.com/Explosm"
feed "golang github releases" "https://github.com/golang/go/releases.atom"
feed "linux kernel" "https://www.kernel.org/feeds/kdist.xml" "https://www.kernel.org"
@@ -159,8 +169,8 @@ file:
.Bd -literal
# fetch(name, url, feedfile)
fetch() {
- # allow for 1 redirect, hide User-Agent, timeout is 15 seconds.
- curl -L --max-redirs 1 -H "User-Agent:" -f -s -m 15 \\
+ # allow for 1 redirect, set User-Agent, timeout is 15 seconds.
+ curl -L --max-redirs 1 -H "User-Agent: 007" -f -s -m 15 \e
"$2" 2>/dev/null
}
.Ed