author     Hiltjo Posthuma <hiltjo@codemadness.org>    2021-01-27 13:07:45 +0100
committer  Hiltjo Posthuma <hiltjo@codemadness.org>    2021-01-27 15:24:02 +0100
commit     645ef7420056796e6d2716bf920b8704451912ac (patch)
tree       17f7ad1a29673a5c5a0feaad33ee061c59626c42 /README
parent     2f8a83288d91ea0abc2e4ebd6754513ee3ad37ec (diff)
typofixes
Diffstat (limited to 'README')
-rw-r--r--  README  20
1 file changed, 10 insertions, 10 deletions
--- a/README
+++ b/README
@@ -143,7 +143,7 @@ sfeed_mbox - Format feed data (TSV) to mbox.
 sfeed_plain - Format feed data (TSV) to a plain-text list.
 sfeed_twtxt - Format feed data (TSV) to a twtxt feed.
 sfeed_update - Update feeds and merge items.
-sfeed_web - Find urls to RSS/Atom feed from a webpage.
+sfeed_web - Find URLs to RSS/Atom feed from a webpage.
 sfeed_xmlenc - Detect character-set encoding from a XML stream.
 sfeedrc.example - Example config file. Can be copied to $HOME/.sfeed/sfeedrc.
 style.css - Example stylesheet to use with sfeed_html(1) and
@@ -156,7 +156,7 @@ Files read at runtime by sfeed_update(1)
 sfeedrc - Config file. This file is evaluated as a shellscript in
           sfeed_update(1).
 
-Atleast the following functions can be overridden per feed:
+At least the following functions can be overridden per feed:
 
 - fetch: to use wget(1), OpenBSD ftp(1) or an other download program.
 - filter: to filter on fields.
@@ -190,7 +190,7 @@ man 1 sfeed
 Usage and examples
 ------------------
 
-Find RSS/Atom feed urls from a webpage:
+Find RSS/Atom feed URLs from a webpage:
 
 	url="https://codemadness.org"; curl -L -s "$url" | sfeed_web "$url"
@@ -226,7 +226,7 @@ View formatted output in your editor:
 - - -
 
 Example script to view feed items in a vertical list/menu in dmenu(1). It opens
-the selected url in the browser set in $BROWSER:
+the selected URL in the browser set in $BROWSER:
 
 	#!/bin/sh
 	url=$(sfeed_plain "$HOME/.sfeed/feeds/"* | dmenu -l 35 -i | \
@@ -252,7 +252,7 @@ argument is optional):
 - - -
 
 The filter function can be overridden in your sfeedrc file. This allows
-filtering items per feed. It can be used to shorten urls, filter away
+filtering items per feed. It can be used to shorten URLs, filter away
 advertisements, strip tracking parameters and more.
 
 	# filter fields.
@@ -367,7 +367,7 @@ cut -b is used to trim the "N " prefix of sfeed_plain(1).
 - - -
 
 For some podcast feed the following code can be used to filter the latest
-enclosure url (probably some audio file):
+enclosure URL (probably some audio file):
 
 	awk -F '\t' 'BEGIN { latest = 0; }
 	length($8) {
@@ -597,7 +597,7 @@ generated ETag to pin and fingerprint a client.
 
 - - -
 
-CDN's blocking requests due to a missing HTTP User-Agent request header
+CDNs blocking requests due to a missing HTTP User-Agent request header
 
 sfeed_update will not send the "User-Agent" header by default for privacy
 reasons. Some CDNs like Cloudflare don't like this and will block such HTTP
@@ -619,7 +619,7 @@ are treated as an error.
 
 For example to prevent hijacking an unencrypted http:// to https:// redirect
 or to not add time of an unnecessary page redirect each time. It is encouraged to
-use the final redirected url in the sfeedrc config file.
+use the final redirected URL in the sfeedrc config file.
 
 If you want to ignore this advise you can override the fetch() function in the
 sfeedrc file and change the curl options "-L --max-redirs 0".
@@ -675,7 +675,7 @@ TSV format.
 	#!/bin/sh
 	# Export newsbeuter/newsboat cached items from sqlite3 to the sfeed TSV format.
 	# The data is split per file per feed with the name of the newsboat title/url.
-	# It writes the urls of the read items line by line to a "urls" file.
+	# It writes the URLs of the read items line by line to a "urls" file.
 	#
 	# Dependencies: sqlite3, awk.
 	#
@@ -745,7 +745,7 @@ TSV format.
 		"html" "\t" field($5) "\t" field($6) "\t" field($7) \
 		> fname;
 
-		# write urls of the read items to a file line by line.
+		# write URLs of the read items to a file line by line.
 		if ($10 == "0") {
 			print $3 > "urls";
 		}
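The hunks around lines 597 and 619 mention overriding fetch() in the sfeedrc file to set a User-Agent (for CDNs that block requests without one) and to control redirect handling. A minimal sketch of such an override, not part of this commit: the fetch() name and its arguments (name, URL, feed file) follow sfeed_update(1), but the curl flags shown are illustrative choices.

```shell
# Sketch of a fetch() override for ~/.sfeed/sfeedrc (assumption: not from this
# commit). sfeedrc is evaluated as a shell script by sfeed_update(1), so the
# default fetch() can simply be redefined here.
fetch() {
	# $2 is the feed URL as passed by sfeed_update(1).
	# An explicit User-Agent works around CDNs (e.g. Cloudflare) that
	# block requests without one; --max-redirs 0 keeps redirects fatal,
	# as the README advises.
	curl -H 'User-Agent: Mozilla/5.0 (compatible)' \
		-L --max-redirs 0 -f -s -m 15 "$2" 2>/dev/null
}
```

To instead allow redirects (against the advice quoted in the diff), raise the limit, e.g. `-L --max-redirs 3`.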
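The hunk at line 252 notes that a per-feed filter() can shorten URLs and strip tracking parameters. A hedged sketch of one such filter() override (the function name and TSV field layout follow sfeed(5), where field 3 is the link; the `utm_` regexp is an illustrative guess, not from sfeed):

```shell
# Hypothetical filter() override for sfeedrc (assumption: not from this
# commit): drop a trailing "?utm_..." tracking suffix from the link field.
filter() {
	# Fields are TAB-separated per the sfeed(5) TSV format; $3 = item link.
	awk -F '\t' 'BEGIN { OFS = "\t"; }
	{
		# Remove "?utm_..." to the end of the link, if present.
		sub(/\?utm_.*$/, "", $3);
		print $0;
	}'
}
```

Lines whose link has no tracking suffix pass through unchanged, since sub() only rewrites on a match.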