Since newsboat version 2.22 (2020-12-21) it stores the MIME type of the
content field, so allow exporting this.
Older entries are empty and will be exported as "html" (even though they might
have been plain-text).
... also add the (empty) category field.
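A hedged sketch of how such an export could pick the MIME type, assuming
newsboat's cache.db path and schema (an rss_item table with a
content_mime_type column; names may differ between versions):

    # Fall back to "html" when the mime-type column is empty (items written
    # by newsboat versions before 2.22).
    sqlite3 -separator "$(printf '\t')" ~/.newsboat/cache.db \
        "SELECT guid, title,
                CASE WHEN content_mime_type IS NULL OR content_mime_type = ''
                     THEN 'html' ELSE content_mime_type END
         FROM rss_item;"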
|
|
Work around it by setting the empty "middle" fields to some value. The last
field can be empty.
Some feeds were using the wrong base URL if the `baseurl` field was empty but
the encoding field was set: the encoding field was then incorrectly used as
the base URL.
Only now noticed that some feeds were failing because the baseURL is validated
since commit f305b032bc19b4e81c0dd6c0398370028ea910ca and a non-zero exit
status is returned.
This doesn't happen with GNU xargs, busybox or toybox xargs.
Affected (at least): OpenBSD, NetBSD, FreeBSD and DragonFlyBSD xargs, which
share similar code.
A simple way to reproduce the difference:

    printf 'a\0\0c\0' | xargs -0 echo

Prints "a c" on *BSD xargs: the empty argument is dropped.
Prints "a  c" on GNU xargs (and some other implementations), which keep the
empty argument.
|
|
Combine E-Tags and If-Modified-Since in one section. Also mention the curl
--compressed option for (typically gzip) decompression.
Note that E-Tags were broken in curl <7.73 due to a bug with "weak" e-tags.
https://github.com/curl/curl/issues/5610
From a question/feedback by e-mail from Hadrien Lacour, thanks.
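A hedged sketch combining these (the --etag-save/--etag-compare flags exist in
curl 7.68 and later; the URL and filenames are examples):

    # Ask for a compressed (typically gzip) transfer and skip the download
    # when the server reports an unchanged E-Tag. Sketch only; exact
    # first-run behaviour (no etag.txt yet) depends on the curl version.
    curl -s --compressed --etag-save etag.txt --etag-compare etag.txt \
        -o feed.xml "https://example.org/atom.xml"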
|
|
|
|
Kind of a non-issue, but if there is an sfeedrc with no feeds then xargs will
still be executed and give an error. The xargs -r option (a GNU extension)
fixes this.
From the OpenBSD xargs(1) man page:
"-r      Do not run the command if there are no arguments.  Normally the
         command is executed at least once even if there are no arguments."
Reproducible with the following sfeedrc:

feeds() {
	true
}
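A small illustration of the -r behaviour (not the actual sfeed_update
pipeline):

    # With -r the command is not run at all when there is no input, so an
    # sfeedrc whose feeds() adds nothing no longer triggers an error; without
    # -r the command would normally still run once with no arguments.
    printf '' | xargs -r echo "never printed"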
|
|
|
|
This code uses the non-portable xargs -P option to more efficiently process
feeds in parallel.
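A hedged sketch of the idea (not the project's actual invocation):

    # Process NUL-separated items in parallel, at most 4 jobs at a time and
    # one item per invocation; -P is a widespread but non-POSIX extension.
    printf '%s\0' one two three four |
        xargs -0 -P 4 -n 1 echo processing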
|
|
Interesting C compiler project:
lacc: A simple, self-hosting C compiler:
https://github.com/larmel/lacc
|
|
- Export read/unread state to a separate plain-text "urls" file, line by line.
- Handle white-space control-chars better.
From the sfeed(1) man page:
" The fields: title, id, author are not allowed to have newlines and TABs,
all whitespace characters are replaced by a single space character.
Control characters are removed."
So do the reverse for newsboat as well: change white-space characters which
are also control characters (such as TABs and newlines) to a single space
character.
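A minimal sketch of that mapping as a shell filter (not necessarily how the
conversion script does it):

    # Map TAB, newline and carriage return to a single space each, mirroring
    # what sfeed(1) does for the title, id and author fields.
    tr '\t\n\r' '   '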
|
|
... move sections around in a more logical order and tweak some words.
Prompted by a question and feedback from Aleksei, thanks!
|
|
|
|
https://support.google.com/analytics/answer/1033867?hl=nl
|
|
Use the same base filename as the feed file, because sfeed_update replaces '/'
in names with '_':
filename="$(printf '%s' "$1" | tr '/' '_')"
This fixes the example for fetching feeds with names containing '/'.
Reported by __20h__, thanks!
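For example (hypothetical feed name):

    # "tech/news" becomes the filename "tech_news"
    printf '%s' 'tech/news' | tr '/' '_'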
|
|
|
|
Found by testing using mawk.
|
|
|
|
cproc: https://github.com/michaelforney/cproc
qbe: https://c9x.me/compile/
z80 (sfeed base program)
fuzix: http://www.fuzix.org/
RC2014 emulator: https://github.com/EtchedPixels/RC2014
sdcc: http://sdcc.sourceforge.net/
|
|
|
|
- Add an example to optimize bandwidth use with the curl -z option (a sketch
is included below).
- Add a note about CDNs blocking based on the User-Agent (based on a question
mailed to me).
- Add a script to convert existing newsboat items to the sfeed(5) TSV format.
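A hedged sketch of the curl -z idea (URL and filename are examples):

    # Only download again when the remote file is newer than the local copy;
    # -R keeps the remote modification time so the next -z comparison works.
    curl -s -R -z feed.xml -o feed.xml "https://example.org/atom.xml"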
|
|
This is a "quick&dirty" regex to block some of the typical 1px width or height
tracking pixels.
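A comparable quick and dirty sketch (not the project's exact regex):

    # Strip img tags that declare a 1 pixel width or height from HTML content.
    sed -E 's/<img[^>]*(width|height)="1"[^>]*>//g'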
|
|
No functional difference, but it should improve readability.
|
|
- HPPA in qemu (OpenBSD).
- FreeDOS + djgpp (+ wget with networking) in 32-bit mode.
|
|
|
|
Also use a more portable date +'%s' (remove -j).
NOTE though: date +'%s' is not POSIX, but it is supported in most cases.
|
|
... both are out-of-scope for sfeed.
- sfeed_tail can be written as a simple customized awk script reading from a
FIFO. The C version did not work well on FIFOs.
- Security considerations are mentioned in the official HTML spec and apply to
all HTML and protocol handlers, so this is out-of-scope.
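A hedged sketch of such an awk replacement, assuming the sfeed(5) TSV format
where the id is field 6 (the FIFO path is an example):

    # Print only items whose id has not been seen before; reading from a FIFO
    # keeps it running as new lines are written to it.
    mkfifo /tmp/sfeed_new
    awk -F '\t' '!seen[$6]++' < /tmp/sfeed_new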
|
|
- add preface text.
- use "\t" pattern for awk (easier to read and copy-paste).
- add a small example to get the most recent enclosure.
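A hedged sketch of such an example (feed file path assumed), sorting on the
UNIX timestamp in field 1 and printing the enclosure in field 8:

    sort -t "$(printf '\t')" -k1,1rn "$HOME/.sfeed/feeds/podcast" |
        awk -F '\t' '$8 != "" { print $8; exit }'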
|
|
- Show how to filter protocol schemes more strictly, for example to allow only
http://, https:// and gopher:// (not file://, javascript:, etc); a sketch is
included below.
- Filter links and now also enclosures.
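A hedged sketch of such a filter (not the project's exact code), checking the
link (field 3) and enclosure (field 8) fields of the sfeed(5) TSV format:

    # Keep an item only when both URLs are empty or use an allowed scheme.
    awk -F '\t' '
    function ok(u) { return u == "" || u ~ /^(https?|gopher):\/\//; }
    ok($3) && ok($8)'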
|
|
... and simplify example in README.
|
|
Make the procmail example safer due to account process limits.
|
|
|
|
This was removed before because of potential security issues, in commit
b7e288a96418e1ea5e7904ab2896edb3f4615a10.
Thanks to trqx for the feedback.
|
|
Man pages:
- sfeed_update: fix the fetchfeed parameter documentation.
- sfeed_update: fix/update the urls in sfeedrc.example.
- sfeed_update: document the maxjobs variable (a sketch is included below).
- sfeedrc: document the filter and order functions here.
- more semantic keywords: function arguments and some Nm.
README:
- Document more clearly that sfeedrc is a shellscript in the first usage
"steps".
- Add newsboat OPML export and import to the sfeed_update example.
- Document that the Makefile is POSIX (not a GNU Makefile).
- Add a reference to my tool hurl: an HTTP/HTTPS/Gopher file grab client.
- Describe the reason/usefulness of the filter example.
- Describe how to override curl(1), an optional dependency.
With feedback from lich, thanks!
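A hedged minimal sfeedrc sketch showing the maxjobs variable and the feeds()
function (the feed shown is just an example):

    # sfeedrc is a shellscript sourced by sfeed_update.
    maxjobs=4    # limit the number of concurrent feed fetches
    feeds() {
        feed "codemadness" "https://codemadness.org/atom.xml"
    }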
|