Command Line Apps newsbeuter

From antiX

Revision as of 12:44, 24 January 2010 by Anticapitalista (Talk | contribs)

newsbeuter

newsbeuter is a command-line RSS feed reader. The following is adapted from the excellent newsbeuter man page.

FIRST STEPS:

After you've installed newsbeuter, you can run it for the first time by typing "newsbeuter" at your command prompt. This will produce the following message:

Error: no URLs configured. Please fill the file /home/ak/.newsbeuter/urls with RSS feed URLs or import an OPML file.

newsbeuter 2.1

usage: ./newsbeuter [-i <file>|-e] [-u <urlfile>] [-c <cachefile>] [-x <command> ...] [-h]

       -e              export OPML feed to stdout
       -r              refresh feeds on start
       -i <file>       import OPML file
       -u <urlfile>    read RSS feed URLs from <urlfile>
       -c <cachefile>  use <cachefile> as cache file
       -C <configfile> read configuration from <configfile>
       -X              clean up cache thoroughly
       -x <command>... execute list of commands
       -o              activate offline mode (only applies to bloglines synchronization mode)
       -v              get version information
       -l <loglevel>   write a log with a certain loglevel (valid values: 1 to 6)
       -d <logfile>    use <logfile> as output log file
       -E <file>       export list of read articles to <file>
       -I <file>       import list of read articles from <file>
       -h              this help

This means that newsbeuter can’t start without any configured feeds. To add feeds to newsbeuter, you can either add URLs to the configuration file $HOME/.newsbeuter/urls or you can import an OPML file by running "newsbeuter -i blogroll.opml". To manually add URLs, open the file with your favorite text editor and add the URLs, one per line:

       http://rss.cnn.com/rss/cnn_topstories.rss
       http://newsrss.bbc.co.uk/rss/newsonline_world_edition/front_page/rss.xml
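The two ways of adding feeds described above can be sketched from the shell as follows (the CNN feed URL is taken from the example above; blogroll.opml is an example filename):

```shell
# Create the config directory and append feed URLs, one per line
mkdir -p ~/.newsbeuter
echo "http://rss.cnn.com/rss/cnn_topstories.rss" >> ~/.newsbeuter/urls

# Alternatively, import an existing OPML file instead
# (blogroll.opml is an example filename):
#   newsbeuter -i blogroll.opml
```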

If you need to add URLs that have restricted access via username/password, simply provide the username/password in the following way:

       http://username:password@hostname.domain.tld/feed.rss

In order to protect the username and password, make sure that $HOME/.newsbeuter/urls has the appropriate permissions. Newsbeuter also makes sure that usernames and passwords within URLs aren't displayed in its user interface. You can also configure local files as feeds by prefixing the local path with "file://" and adding it to the $HOME/.newsbeuter/urls file:

      file:///var/log/rss_eventlog.xml
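One way to set the appropriate permissions mentioned above is to make the urls file readable and writable only by its owner, so that any embedded passwords are not world-readable; a minimal sketch:

```shell
# Create the urls file if it does not exist yet
mkdir -p ~/.newsbeuter
touch ~/.newsbeuter/urls

# Restrict it to owner read/write only (mode 600), protecting
# any username:password pairs embedded in feed URLs
chmod 600 ~/.newsbeuter/urls
```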

Now you can run newsbeuter again, and it will present you with a controllable list of the URLs that you configured previously. You can start downloading the feeds either by pressing "R" to download all feeds, or by pressing "r" to download only the currently selected feed.

You can then select a feed you want to read and press "Enter" to go to the article list for that feed. This works even while downloading is still in progress. The article list shows the available articles by title; an "N" on the left indicates an article that hasn't been read yet. Pressing "Enter" brings you to the content of the article. You can scroll through this text, and also run a browser (default: lynx) to view the complete article if the content is empty, an abstract, or just a short description.

Pressing "q" brings you back to the article list, pressing "q" again brings you back to the feed list, and pressing "q" a third time closes newsbeuter.
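The external browser mentioned above (default: lynx) can be changed with the "browser" configuration command in $HOME/.newsbeuter/config; a minimal sketch, where the choice of elinks is only an example:

```
# $HOME/.newsbeuter/config
# open full articles in elinks instead of the default lynx
browser elinks
```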

Newsbeuter caches the articles that it downloads. This means that when you start newsbeuter again and reload a feed, the old articles can still be read even if they aren't in the current RSS feeds anymore. Optionally, you can configure how many articles shall be preserved per feed so that the article backlog doesn't grow endlessly (see "max-items" below).
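The backlog cap described above is set with the "max-items" command in newsbeuter's configuration file; a minimal sketch, where the value 50 is an arbitrary example:

```
# $HOME/.newsbeuter/config
# keep at most 50 articles per feed in the cache (50 is an example value)
max-items 50
```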

Newsbeuter also uses a number of measures to preserve the users' and feed providers' bandwidth, by trying to avoid unnecessary feed downloads through the use of conditional HTTP downloading. It saves every feed's "Last-Modified" and "ETag" response header values (if present) and advises the feed's HTTP server to only send data if the feed has been updated, as determined by modification date/time or the "ETag" header. This not only makes downloads of feeds with no new updates faster, it also reduces the amount of transferred data per request. Conditional HTTP downloading can be optionally disabled per feed by using the "always-download" configuration command.
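Disabling conditional downloading for an individual feed, as described above, looks like this in the configuration file (the URL is the example feed from earlier in this page):

```
# $HOME/.newsbeuter/config
# always fetch this feed in full, ignoring Last-Modified/ETag headers
always-download "http://rss.cnn.com/rss/cnn_topstories.rss"
```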

Searching for articles is possible in newsbeuter, too. Just press the "/" key, enter your search phrase, and the title and content of all articles are searched for it. When you do a search from the list of feeds, all articles of all feeds will be searched. When you do a search from the article list of a feed, only the articles of the currently viewed feed are searched. When opening an article from a search result dialog, the search phrase is highlighted.

The history of all your searches is saved to the filesystem, to ~/.newsbeuter/history.search. By default, the last 100 search phrases are stored, but this limit can be changed through the "history-limit" configuration variable. To disable search history saving, simply set history-limit to 0.
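Adjusting the search history limit described above is a one-line configuration change; a sketch, where 50 is an arbitrary example value:

```
# $HOME/.newsbeuter/config
# keep only the 50 most recent search phrases (default: 100);
# set to 0 to disable search history saving entirely
history-limit 50
```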
