Diffstat (limited to 'doc/todo/aggregation.mdwn')
 -rw-r--r--  doc/todo/aggregation.mdwn | 25
 1 file changed, 1 insertion(+), 24 deletions(-)
diff --git a/doc/todo/aggregation.mdwn b/doc/todo/aggregation.mdwn
index 7d765f9e9..53b3133e2 100644
--- a/doc/todo/aggregation.mdwn
+++ b/doc/todo/aggregation.mdwn
@@ -1,24 +1 @@
-Here's a scary idea: a plugin that can aggregate feeds from other
-locations. Presumably there would need to be a cron job to build the wiki
-periodically, and each time it's built any new items would be turned into
-pages etc. There might also need to be a way to expire old items, unless
-you wanted to keep them forever.
-
-This would allow ikiwiki to work as a kind of planet, or at least a
-poor man's news aggregator.
-
-* XML::Feed has a very nice interface, though it may require valid feeds.
-* How to store GUIDs? Maybe as meta tags on pages, although that would need
- caching of such metadata somewhere.
-* How to configure which feeds to pull, how often, and where to put the
- pulled entries? One way would be command line/config file, but I think
- better would be to use preprocessor directives in a wiki page, probably
- the same page that inlines all the pages together.
-* Where to store when a feed was last pulled?
-
-So I need:
-
-* A way to store info from the preprocessor directives about what pages
- to pull and expiry.
-* A way to store info on last pull time, guids, etc.
-* Switch for a mode that a) pulls b) expires old items c) rebuilds the wiki (for cron)
+* Still need to support feed expiry.
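The removed notes lean on XML::Feed for the actual fetching and parsing. As a reference point, here is a minimal sketch of that kind of pull loop, assuming the Perl modules XML::Feed and URI are available; the feed URL and the idea of keying fetched items by their GUID are illustrative, not the actual ikiwiki aggregate code:

    use strict;
    use warnings;
    use XML::Feed;
    use URI;

    # Illustrative feed URL; in the scheme sketched in the removed notes
    # this would come from a preprocessor directive on the aggregating page.
    my $url = "http://example.com/index.atom";

    # XML::Feed autodetects RSS vs Atom, but tends to want well-formed feeds.
    my $feed = XML::Feed->parse(URI->new($url))
        or die "cannot parse $url: " . XML::Feed->errstr;

    for my $entry ($feed->entries) {
        # The entry id is the natural GUID to remember between cron runs,
        # so already-seen items are not turned into pages twice.
        my $guid = $entry->id || $entry->link;
        print join("\t", $guid, $entry->title, $entry->issued // ''), "\n";
    }

In a real plugin the GUID-to-page mapping and the last-pull time would have to be stored somewhere persistent between runs, which is exactly the bookkeeping the removed notes were asking about.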