Age | Commit message | Author |
|
Not needed; lastupdate will be 0 for new feeds.
|
|
|
|
.ikiwiki/aggregatetime, to allow for more sophisticated cron jobs.
|
|
|
|
|
|
Besides being wrong to do, this could lead to the wrong item
being expired, as follows: If B is added and at the same time
A is changed, then A's ctime may be set to the current time,
while B's is set to its creation time. Thus the new item, B,
is incorrectly removed as older.
(This interacted especially badly with the bug fixed by
90b4d079605b72bb50d1da41402d994960e10937.)
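A minimal sketch of the intended behaviour (guid record layout and
variable names assumed, not the plugin's exact code):

    # set ctime only when an item is first seen, so a later content
    # change cannot make an old item look newer than a genuinely new one
    if (! defined $guid->{ctime}) {
            $guid->{ctime} = defined $mtime ? $mtime : time;
    }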
|
|
The aggregate state merge code neglected to merge changes to the md5
field of an item. Therefore, if an item's md5 changed after initial
aggregation, it would be updated, and rewritten, each time thereafter.
This was wasteful and indirectly led to some expire problems.
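A hedged sketch of the missing merge step (state layout assumed):

    # when merging the state accumulated while aggregating, also carry
    # over md5, so an unchanged item is not rewritten on every later run
    foreach my $guid (keys %guids) {
            $state{$guid}->{md5} = $guids{$guid}->{md5}
                    if exists $state{$guid};
    }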
|
|
See [[bugs/Aggregated_Atom_feeds_are_double-encoded]]. By default,
XML::Atom outputs strings of UTF-8 bytes with the Perl UTF8 flag stripped
off, which IkiWiki assumes to be Latin-1 and re-encodes as UTF-8 on
output. XML::Feed does not currently (0.41-1) set the magic variable to
change this behaviour (I've filed a bug on CPAN), but IkiWiki can
usefully set the same variable as a workaround.
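The workaround amounts to setting XML::Atom's ForceUnicode flag before
parsing anything:

    use XML::Atom;
    # make XML::Atom return Perl character strings (UTF8 flag on)
    # instead of raw UTF-8 bytes
    $XML::Atom::ForceUnicode = 1;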
|
|
|
|
|
|
This can happen when a new field,
such as the new lasttry, is added.
|
|
aggregation is run, even if the usual time has not passed. Closes: #508622 (Michael Gold)
|
|
|
|
|
|
The old method failed for '[' x 3.
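A sketch of the more robust approach, assuming HTML::Entities: encode
every bracket instead of pattern-matching pairs of them.

    use HTML::Entities;
    sub wikiescape ($) {
            # entity-encode each '[' and ']' so no run of brackets,
            # '[[[' included, can form a wikilink or directive
            return encode_entities(shift, '\[\]');
    }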
|
|
holger reported that decode_utf8 was crashing with perl 5.8.8. Earlier, I
thought that passing 0 to the function avoided this with old perls, but
that was apparently not enough; it still crashed. So, put it inside an
eval, so we can at least recover from it crashing.
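A minimal sketch of that recovery:

    use Encode;
    # decode inside eval; some perl 5.8 versions die on malformed
    # utf-8 here even when passed FB_DEFAULT (0)
    eval { $content = decode_utf8($content, 0) };
    # if the eval trapped a crash, $content is left as the raw bytes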
|
|
links. Since this needs the just-released XML::Feed 0.3, as well as a not-yet-released XML::RSS, it will fall back to the old method if no xml:base info is available.
|
|
The machine-parseable date needs to include a timezone.
Also, simplified the interface for date display.
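For the machine-parseable form, an ISO 8601 style timestamp with an
explicit zone offset suffices; a sketch using POSIX strftime (variable
names assumed):

    use POSIX qw(strftime);
    # e.g. 2008-12-26T17:03:04+0000; the %z offset makes the
    # timestamp unambiguous for machines
    my $machine = strftime("%Y-%m-%dT%H:%M:%S%z", localtime($time));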
|
|
|
|
in the future.
|
|
newpagefile.
Note that newpagefile is not used here (or in recentchanges) because
the internal-use pages they generate are transient and unlikely to
benefit from each being put in its own subdir.
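A sketch of the resulting flat naming (close to the plugin's htmlfn
helper, but treat it as illustrative):

    # aggregated pages are written as page._aggregated (or page.html
    # when aggregateinternal is off), not one page per subdir
    sub htmlfn ($) {
            return shift().".".($config{aggregateinternal} ? "_aggregated" : $config{htmlext});
    }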
|
|
|
|
|
|
I saw this in the wild, apparently a page was not present on disk, but was
in the aggregate db, and not marked as expired either. Not sure how that
happened, but such pages should get marked as expired since they have an
effectively zero ctime.
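A hedged sketch of the guard (field and helper names assumed):

    # a guid recorded in the db but missing from disk effectively
    # has a zero ctime; mark it expired rather than keeping it
    if (! -e "$config{srcdir}/".htmlfn($guid->{page})) {
            $guid->{expired} = 1;
    }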
|
|
|
|
The expiry code does need to make sure to sort in ctime order, even if
expiring by count, so it expires the right ones.
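A minimal sketch of count-based expiry done in ctime order (data
structures assumed):

    # newest first, so everything past $feed->{expirecount} in the
    # sorted list is guaranteed to be the oldest items
    my @items = sort { $b->{ctime} <=> $a->{ctime} } @guids;
    foreach my $item (@items[$feed->{expirecount} .. $#items]) {
            $item->{expired} = 1;
    }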
|
|
elements.
|
|
needs to wait for the pages to be rendered though)
|
|
too many plugins.. brain exploding..
|
|
They were a bit confusing, since they did not actually set the default, and
example values are sufficient.
|
|
|
|
|
|
|
|
This handles deleting empty directories too.
|
|
|
|
Conflicts:
IkiWiki/Plugin/aggregate.pm
|
|
|
|
|
|
Usage:
1. Update all pagespecs that use aggregated pages to use internal()
2. ikiwiki-transition aggregateinternal $srcdir $htmlext
(where $srcdir and $htmlext are the srcdir and htmlext options in
your .setup file)
3. Add aggregateinternal to your .setup file
4. Rebuild the wiki
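For example, with a hypothetical srcdir of ~/wiki and the default html
extension, step 2 becomes:

    ikiwiki-transition aggregateinternal ~/wiki html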
|
|
|
|
|
|
This addresses <http://ikiwiki.info/todo/aggregate_to_internal_pages/>
in a simple way. With this approach, a flag day is required, on which all
users of aggregated pages start to inline them using the internal() pagespec;
after that, the aggregateinternal option can safely be switched on in the
setup file (and the old aggregated pages can be deleted by hand).
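For example (feed location hypothetical), an inline changes from

    [[!inline pages="feeds/*" show="10"]]

to

    [[!inline pages="internal(feeds/*)" show="10"]]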
|
|
Allows specifying the template file that is used to
create the HTML pages.
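A hedged example of a directive using it (feed details made up;
aggregatepost is the default template):

    [[!aggregate name="example feed" url="http://example.com/index.rss"
    template="aggregatepost"]]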
|
|
explicitly pass 0 (FB_DEFAULT) as the second parameter. Apparently perl 5.8 needs this to avoid crashing on malformed UTF-8, despite its docs saying it is the default.
|
|
stuck on shared hosting without cron. (Sheesh.) Enabled via the `aggregate_webtrigger` configuration option.
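A sketch of the two pieces involved (URL host made up):

    # in the setup file:
    aggregate_webtrigger => 1,
    # then fetching this URL triggers aggregation:
    # http://example.com/ikiwiki.cgi?do=aggregate_webtrigger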
|
|
lacking one.
|
|
It is used in several subs, not all of which load it on demand, so this seems simpler.
|
|
|
|
Now aggregation will not lock the wiki. Any changes made during aggregation
are merged in with the changed state accumulated while aggregating. A separate
lock file prevents multiple concurrent aggregators. Garbage collection
of orphaned guids is much improved. loadstate() is only called once
per process, so tricky support for reloading wiki state is not needed.
(Tested fairly thoroughly.)
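In outline (sub names as in the plugin, but read this as a sketch, not
the exact code):

    lockaggregate() || exit;     # one aggregator at a time
    loadstate();
    aggregate(@feeds);           # no wiki lock held while fetching
    IkiWiki::lockwiki();
    mergestate();                # fold in changes made meanwhile
    savestate();
    IkiWiki::unlockwiki();
    unlockaggregate();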
|
|
|