Age | Commit message | Author |
|
|
|
Besides being wrong to do, this could lead to the wrong item
being expired, as follows: If B is added and at the same time
A is changed, then A's ctime may be set to the current time,
while B's is set to its creation time. Thus the new item, B,
is incorrectly removed as the older one.
(This interacted especially badly with the bug fixed by
90b4d079605b72bb50d1da41402d994960e10937.)
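For illustration only (made-up data, not the aggregate plugin's real code), here is the failure mode in miniature: sort by ctime, keep the newest, and the item whose ctime was wrongly bumped survives while the genuinely new one is expired.

    use strict;
    use warnings;

    my %items = (          # hypothetical: guid => ctime (epoch seconds)
        A => time(),       # changed item, wrongly stamped with the current time
        B => time() - 60,  # newly added item, stamped with its real creation time
    );

    my $keep = 1;                                                  # keep only the newest item
    my @by_age  = sort { $items{$b} <=> $items{$a} } keys %items;  # newest first
    my @expired = splice @by_age, $keep;                           # everything past the cutoff
    print "expired: @expired\n";   # prints "expired: B": the new item goes, not the stale one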
|
|
The aggregate state merge code neglected to merge changes to the md5
field of an item. Therefore, if an item's md5 changed after initial
aggregation, it would be updated, and rewritten, each time thereafter.
This was wasteful and indirectly led to some expire problems.
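A minimal sketch of the kind of merge described (field names here are hypothetical): md5 simply has to be carried across like everything else, or the stored value stays stale and the item looks changed on every run.

    use strict;
    use warnings;

    # Hypothetical field list; the real plugin tracks more per-item state.
    my @fields = qw(title link content ctime md5);

    sub merge_item_state {
        my ($stored, $fresh) = @_;     # hash refs of item fields
        foreach my $field (@fields) {
            $stored->{$field} = $fresh->{$field}
                if exists $fresh->{$field};
        }
    }

    my %stored = (md5 => 'old');
    merge_item_state(\%stored, { md5 => 'new', title => 'hi' });   # md5 now kept in sync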
|
|
|
|
|
|
|
|
available.
The test suite was emitting a lot of ugly gettext warnings;
setting LC_ALL didn't solve the problem for all locale setups
(since ikiwiki remaps it to LANG, and ikiwiki didn't know about
the C locale).
People also seem annoyed by the messages when
Locale::Gettext is not installed, and I suspect they will be
happier if it just silently doesn't localize.
The optimisation came about when I noticed that the gettext
sub was doing rather a lot of work each call just to see
if localisation is needed. We can avoid that work by caching,
and the best thing to cache is a version of the gettext sub
that does exactly the right thing.
This was slightly complicated by the locale setting,
which might need to override the original locale (or lack
thereof) after gettext has already been called, so the cache
has to be invalidated in that case. That invalidation used to be
done via a global variable, which I am happy to have also gotten rid of.
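Roughly the shape of that, as a sketch rather than the actual IkiWiki.pm code (the names here are made up): build the right sub once, cache it, and have the locale-setting code drop the cache instead of poking a global.

    use strict;
    use warnings;

    my $cached_gettext;   # caches a sub that either translates or passes text through

    sub clear_gettext_cache {   # hypothetical: called by the locale-setting code
        undef $cached_gettext;
    }

    sub gettext {
        my $text = shift;
        $cached_gettext ||= do {
            if (eval { require Locale::gettext; 1 }
                && ($ENV{LANG} || $ENV{LC_ALL})) {
                my $domain = Locale::gettext->domain('ikiwiki');
                sub { $domain->get(shift) };    # localisation available: translate
            }
            else {
                sub { shift };                  # otherwise, silently don't localize
            }
        };
        return $cached_gettext->($text);
    }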
|
|
Conflicts:
debian/changelog
debian/control
Signed-off-by: intrigeri <intrigeri@boum.org>
|
|
support several cases, including Mercurial's long user names on the RecentChanges page, and URLs with spaces being handled by the 404 plugin.
|
|
|
|
can be loaded and that its config is OK. If a plugin fails for any reason, disable it in the generated file. Closes: #532001
|
|
Do not allow an unterminated """ string to be treated as a series of bare
words. Fixes runaway regexp recursion/backtracking in strange situations.
(See 1d57a21c987a5e970df01efe10acdf69982c2d61 for test case.)
|
|
"discussion", which caused Discussion pages to get unwanted Discussion links.
|
|
and document the comment directive syntax.
Rationale: Comments need to be user-editable so that they can be posted
via git commit etc.
The _comment directive is still supported, for back-compat.
|
|
|
|
|
|
change. Closes: #530502
|
|
format: Provide a htmlizefallback hook that other plugins can use to
handle formats that are not suitable for general-purpose htmlize hooks.
highlight: Use the hook to allow formatting of any language/extension,
without it needing to be enabled for standalone source files.
highlight: If the highlight perl binding is not available, fallback
safely to a passthrough mode.
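As a hedged sketch of what a plugin using the new hook might look like (the hook's argument list and return convention here are assumptions from the description above, not the documented API):

    package IkiWiki::Plugin::myhighlight;   # hypothetical example, not the real plugin

    use warnings;
    use strict;
    use IkiWiki 3.00;
    use HTML::Entities;

    sub import {
        hook(type => "htmlizefallback", id => "myhighlight", call => \&fallback);
    }

    sub fallback {
        my $format  = shift;   # assumed arguments; check the plugin docs for the real ones
        my $content = shift;

        if (my $render = load_highlighter($format)) {
            return $render->($content);
        }

        # Safe passthrough when no highlighter binding is available.
        return "<pre>" . encode_entities($content) . "</pre>";
    }

    sub load_highlighter {
        # Placeholder: probe the highlight binding here; returning undef means
        # this plugin cannot render the format either.
        return undef;
    }

    1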
|
|
* debian/control: Add suggests for libhighlight-perl, although
that package is not yet created by Debian's highlight source package.
(See #529869)
|
|
directive starting with _ is likewise internal.
|
|
by plugins in the index. Fix this bug.
|
|
|
|
need a srcdir.
|
|
Conflicts:
debian/changelog
|
|
And avoid a whole class of potential security problems (though
none that I know of actually exist), by not performing any string
interpolation on user-supplied data when translating pagespecs.
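The idea, in a generic sketch (this is not the actual pagespec_translate change): dynamically built code should only ever reach user data through a captured variable, never by splicing the data's text into the code.

    use strict;
    use warnings;

    # Risky pattern: user data becomes part of the code that gets eval'd,
    # so a crafted $word could inject arbitrary Perl.
    sub compile_match_unsafe {
        my $word = shift;
        return eval 'sub { my $page = shift; $page eq "' . $word . '" }';
    }

    # Safer pattern: the generated code refers to the data only through a
    # captured lexical, so the data's content is never parsed as code.
    sub compile_match_safe {
        my $word = shift;
        my $code = 'sub { my $page = shift; $page eq $word }';
        return eval $code;
    }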
|
|
|
|
underlays via add_underlay.
|
|
.ikiwiki, abort with an error rather than creating it.
|
|
|
|
This is sorta an optimisation, and sorta a bug fix. In one
test case I have available, it can speed a page build up from 3
minutes to 3 seconds.
The root of the problem is that $links{$page} contains arrays of
links, rather than hashes of links. And when a link is found,
it is just pushed onto the array, without checking for dups.
Now, the array is emptied before scanning a page, so there
should not be much opportunity for duplicate links
to pile up in it. But, in some cases, they can, and if there
are hundreds of duplicate links in the array, then scanning it
for matching links, as match_link and some other code does,
becomes much more expensive than it needs to be.
Perhaps the real right fix would be to change the data structure
to a hash. But the list of links is never accessed like that;
you always want to iterate through it.
I also looked at deduping the list in saveindex, but that does
a lot of unnecessary work, and doesn't completely solve the problem.
So, finally, I decided to add an add_link function that handles deduping,
and make ikiwiki-transition remove the old dup links.
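The helper is essentially just a guarded push; something along these lines (see IkiWiki.pm for the real add_link):

    use strict;
    use warnings;

    our %links;   # page => array ref of links, as described above

    sub add_link {
        my ($page, $link) = @_;
        $links{$page} ||= [];
        push @{$links{$page}}, $link
            unless grep { $_ eq $link } @{$links{$page}};
    }

The grep keeps the list representation that the rest of the code iterates over, at the cost of a linear scan per added link, which is cheap compared to letting duplicates accumulate.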
|
|
When finding the pageurl, it was calling bestlink unnecessarily.
Since at this point $page contains the full name of the page that
is being inlined, there is no need to do bestlink's scan
for it.
This is only a minor optimisation, since bestlink is only called
once per displayed, inlined page.
|
|
nonexistent directories with some broken perl versions.
|
|
Should wait to upload until ikiwiki is fixed in testing.
|
|
|
|
Uses a new debhelper feature to turn off python-support.
The tiny Python module included herein certainly doesn't
need all the python-support nonsense.
|
|
|
|
|
|
* pagespec_match_list: New API function, matches pages in a list
and throws an error if the pagespec is bad.
* inline, brokenlinks, calendar, linkmap, map, orphans, pagecount,
pagestate, postsparkline: Display a handy error message if the pagespec
is erroneous.
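A hedged sketch of how a plugin might call pagespec_match_list and surface the error; the argument order and parameter names are assumptions based on the description above, so check IkiWiki.pm before copying:

    package IkiWiki::Plugin::example;   # hypothetical plugin

    use warnings;
    use strict;
    use IkiWiki 3.00;

    sub preprocess {
        my %params = @_;

        my @matches = eval {
            pagespec_match_list([ keys %pagesources ], $params{pages},
                location => $params{page});
        };
        # A bad pagespec throws; turn that into a readable message on the page.
        error(sprintf(gettext("cannot match pages: %s"), $@)) if $@;

        return join(", ", @matches);
    }

    1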
|
|
wish to, if the configuration makes signin optional for commenting.
|
|
* Add IkiWiki::ErrorReason objects, and modify pagespecs to return
them in cases where they fail to match due to a configuration or syntax
error.
* inline: Display a handy error message if the inline cannot display any
pages due to such an error.
This is perhaps somewhat incomplete, as other users of pagespecs do not
display the error, and will eventually need modifications similar to those made to inline.
I should probably factor out a pagespec_match_all function and make it throw
ErrorReasons.
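The idea, as a standalone sketch (the real classes live in IkiWiki.pm and differ in detail): a failed match is an object that is false in boolean context but stringifies to an explanation, and ErrorReason is a subclass so callers can distinguish "did not match" from "this pagespec is broken".

    use strict;
    use warnings;

    package FailReason;
    use overload
        bool => sub { 0 },
        '""' => sub { ${ $_[0] } },
        fallback => 1;
    sub new { my ($class, $msg) = @_; return bless \$msg, $class }

    package ErrorReason;
    our @ISA = ('FailReason');

    package main;

    my $reason = ErrorReason->new("syntax error in pagespec");
    if (!$reason) {                     # reads as a failed match
        print "no match: $reason\n";    # but still explains itself
        print "pagespec itself is broken\n" if $reason->isa('ErrorReason');
    }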
|
|
If the server's clock runs a bit ahead of the web browsing client's,
relativedate could produce somewhat confusing displays like "3 seconds
from now" for just-posted things.
As a hack, avoid displaying times in the future if they're less than a
small slip forward. I chose 30 minutes because both client and server could
be wrong in different directions, while it's still close enough that "just
now" is not horribly wrong.
|
|
|
|
printed at that point if it's not available, allowing the admin to see it during wiki setup. Closes: #520015
|
|
|
|
|
|
Conflicts:
debian/changelog
debian/control
|
|
|
|
|
|
This won't change it normally, and is useful when using the archive
format to display some aggregated feed titles.
|
|
|