Age | Commit message | Author
|
a newer perl module supports it. However, based on its bug
reports at
<http://rt.cpan.org/Public/Dist/Display.html?Name=Net-OpenID-Consumer>,
there may be some problems (MySpace is known not to work,
for example).
|
|
bottom of the target page correctly.
|
|
this location, I'm creating a redirect rather than just fixing one broken link.
|
|
Signed-off-by: intrigeri <intrigeri@boum.org>
|
|
and document the comment directive syntax.
Rationale: Comments need to be user-editable so that they can be posted
via git commit etc.
The _comment directive is still supported, for back-compat.
|
|
on Danish translation.
|
|
format: Provide a htmlizefallback hook that other plugins can use to
handle formats that are not suitable for general-purpose htmlize hooks.
highlight: Use the hook to allow formatting of any language/extension,
without it needing to be enabled for standalone source files.
highlight: If the highlight perl binding is not available, fall back
safely to a passthrough mode.
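
For illustration, here is a minimal plugin sketch of how such an
htmlizefallback hook could be registered, assuming it follows ikiwiki's
usual hook() registration pattern; the callback's parameters and the two
helper functions are assumptions, not the real highlight plugin code.

    #!/usr/bin/perl
    # Hypothetical sketch; the callback arguments and the helpers
    # can_highlight() and highlight_as_html() are assumed, not real.
    package IkiWiki::Plugin::sourcefallback;
    use warnings;
    use strict;
    use IkiWiki 3.00;

    sub import {
        hook(type => "htmlizefallback", id => "sourcefallback",
            call => \&fallback);
    }

    sub fallback {
        my $format = shift;   # assumed: the page type/extension
        my $content = shift;  # assumed: the raw page content
        # Decline formats this plugin cannot handle, so ikiwiki can
        # continue with its normal handling of the page.
        return undef unless can_highlight($format);
        return highlight_as_html($format, $content);
    }

    1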
|
|
* debian/control: Add suggests for libhighlight-perl, although
that package is not yet created by Debian's highlight source package.
(See #529869)
|
|
about checking new posts in to VCS
|
|
Conflicts:
debian/changelog
|
|
This is sorta an optimisation, and sorta a bug fix. In one
test case I have available, it can speed a page build up from 3
minutes to 3 seconds.
The root of the problem is that $links{$page} contains arrays of
links, rather than hashes of links. And when a link is found,
it is just pushed onto the array, without checking for dups.
Now, the array is emptied before scanning a page, so there
should not be a lot of opportunity for lots of duplicate links
to pile up in it. But, in some cases, they can, and if there
are hundreds of duplicate links in the array, then scanning it
for matching links, as match_link and some other code does,
becomes much more expensive than it needs to be.
Perhaps the real right fix would be to change the data structure
to a hash. But the list of links is never accessed like that;
you always want to iterate through it.
I also looked at deduping the list in saveindex, but that does
a lot of unnecessary work, and doesn't completely solve the problem.
So, finally, I decided to add an add_link function that handles deduping,
and make ikiwiki-transition remove the old dup links.
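
A minimal sketch of the add_link deduping described above (the actual
function in IkiWiki.pm may differ in details such as prototypes and
taint handling); %links is ikiwiki's global hash of per-page link arrays.

    # Only record the link if the page does not already have it,
    # so $links{$page} stays free of duplicates.
    sub add_link ($$) {
        my $page = shift;
        my $link = shift;

        push @{$links{$page}}, $link
            unless grep { $_ eq $link } @{$links{$page}};
    }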
|
|
a restricted wiki
|
|
* pagespec_match_list: New API function, matches pages in a list
and throws an error if the pagespec is bad.
* inline, brokenlinks, calendar, linkmap, map, orphans, pagecount,
pagestate, postsparkline: Display a handy error message if the pagespec
is erroneous.
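
As a hedged usage sketch (the exact argument order and parameter names of
pagespec_match_list are assumptions here, not the documented API), a plugin
might call it like this and let the thrown error surface as its message:

    # Hypothetical call; argument order and option names are assumed.
    my @matching = pagespec_match_list(
        [ keys %pagesources ],   # candidate pages (assumed form)
        $params{pages},          # the user-supplied pagespec
        location => $params{page},
    );
    # If the pagespec is bad, the call dies via error(), so plugins such
    # as inline show a useful message instead of silently matching nothing.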
|
|
* Add IkiWiki::ErrorReason objects, and modify pagespecs to return
them in cases where they fail to match due to a configuration or syntax
error.
* inline: Display a handy error message if the inline cannot display any
pages due to such an error.
This is perhaps somewhat incomplete, as other users of pagespecs do not
display the error, and will eventually need modifications similar to inline's.
I should probably factor out a pagespec_match_all function and make it throw
ErrorReasons.
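
To illustrate the distinction, here is a sketch of a pagespec match
function returning the different reason objects; match_foo, its argument,
and page_has_foo are hypothetical, while IkiWiki::ErrorReason is the class
added above and SuccessReason/FailReason are ikiwiki's existing classes.

    package IkiWiki::PageSpec;

    sub match_foo ($$;@) {
        my $page = shift;
        my $arg  = shift;

        if (! length $arg) {
            # A syntax/configuration problem is an error, not a mere
            # non-match, so callers such as inline can show the reason.
            return IkiWiki::ErrorReason->new("foo() needs an argument");
        }
        return page_has_foo($page, $arg)   # hypothetical test
            ? IkiWiki::SuccessReason->new("$page has foo $arg")
            : IkiWiki::FailReason->new("$page does not have foo $arg");
    }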
|
|
Conflicts:
debian/changelog
debian/control
|
|
Signed-off-by: intrigeri <intrigeri@boum.org>
|
|
The plugin list inlines all pages under plugins with a few exceptions, and
would have included this page. Moving it to discussion avoids the problem.
|
|
(Still a few bits I haven't bothered to comprehend in detail.)
|
|
Conflicts:
doc/plugins/contrib/po.mdwn
|