Age | Commit message | Author |
|
During backlink calculation, all links are examined and broken links can
be detected for free, so store a list of broken links and have brokenlinks
use it.
Exposing the %brokenlinks structure is a bit ugly, but the speedup seems
worth it: Around 1 second for wikis the size of the doc wiki that use
brokenlinks.
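A rough sketch of the shape of this, assuming it runs inside IkiWiki where
%links and bestlink() are available (the function name here is made up):

    # Hypothetical sketch, not ikiwiki's real code: broken links fall out of
    # the same pass over %links that builds the backlinks table.
    sub calculate_backlinks_and_brokenlinks {
        my (%backlinks, %brokenlinks);
        foreach my $page (keys %links) {
            foreach my $link (@{$links{$page}}) {
                my $best = bestlink($page, $link);
                if (length $best) {
                    $backlinks{$best}{$page} = 1;
                }
                else {
                    # nothing matches this link, so it is broken
                    push @{$brokenlinks{$link}}, $page;
                }
            }
        }
        return (\%backlinks, \%brokenlinks);
    }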
|
|
By adding this setting, we get both more configurability and a minor
optimisation, since gettext does not need to be called repeatedly
to get the Discussion value.
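For illustration, a minimal sketch of filling the setting in once at startup
(the setting and hook names are assumptions, not necessarily the real ones):

    # Hypothetical checkconfig hook: resolve the translated name a single time
    # and reuse $config{discussionpage} everywhere else.
    sub checkconfig () {
        if (! defined $config{discussionpage}) {
            $config{discussionpage} = gettext("Discussion");
        }
    }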
|
|
This separates style from content - backlinks() performs lossy
transformations on the page names to get them into the form that the page
template wants.
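The sort of transformation meant, sketched with an assumed helper
(backlink_pages()) and assumed template fields:

    # Hypothetical sketch: raw backlink page names become the url/title pairs
    # that a page template typically wants; dropping the directory part is lossy.
    sub backlinks_for_template {
        my $page = shift;
        return map {
            my $short = $_;
            $short =~ s!.*/!!;
            +{ url => urlto($_, $page), page => pagetitle($short) };
        } backlink_pages($page);    # assumed helper returning raw page names
    }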
|
|
Another benefit is that consistently using gettext("Discussion")
eliminates the need to translate one string.
|
|
"discussion", which caused Discussion pages to get unwanted Discussion links.
|
|
Avoids some uninitialised value warnings.
|
|
The machine parseable date needs to include a timezone.
Also, simplified the interface for date display.
|
|
relative, in a very nice way, if I say so myself.
|
|
if desired.
|
|
Previously, if a page changed its type but not its mtime
(e.g. mv page.txt page.mdwn), then it would not be rebuilt.
Now, check if the source of a page has changed,
in which case force a rebuild of that page.
(cherry picked from commit b6a3b8a683fed7a7f6d77a5b3f2dfbd14c849843)
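A minimal sketch of the extra check; the "old" copy of the data and the
needsbuild list are assumed to be available from the surrounding refresh code:

    # Hypothetical sketch: catch renames like page.txt -> page.mdwn that keep
    # the mtime, by comparing the remembered source file with the current one.
    foreach my $page (keys %pagesources) {
        my $old = $oldpagesources{$page};           # assumed: previous run's value
        if (defined $old && $old ne $pagesources{$page}) {
            push @needsbuild, $pagesources{$page};  # force a rebuild of that page
        }
    }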
|
|
can be used to avoid a security check that is a good safe default, but
problematic overkill in some situations.
I decided to underdocument this, because the option looks ugly, and I don't
want people randomly turning it on because it looks like a good idea. So if
you need it, you'll get an error message mentioning how to fix it.
|
|
* Add a postscan hook.
* search: Use postscan hook, avoid updating index when previewing.
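Roughly what using the new hook looks like from a plugin; this is a sketch,
and the parameter names are assumed to mirror the scan hook:

    package IkiWiki::Plugin::searchsketch;      # hypothetical plugin
    use warnings;
    use strict;
    use IkiWiki 2.00;

    sub import {
        hook(type => "postscan", id => "searchsketch", call => \&postscan);
    }

    sub postscan (@) {
        my %params = @_;
        # update the external search index from $params{page} and
        # $params{content}; the point of the change is that previewing is not
        # meant to reach this hook, so the index is left alone then
    }

    1;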
|
|
* Renamed every single variable or function called pedigree to parentlinks
* Removed the parentlinks function from Render.pm
* Enabled the new parentlinks plugin by default
* Adapted testsuite and documentation to reflect the above facts
Signed-off-by: intrigeri <intrigeri@boum.org>
|
|
If hardlinks are enabled, it would hardlink files from the underlay. That
was sorta annoying if you tried to edit by hand for some reason, so let's
not. Files that are hardlinked should be rare enough that a few extra stats
won't hurt.
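What that amounts to, sketched with assumed helper and setting names:
hardlink only when the file really lives in srcdir, and copy underlay files.

    use File::Copy ();

    # Hypothetical sketch: files supplied by an underlay are copied rather
    # than hardlinked into place.
    sub copy_or_link {
        my ($srcfile, $destfile) = @_;
        if ($config{hardlink} && $srcfile =~ m/^\Q$config{srcdir}\E\//) {
            return if link $srcfile, $destfile;
            # if the hardlink fails (e.g. cross-device), fall through to a copy
        }
        File::Copy::copy($srcfile, $destfile)
            or die "failed to copy $srcfile: $!";
    }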
|
|
* The editpage form now uses the raw page name, not the page title, in its
'page' cgi parameter. Using the title was ambiguous and made it impossible
to distinguish between some pages, like "foo/bar" and "foo__47__bar" (illustrated below),
sometimes causing the wrong page to be edited.
* This change means that some edit links need to be updated.
Force a rebuild on upgrade to this version.
* Above change also allowed really fixing escaped slashes from the blogpost
form.
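To see why the title was ambiguous: page names are escaped (__47__ for "/"),
and once pagetitle() turns the escape back into a displayable "/", two
different page names can look the same (a small illustration, assuming an
IkiWiki context for pagetitle()):

    # The two distinct page names below are displayed identically as
    # "foo/bar", so a title-based 'page' parameter cannot tell them apart:
    pagetitle("foo/bar");         # "foo/bar"
    pagetitle("foo__47__bar");    # escaped "/"; also displays as "foo/bar"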
|
|
Because the search plugin needed it, also because it's one of the few
plugins that didn't already have it.
I also considered adding it to htmlize, but I really cannot imagine caring
what the destpage is when htmlizing. (I'll probably be proven wrong later.)
|
|
wikilinks added by filters from being scanned properly. But no known filter hook does that, and calling filters unnecessarily during scan slowed down complex filters such as the one used to update the xapian index.
|
|
This allows plugins to getopt and change what is done before an incorrect
line is printed.
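For context, this is the usual shape of a plugin getopt hook; the option
here is invented for the example:

    # Sketch of a getopt hook: pass_through leaves options the plugin does not
    # know about for ikiwiki itself to deal with.
    sub getopt () {
        eval q{use Getopt::Long};
        error($@) if $@;
        Getopt::Long::Configure('pass_through');
        GetOptions("example-option=s" => \$config{example_option});
    }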
|
|
pruning not yet implemented, however
|
|
number of system calls in half. (Still room for improvement.)
|
|
on the same filesystem and the wiki includes large media files, which would normally be copied, wasting time and space.
|
|
internal pages won't be in revision control so this avoids some ugly noise
|
|
custom, first-class types of wikilinks.
* Move standard wikilink implementation to a new wikilink plugin, which
will of course be enabled by default.
|
|
There are several cases (recentchanges files, aggregated files)
where some source files are not in revision control.
|
|
since this leads to too many problems with web caching, especially with
inlined pages. Properly solving this would involve tracking every page
that contributes to a page's content and using the youngest of them all,
as well as special cases for things like the version plugin, and it's just
too complex to do.
|
|
scan() does too much. All that is needed is to preprocess the internal page
in scan-only mode.
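A sketch of the narrower call meant here, assuming IkiWiki::preprocess()
takes a trailing flag that restricts it to scanning:

    # Hypothetical sketch: scan a single internal page's directives instead of
    # running the full scan() machinery on it ($page assumed to be in scope).
    my $content = readfile(srcfile($pagesources{$page}));
    IkiWiki::preprocess($page, $page, $content, 1);   # last arg: scan-only (assumed)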
|
|
license, and copyright. This can be used to create custom RecentChanges.
* meta: To support the pagespec functions, metadata about pages has to be
retained as pagestate.
* Fix encoding bug when pagestate values contained spaces.
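A sketch of how metadata kept in pagestate can back a pagespec function;
the directive, field, and function names here are all made up:

    # Hypothetical preprocess hook for a directive that records a license:
    sub preprocess (@) {
        my %params = @_;
        $pagestate{$params{page}}{meta}{license} = $params{license}
            if exists $params{license};
        return "";
    }

    # Hypothetical pagespec function reading that state back:
    package IkiWiki::PageSpec;

    sub match_license ($$;@) {
        my ($page, $wanted) = @_;
        my $value = $IkiWiki::pagestate{$page}{meta}{license};
        if (defined $value && $value =~ /\Q$wanted\E/i) {
            return IkiWiki::SuccessReason->new("license matches");
        }
        return IkiWiki::FailReason->new("license does not match");
    }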
|
|
I kept it to a simple global configuration, rather than using the
preprocessor directive for recentchanges, because that had chicken and egg
problems and seemed overcomplicated. This should work reasonably well,
though it would be good to add some more metadata so that more customised
recentchanges pages can be made.
|
|
This makes it a lot quicker to deal with lots of recentchanges pages
appearing and disappearing. It avoids needing to clutter up pagespecs with
exclusions for those pages, by making normal pagespecs not match them.
|
|
This is important to do because until will_render is called, ikiwiki doesn't
know that the page exists. This avoids recentchanges re-writing every change
page every run.
|
|