Age | Commit message | Author |
|
plugins. (Particularly a win when using external plugins.)
|
|
Add an inject function that can be used by plugins that want to replace
one of ikiwiki's functions with their own version. (This is a scary thing
that grubs through the symbol table, and replaces all exported occurrences
of a function with the injected version.)
external: RPC functions can be injected to replace exported functions.
Remove the stupid displaytime hook, and use injection instead.
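As a rough illustration of the mechanism (not ikiwiki's actual code; the
function name, the main::-rooted stash walk, and the displaytime example
are assumptions), symbol-table injection in Perl can look like this:

use strict;
use warnings;

sub inject {
	my ($name, $new) = @_;	# e.g. inject("IkiWiki::displaytime", sub { ... })
	no strict 'refs';
	no warnings 'redefine';
	my $old = \&{$name};
	my @queue = ('main::');
	my %seen;
	while (my $stash = shift @queue) {
		next if $seen{$stash}++;
		foreach my $sym (keys %{$stash}) {
			if ($sym =~ /::$/) {
				# nested package stash; walk it too, skipping main::'s self-reference
				push @queue, $stash.$sym unless $sym eq 'main::';
				next;
			}
			my $full = $stash.$sym;
			next unless defined &{$full};
			# re-point every exported alias of the old sub at the new one
			*{$full} = $new if \&{$full} == $old;
		}
	}
	*{$name} = $new;	# make sure the canonical name is replaced too
}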
|
|
toplevel tagpage, and not closer subpages.
The html links already went there, but internally the links were not
recorded as absolute, which could cause confusing backlinks etc.
For example, with tagbase=tags, if blog/tags/bar existed and blog/foo was
tagged bar, it would link to /tags/bar. But, the link would be recorded
simply as a link to tags/bar, and so later blog/tags/bar would appear to
have the backlink.
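A hedged sketch of the direction of the fix (tagpage() and the surrounding
details are illustrative, not necessarily the plugin's exact code): with
tagbase set, build the tag link with a leading slash so the recorded link
is absolute, matching where the html link already points.

our %config;	# stands in for ikiwiki's configuration hash

sub tagpage {
	my $tag = shift;
	if (exists $config{tagbase} && defined $config{tagbase}) {
		return "/".$config{tagbase}."/".$tag;	# e.g. "/tags/bar", never "tags/bar"
	}
	return $tag;
}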
|
|
feed links. So rss will be included along with atom, and pages with multiple feeds will get links added for all feeds.
|
|
utf-8 characters are written out as such, and not as the encoded perl strings the C Data::Dumper produces.
Note that the text produced by the C version was interpreted fine
when ikiwiki loaded the setup file. But it was not user-friendly.
|
|
entity-encoding the wikiname in the session cookie.
|
|
The machine parseable date needs to include a timezone.
Also, simplified the interface for date display.
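A hedged sketch of what a machine-parseable date with a timezone can look
like (the helper name is an assumption; the point is formatting in UTC with
an explicit "Z" designator):

use POSIX qw(strftime);

sub machine_date {
	my $time = shift;
	return strftime("%Y-%m-%dT%H:%M:%SZ", gmtime($time));	# e.g. 2008-10-20T18:15:30Z
}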
|
|
Need to handle the case where url is not set.
|
|
relative, in a very nice way, if I say so myself.
|
|
* Add an underlay for javascript, and add ikiwiki.js containing some utility
code.
* toggle: Stop embedding the full toggle code on each page using it, and
move it to toggle.js in the javascript underlay.
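A hedged sketch of the idea (the hook shape, class name, and path are
assumptions): instead of embedding the toggle code, a format-stage hook can
add a single <script> tag pointing at the underlay's toggle.js whenever a
page uses the toggle markup.

sub format_toggle {
	my %params = @_;
	if ($params{content} =~ /class="toggleable/ &&
	    $params{content} !~ m{toggle\.js}) {
		$params{content} =~ s{</body>}{<script src="ikiwiki/toggle.js" type="text/javascript"></script>\n</body>}i;
	}
	return $params{content};
}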
|
|
|
in the future.
|
|
plugins. Closes: #502047
|
|
for. This supports most of the ACL-type things users have been wanting.
Closes: #443346 (It does not control who can read a page, but that's out
of scope for ikiwiki.)
|
|
This avoids another one of those $_ scoping issues where a deep call to a
function that changes $_ clobbers the array that is being looped over.
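An illustration of the problem (the sub names are made up): a foreach over
the implicit $_ aliases each element to $_, so a deep callee that assigns
to $_ rewrites the array; a lexical loop variable avoids it.

use strict;
use warnings;

sub deep_call { $_ = "clobbered"; }	# e.g. something doing while (<$fh>) { ... }

my @pages = qw(index sandbox);
foreach (@pages) {
	deep_call();	# the current element of @pages becomes "clobbered"
}

my @safe = qw(index sandbox);
foreach my $page (@safe) {
	deep_call();	# @safe is untouched; only the unrelated global $_ changes
}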
|
|
It makes sense to use bestlink to determine which page rootpage refers to,
but if no page matches, just use the raw value.
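A hedged sketch of the fallback (assumes ikiwiki's bestlink() is available;
the surrounding parameter handling is illustrative):

sub resolve_rootpage {
	my %params = @_;	# e.g. (page => "blog/foo", rootpage => "blog")
	my $rootpage = bestlink($params{page}, $params{rootpage});
	$rootpage = $params{rootpage} unless length $rootpage;	# no match: use the raw value
	return $rootpage;
}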
|
|
Conflicts:
debian/changelog
|
|
authentication. Closes: #500524
|
|
This is the easy part of supporting foo/index.mdwn sources for page foo.
Note that if foo.mdwn exists too, there will be a warning about multiple
sources for the same page, and which is used is indeterminate.
indexpages should also cause web-based editing to create index source pages
by default; this and other fallout of the option is not yet implemented.
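A hedged sketch of the source-to-page mapping (the extension handling is
simplified; ikiwiki really consults pagetype() and friends here):

our %config;	# stands in for ikiwiki's configuration hash

sub pagename {
	my $file = shift;			# e.g. "foo/index.mdwn"
	my $page = $file;
	$page =~ s/\.mdwn$//;			# simplified extension strip
	if ($config{indexpages} && $page =~ m{^(.+)/index$}) {
		$page = $1;			# foo/index.mdwn is a source for page foo
	}
	return $page;
}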
|
|
files rendered during page preview.
|
|
page, and is preserved across rebuilds.
|
|
Upgrades to the new index format should be transparent.
The version field is 3, because 1 was the old textual index, 2 was the
pre-versioned format.
This also includes some efficiency improvements to index loading, by
not copying a hash and using a reference.
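A hedged sketch of a versioned, Storable-based index along the lines
described above (field names and layout are illustrative, not ikiwiki's
actual on-disk format):

use Storable qw(store retrieve);

my %index = (
	version => 3,
	page => {
		index => { ctime => 1222222222, links => ["sandbox"] },
	},
);
store(\%index, "indexdb");

my $in = retrieve("indexdb");
if (($in->{version} || 0) == 3) {
	foreach my $page (keys %{$in->{page}}) {
		my $d = $in->{page}{$page};	# keep the reference; no per-page hash copy
		# ... populate state for $page from $d ...
	}
}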
|
|
toplevel templates directory.
|
|
* htmltidy: Avoid returning undef if tidy fails. Also avoid returning the
untidied content if tidy crashes. In either case, it seems best to tidy
the content to nothing.
* htmltidy: Avoid spewing tidy errors to stderr.
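A hedged sketch of that failure handling (the flags and plumbing are
assumptions, not the plugin's exact code): discard tidy's stderr, and
return an empty string whenever tidy cannot be run or produces nothing.

use IPC::Open2;

sub tidy_sanitize {
	my $content = shift;
	my ($out, $in);
	my $pid = eval {
		# shell redirection keeps tidy's warnings off our stderr
		open2($out, $in, "tidy -quiet -asxhtml --show-body-only yes 2>/dev/null");
	};
	return "" unless $pid;			# tidy missing or failed to start
	local $SIG{PIPE} = 'IGNORE';		# don't die if tidy exits early
	binmode($_, ':utf8') foreach ($in, $out);
	print $in $content;
	close $in;
	local $/ = undef;
	my $ret = <$out>;
	close $out;
	waitpid $pid, 0;
	return defined $ret ? $ret : "";	# never undef, never the untidied input
}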
|
|
acting on a set of pages.
|
|
tagbase, when it's set.
|
|
Seems that the problem is that once the \nnn coming from git is converted
to a single character, decode_utf8 decides that this is a standalone
character, and not part of a multibyte utf-8 sequence, and so does nothing.
I tried playing with the utf-8 flag, but that didn't work. Instead, use
decode("utf8"), which doesn't have the same qualms, and successfully
decodes the octets into a utf-8 character.
Rant:
Think for a minute about the fact that any and every program that parses git-log,
or git-show, etc output to figure out what files were in a commit needs to
contain this snippet of code, to convert from git-log's wacky output to a
regular character set:
if ($file =~ m/^"(.*)"$/) {
	($file=$1) =~ s/\\([0-7]{1,3})/chr(oct($1))/eg;
}
(And it's only that "simple" if you don't care about filenames with
embedded \n or \t or other control characters.)
Does that strike anyone else as putting the parsing and conversion in the
wrong place (ie, in gitweb, ikiwiki, etc, etc)? Doesn't anyone who actually
uses git with utf-8 filenames get a bit pissed off at seeing \xxx\xxx
instead of the utf-8 in git-commit and other output?
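Pulling those pieces together, a hedged sketch of the whole conversion (the
helper name is made up):

use Encode;

sub parse_git_filename {
	my $file = shift;			# a filename as printed by git-log/git-show
	if ($file =~ m/^"(.*)"$/) {
		($file = $1) =~ s/\\([0-7]{1,3})/chr(oct($1))/eg;
	}
	return decode("utf8", $file);		# decode the resulting octets
}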
|
|
I saw this in the wild: apparently a page was not present on disk, but was
in the aggregate db, and not marked as expired either. Not sure how that
happened, but such pages should get marked as expired, since they have an
effectively zero ctime.
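A hedged sketch of that rule (%guids and %pagesources here are stand-ins
for the aggregate plugin's and ikiwiki's real state):

our (%guids, %pagesources);

foreach my $guid (values %guids) {
	if (defined $guid->{page} && ! exists $pagesources{$guid->{page}}) {
		$guid->{expired} = 1;	# no source on disk; ctime is effectively zero
	}
}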
|
|