* comments: Comments pending moderation are now stored in the srcdir
alongside accepted comments, but with a `._comment_pending` extension.
* This allows easier by-hand moderation, as the "_pending" extension need
  only be stripped off and the comment committed to version control.
* The `comment_pending()` pagespec can be used to match such unmoderated
  comments, which makes it easy to add a feed of them, or a counter
  indicating how many there are (see the sketch below this list).
* Belatedly added a `comment()` pagespec.
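A hedged usage example for these pagespecs, assuming ikiwiki's stock
`inline` and `pagecount` directives; the glob and wording are
illustrative, not taken from the changelog:

    [[!inline pages="comment_pending(*)" archive=yes feeds=yes
    description="Comments awaiting moderation"]]

    There are [[!pagecount pages="comment_pending(*)"]] comments awaiting moderation.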
|
to ensure those wrong, massive depends_simple entries don't linger on
systems that already rebuilt for the other reasons
|
Conflicts:
debian/NEWS
|
debian/NEWS and debian/postinst should be edited before release to have
an appropriate version number.
|
Conflicts:
doc/bugs/transitive_dependencies.mdwn
|
directive starting with _ is likewise internal.
|
by plugins in the index. Fix this bug.
|
This is sorta an optimisation, and sorta a bug fix. In one
test case I have available, it can speed a page build up from 3
minutes to 3 seconds.
The root of the problem is that $links{$page} contains arrays of
links, rather than hashes of links. And when a link is found,
it is just pushed onto the array, without checking for dups.
Now, the array is emptied before scanning a page, so there
should not be a lot of opportunity for lots of duplicate links
to pile up in it. But, in some cases, they can, and if there
are hundreds of duplicate links in the array, then scanning it
for matching links, as match_link and some other code does,
becomes much more expensive than it needs to be.
Perhaps the real right fix would be to change the data structure
to a hash. But the list of links is never accessed like that;
you always want to iterate through it.
I also looked at deduping the list in saveindex, but that does
a lot of unnecessary work, and doesn't completely solve the problem.
So, finally, I decided to add an add_link function that handles deduping,
and make ikiwiki-transition remove the old dup links.
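A minimal sketch of the deduping approach described above, in the spirit
of ikiwiki's Perl internals; the name add_link comes from the text, but
the body here is illustrative rather than the actual implementation:

    sub add_link ($$) {
    	my $page=shift;
    	my $link=shift;
    	# only record the link if it is not already present for this page,
    	# so $links{$page} stays free of duplicates
    	push @{$links{$page}}, $link
    		unless grep { $_ eq $link } @{$links{$page}};
    }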
|
ikiwiki setups use. Document how to use the new url form.
|
This is easier to remember, and less error-prone than passing it all the
pages in the wiki.
|
* teximg: The prefix is configurable (see the setup sketch after this
  list), and has changed to not include the nonstandard mhchem by
  default. (willu)
* teximg: dvipng is used if available to render images. Its output is
antialiased and better than dvips. If not available, the old dvips+convert
chain will be used. (willu)
* Drop suggests on texlive-science, add suggests on dvipng.
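A hedged example of what the configurable prefix might look like in a
setup file; the option name teximg_prefix is an assumption based on the
plugin name, and the LaTeX preamble is only an illustration (note that
mhchem is no longer included by default):

    # hypothetical setup-file snippet; the exact option name may differ
    teximg_prefix => '\documentclass{article}
    \usepackage{amsmath}
    \usepackage{amssymb}
    \pagestyle{empty}
    \begin{document}',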
|
Also fixed a bug in how aggregateinternal used IkiWiki::Setup::load,
and added checks for arguments to other subcommands.
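A rough sketch, not the actual code, of what checking a subcommand's
argument before calling IkiWiki::Setup::load might look like; usage()
stands in for whatever error handling ikiwiki-transition uses:

    sub aggregateinternal {
    	my $setup=shift;
    	usage() unless defined $setup;   # complain if the setup file is missing
    	require IkiWiki::Setup;
    	IkiWiki::Setup::load($setup);    # read config from the setup file
    	# ... then perform the aggregateinternal migration ...
    }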
|
* The editpage form now uses the raw page name, not the page title, in its
  'page' cgi parameter. Using the title was ambiguous and made it
  impossible to distinguish between some pages, like "foo/bar" and
  "foo__47__bar", sometimes causing the wrong page to be edited (see the
  example after this list).
* This change means that some edit links need to be updated.
Force a rebuild on upgrade to this version.
* The above change also made it possible to properly fix escaped slashes
  from the blogpost form.
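For illustration only (the host and page names are made up): with this
change, an edit link for the page foo/bar is expected to look roughly like

    http://example.com/ikiwiki.cgi?do=edit&page=foo/bar

whereas before, two different pages such as foo/bar and foo__47__bar could
both present the same title in the 'page' parameter, so the wrong one
could end up being edited.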
|
- Add a Help link.
- If the pageterm is too long, hash it.
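Assuming "pageterm" here refers to the term built from a page name for
the search index, a hedged sketch of the "hash it when too long" idea;
the length cutoff, hash function, and term prefix are illustrative
guesses, not the plugin's actual values:

    sub pageterm ($) {
    	my $page=shift;
    	# very long page names would produce an over-long index term,
    	# so fall back to a fixed-length hash of the name
    	if (length $page > 240) {
    		eval q{use Digest::SHA};
    		return undef if $@;
    		return "U".lc(Digest::SHA::sha1_hex($page));
    	}
    	return "U".$page;
    }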
|
Everything but the actual coding to support them.
|
recentchangediff to work with svn repos.
|
If we have transitions of this sort in the future, this program will
hopefully be used to handle them too.
|