author	intrigeri <intrigeri@boum.org>	2009-06-06 14:03:40 +0200
committer	intrigeri <intrigeri@boum.org>	2009-06-06 14:03:40 +0200
commit	86edd73d169600875a10a635ef8df4a644545b0d (patch)
tree	1216eb826f2da7a1c11d84395f25468d1acfa69c /doc
parent	17b3d73f6e65d6a754633902b0dd4716d53b03a9 (diff)
parent	e40d2a6b2b1bdf677f11cc4a71595acf609d1e75 (diff)
Merge commit 'upstream/master' into pub/po
Conflicts:
	debian/changelog
	debian/control

Signed-off-by: intrigeri <intrigeri@boum.org>
Diffstat (limited to 'doc')
-rw-r--r--  doc/anchor.mdwn | 3
-rw-r--r--  doc/basewiki.mdwn | 6
-rw-r--r--  doc/bugs/Insecure_dependency_in_mkdir.mdwn | 7
-rw-r--r--  doc/bugs/SSI_include_stripped_from_mdwn.mdwn | 18
-rw-r--r--  doc/bugs/__34__more__34___doesn__39__t_work.mdwn | 17
-rw-r--r--  doc/bugs/aggregate_global_feed_names.mdwn | 13
-rw-r--r--  doc/bugs/complex_wiki-code___40__braces__41___in_wikilink-text_breaks_wikilinks.mdwn | 9
-rw-r--r--  doc/bugs/firefox_doesn__39__t_want_to_load_updated_pages_at_ikiwiki.info.mdwn | 5
-rw-r--r--  doc/bugs/goto_with_bad_page_name.mdwn | 25
-rw-r--r--  doc/bugs/no_easy_way_to_wrap_HTML_container_around_a_set_of_inlined_pages.mdwn | 14
-rw-r--r--  doc/bugs/pagecount_is_broken.mdwn | 3
-rw-r--r--  doc/bugs/support_for_openid2_logins.mdwn | 7
-rw-r--r--  doc/bugs/tagged__40____41___matching_wikilinks.mdwn | 18
-rw-r--r--  doc/bugs/unwanted_discussion_links_on_discussion_pages.mdwn | 36
-rw-r--r--  doc/download.mdwn | 2
-rw-r--r--  doc/git.mdwn | 3
-rw-r--r--  doc/ikiwiki-transition.mdwn | 2
-rw-r--r--  doc/ikiwiki/directive/aggregate.mdwn | 5
-rw-r--r--  doc/ikiwiki/directive/comment.mdwn | 38
-rw-r--r--  doc/ikiwiki/directive/format.mdwn | 8
-rw-r--r--  doc/ikiwiki/openid.mdwn | 2
-rw-r--r--  doc/ikiwiki/wikilink.mdwn | 2
-rw-r--r--  doc/ikiwiki/wikilink/discussion.mdwn | 4
-rw-r--r--  doc/ikiwikiusers.mdwn | 1
-rw-r--r--  doc/news/version_3.12.mdwn | 2
-rw-r--r--  doc/news/version_3.13.mdwn | 24
-rw-r--r--  doc/news/version_3.14.mdwn | 13
-rw-r--r--  doc/plugins/comments.mdwn | 2
-rw-r--r--  doc/plugins/contrib.mdwn | 5
-rw-r--r--  doc/plugins/contrib/headinganchors.mdwn | 2
-rw-r--r--  doc/plugins/contrib/headinganchors/discussion.mdwn | 1
-rw-r--r--  doc/plugins/contrib/mailbox.mdwn | 18
-rw-r--r--  doc/plugins/contrib/mailbox/discussion.mdwn | 5
-rw-r--r--  doc/plugins/contrib/po.mdwn | 29
-rw-r--r--  doc/plugins/contrib/postal.mdwn | 35
-rw-r--r--  doc/plugins/highlight.mdwn | 75
-rw-r--r--  doc/plugins/more/discussion.mdwn | 7
-rw-r--r--  doc/plugins/openid.mdwn | 10
-rw-r--r--  doc/plugins/txt.mdwn | 2
-rw-r--r--  doc/shortcuts.mdwn | 2
-rw-r--r--  doc/style.css | 18
-rw-r--r--  doc/todo/Allow_disabling_edit_and_preferences_links.mdwn | 15
-rw-r--r--  doc/todo/automatic_use_of_syntax_plugin_on_source_code_files.mdwn | 3
-rw-r--r--  doc/todo/automatic_use_of_syntax_plugin_on_source_code_files/discussion.mdwn | 1
-rw-r--r--  doc/todo/comment_by_mail.mdwn | 52
-rw-r--r--  doc/todo/latex.mdwn | 6
-rw-r--r--  doc/todo/matching_different_kinds_of_links.mdwn | 10
-rw-r--r--  doc/todo/mbox.mdwn | 5
-rw-r--r--  doc/todo/section-numbering.mdwn | 7
-rw-r--r--  doc/todo/syntax_highlighting.mdwn | 141
-rw-r--r--  doc/todo/syntax_highlighting/discussion.mdwn | 2
-rw-r--r--  doc/todo/target_filter_for_brokenlinks.mdwn | 9
-rw-r--r--  doc/todo/tracking_bugs_with_dependencies.mdwn | 480
-rw-r--r--  doc/todo/wiki-formatted_comments_with_syntax_plugin.mdwn | 5
-rw-r--r--  doc/users/jasonblevins.mdwn | 16
-rw-r--r--  doc/wikitemplates.mdwn | 2
56 files changed, 991 insertions, 261 deletions
diff --git a/doc/anchor.mdwn b/doc/anchor.mdwn
new file mode 100644
index 000000000..012e52fa0
--- /dev/null
+++ b/doc/anchor.mdwn
@@ -0,0 +1,3 @@
+ikiwiki works with anchors in various situations.
+
+This page accumulates links to the concept of anchors.
diff --git a/doc/basewiki.mdwn b/doc/basewiki.mdwn
index c61ae3dba..8392884eb 100644
--- a/doc/basewiki.mdwn
+++ b/doc/basewiki.mdwn
@@ -18,3 +18,9 @@ It currently includes these pages:
As well as a few other files, like [[favicon.ico]], [[local.css]],
[[style.css]], and some icons.
+
+Note that an important property of the basewiki is that it should be
+self-contained. That means that the pages listed above cannot link
+to pages outside the basewiki. Ikiwiki's test suite checks that the
+basewiki is self-contained, and from time to time links have to be
+removed (or replaced with `iki` [[shortcuts]]) to keep this invariant.
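+
+For instance, instead of a regular wikilink to a page that only exists on
+ikiwiki.info, a basewiki page can use the `iki` shortcut, roughly like
+`\[[!iki plugins/comments desc=comments]]`, which produces an external link
+rather than a broken internal one.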
diff --git a/doc/bugs/Insecure_dependency_in_mkdir.mdwn b/doc/bugs/Insecure_dependency_in_mkdir.mdwn
index 5410ebbfd..46011a7e8 100644
--- a/doc/bugs/Insecure_dependency_in_mkdir.mdwn
+++ b/doc/bugs/Insecure_dependency_in_mkdir.mdwn
@@ -151,3 +151,10 @@ dubious
>>>>>> You can check it out for yourself by pulling my fork of this, at github or my local repo.
>>>>>> github will probably be faster for you: git://github.com/kjikaqawej/ikiwiki-simon.git --[[simonraven]]
+>>>>>>> I don't know what I'm supposed to see in your github tree.. it
+>>>>>>> looks identical to an old snapshot of ikiwiki's regular git repo?
+>>>>>>> If you want to put up the .deb you're using, I could examine that.
+>>>>>>>
+>>>>>>> I was in fact able to reproduce the insecure dependency in mkdir
+>>>>>>> message -- but only if I run 'perl -T ikiwiki'.
+>>>>>>> --[[Joey]]
diff --git a/doc/bugs/SSI_include_stripped_from_mdwn.mdwn b/doc/bugs/SSI_include_stripped_from_mdwn.mdwn
index bd895127a..5519e45c6 100644
--- a/doc/bugs/SSI_include_stripped_from_mdwn.mdwn
+++ b/doc/bugs/SSI_include_stripped_from_mdwn.mdwn
@@ -1,3 +1,21 @@
If I have a &lt;--#include virtual="foo" --&gt; in some file, it gets stripped, even though other HTML comments don't get stripped. I imagine it's some plugin doing it, or IkiWiki itself, or an IkiWiki dependency, but I haven't found where this is happening. I'm trying to implement a workaround for my sidebars forcing a rebuild of the wiki every day - I use the calendar plugin - when the day changes, by using SSI.
> It is probably the [[plugins/htmlscrubber]] plugin. -- [[Jon]]
+
+> htmlscrubber does strip these, because they look like
+> a html tag to it, not a html comment. (html comments start
+> with `<!--` .. of course, they get stripped too, because
+> they can be used to hide javascript..)
+>
+> Anyway, it makes sense for the htmlscrubber to strip server-side
+> includes because otherwise your wiki could be attacked
+> by them being added to it. If you want to use both the htmlscrubber and
+> SSI together, I'd suggest you modify the [[wikitemplates]]
+> and put the SSI on there.
+>
+> Ie, `page.tmpl` has a
+> div that the sidebar is put into; if you just replace
+> that with the SSI that includes your static sidebar,
+> you should be good to go. --[[Joey]]
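+>
+> A rough sketch of what that might look like (the div id and include path
+> are only illustrative; check your own `page.tmpl` for the exact markup):
+>
+>     <div id="sidebar">
+>     <!--#include virtual="/sidebar.html" -->
+>     </div>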
+
+[[done]]
diff --git a/doc/bugs/__34__more__34___doesn__39__t_work.mdwn b/doc/bugs/__34__more__34___doesn__39__t_work.mdwn
new file mode 100644
index 000000000..b2d929f13
--- /dev/null
+++ b/doc/bugs/__34__more__34___doesn__39__t_work.mdwn
@@ -0,0 +1,17 @@
+As one can see at [[plugins/more/discussion]], the [[plugins/more]] plugin doesn't work --- it renders as:
+
+ <p><a name="more"></a></p>
+
+ <p>This is the rest of my post. Not intended for people catching up on
+ their blogs at 30,000 feet. Because I like to make things
+ difficult.</p>
+
+No way to toggle visibility.
+-- Ivan Z.
+
+> More is not about toggling visibility. Perhaps you want
+> [[plugins/toggle]]. More is about displaying the whole page
+> content when it's a standalone page, and only displaying a fragment when
+> it's inlined into a blog. --[[Joey]] [[done]]
+
+>> I see, thanks for bothering with the reply, I didn't understand this. --Ivan Z.
diff --git a/doc/bugs/aggregate_global_feed_names.mdwn b/doc/bugs/aggregate_global_feed_names.mdwn
new file mode 100644
index 000000000..27127ce27
--- /dev/null
+++ b/doc/bugs/aggregate_global_feed_names.mdwn
@@ -0,0 +1,13 @@
+[[plugins/aggregate]] takes a name parameter that specifies a global name
+for a feed. This causes some problems:
+
+* If a site has multiple pages that aggregate, and they use the same
+  name, one will win and get the global name; the other will claim it's
+  working, but it will really be showing what the first one aggregated.
+* If an aggregate directive is moved from page A to page B, and the wiki
+ refreshed, aggregate does not realize the feed moved, and so it will
+ keep aggregated pages under `A/feed_name/*`. To work around this bug,
+ you have to delete A, refresh (maybe with --aggregate?), and then add B.
+
+Need to find a way to not make the name be global. Perhaps it needs to
+include the name of the page that contains the directive?
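+
+A purely hypothetical sketch of that idea (the plugin's actual internals may
+differ): key the stored feed state on the page as well as the name, e.g.
+
+    # hypothetical: key feed state by page and name instead of name alone
+    my $key = $params{page}."/".$params{name};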
diff --git a/doc/bugs/complex_wiki-code___40__braces__41___in_wikilink-text_breaks_wikilinks.mdwn b/doc/bugs/complex_wiki-code___40__braces__41___in_wikilink-text_breaks_wikilinks.mdwn
new file mode 100644
index 000000000..364fae394
--- /dev/null
+++ b/doc/bugs/complex_wiki-code___40__braces__41___in_wikilink-text_breaks_wikilinks.mdwn
@@ -0,0 +1,9 @@
+Example (from [here](http://git.ikiwiki.info/?p=ikiwiki;a=blobdiff;f=doc/todo/matching_different_kinds_of_links.mdwn;h=26c5a072bf3cb205b238a4e6fd0882583a0b7609;hp=1d7c78d9065d78307b43a1f58a53300cde4015fa;hb=9b4c83127fdef0ceb682c104db9bfb321b17022e;hpb=df4cc4c16ca230ee99b80c80043ba54fb95f6e71)):
+<pre>
+[[`\[[!taglink TAG\]\]`|plugins/tag]]
+</pre>
+gives:
+
+[[`\[[!taglink TAG\]\]`|plugins/tag]]
+
+Expected: there is a [[ikiwiki/wikilink]] with the complex text as the displayed text. --Ivan Z.
diff --git a/doc/bugs/firefox_doesn__39__t_want_to_load_updated_pages_at_ikiwiki.info.mdwn b/doc/bugs/firefox_doesn__39__t_want_to_load_updated_pages_at_ikiwiki.info.mdwn
new file mode 100644
index 000000000..8cb47f864
--- /dev/null
+++ b/doc/bugs/firefox_doesn__39__t_want_to_load_updated_pages_at_ikiwiki.info.mdwn
@@ -0,0 +1,5 @@
+I'm using firefox-3.0.8-alt0.M41.1 (Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.4pre) Gecko/2008100921 Firefox/3.0). I have noticed that quite often it shows an old state of a page at http://ikiwiki.info, e.g., [[recentchanges]] without my last edits, or the last page I edited (say, 50 min ago) in the state it was before I edited it.
+
+Only explicitly pressing "reload" helps.
+
+Is it a bug? I haven't been noticing such problems usually on other sites. --Ivan Z.
diff --git a/doc/bugs/goto_with_bad_page_name.mdwn b/doc/bugs/goto_with_bad_page_name.mdwn
new file mode 100644
index 000000000..bc462c840
--- /dev/null
+++ b/doc/bugs/goto_with_bad_page_name.mdwn
@@ -0,0 +1,25 @@
+If goto is passed a page name that
+contains spaces or is otherwise not a valid page name,
+it will display a "page does not exist", with a create link. But,
+clicking on the link will result in "bad page name".
+
+I have found at least two ways it can happen:
+
+* If 404 is enabled, and the user goes to "http://wiki/some page with spaces"
+* If mercurial is used, it pulls the user's full name, with spaces,
+ out for `rcs_recentchanges` and that ends up on RecentChanges.
+
+When fixing, need to keep in mind that we can't just run the input through
+titlepage, since in all other circumstances, the page name is already valid
+and we don't want to doubly-encode it.
+
+Seems like the goto plugin needs to check if the page name is valid and
+pass it through titlepage if not.
+
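+A rough sketch of that idea (not necessarily the fix that was committed;
+`titlepage` turns an arbitrary title into a valid page name):
+
+    # if the requested name contains characters that are not allowed in
+    # page names, encode it the way page creation would
+    if ($page !~ /$config{wiki_file_regexp}/) {
+        $page = IkiWiki::titlepage($page);  # "some page" -> "some_page"
+    }
+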
+(As a side effect of this, 404 will start redirecting "http://wiki/some page
+with spaces" to "http://wiki/some_page_with_spaces", if the latter exists.
+That seems like a fairly good thing.)
+
+[[done]]
+
+--[[Joey]]
diff --git a/doc/bugs/no_easy_way_to_wrap_HTML_container_around_a_set_of_inlined_pages.mdwn b/doc/bugs/no_easy_way_to_wrap_HTML_container_around_a_set_of_inlined_pages.mdwn
index 85c2d0c6c..1c1cbbb73 100644
--- a/doc/bugs/no_easy_way_to_wrap_HTML_container_around_a_set_of_inlined_pages.mdwn
+++ b/doc/bugs/no_easy_way_to_wrap_HTML_container_around_a_set_of_inlined_pages.mdwn
@@ -7,3 +7,17 @@ with a template definition like
<div id="foo">\[[!inline ... pages="<TMPL_VAR raw_pages>"]]</div>
It would be much more convenient if the loop over pages happened in the template, allowing me to just stick whatever markup I want around the loop.
+
+> Unfortunately, I don't think this can be changed at this point,
+> it would probably break a lot of stuff that relies on the current
+> template arrangement, both in ikiwiki's internals and in
+> people's own, customised inline templates. (Also, I have some plans
+> to allow a single inline to use different templates for different
+> sorts of pages, which would rely on the current one template per
+> page approach to work.)
+>
+> But there is a simple workaround.. the first template in
+> an inline has FIRST set, and the last one has LAST set.
+> So you can use that to emit your div or table top and bottom.
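+>
+> For example, a custom inline template could do something like this
+> (just a sketch; adapt it to your own template):
+>
+>     <TMPL_IF FIRST><div id="foo"></TMPL_IF>
+>     ...the usual per-page markup...
+>     <TMPL_IF LAST></div></TMPL_IF>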
+>
+> [[done]] --[[Joey]]
diff --git a/doc/bugs/pagecount_is_broken.mdwn b/doc/bugs/pagecount_is_broken.mdwn
new file mode 100644
index 000000000..101230d94
--- /dev/null
+++ b/doc/bugs/pagecount_is_broken.mdwn
@@ -0,0 +1,3 @@
+The [[plugins/pagecount]] plugin seems to be broken, as it claims there are [[!pagecount ]] pages in this wiki. (if it's not 0, the bug is fixed)
+
+[[fixed|done]] --[[Joey]]
diff --git a/doc/bugs/support_for_openid2_logins.mdwn b/doc/bugs/support_for_openid2_logins.mdwn
index f78d50c3c..1d99370f6 100644
--- a/doc/bugs/support_for_openid2_logins.mdwn
+++ b/doc/bugs/support_for_openid2_logins.mdwn
@@ -8,3 +8,10 @@ I've contacted JanRain who have pointed me to:
* Some [work](http://code.sixapart.com/svn/openid/trunk/perl/) by David Recordon
However both Perl OpenID 2.x implementations have not been released and are incomplete implementations. :(
+
+> Both of the projects referenced above have since been released.
+> Net::OpenID::Consumer 0.x in Debian is indeed only an OpenID 1
+> implementation. However, Net::OpenID::Consumer 1.x claims to be
+> an OpenID 2 implementation (it's the second of the projects
+> above). I've filed a bug in Debian asking for the package to be
+> updated. --[[smcv]]
diff --git a/doc/bugs/tagged__40____41___matching_wikilinks.mdwn b/doc/bugs/tagged__40____41___matching_wikilinks.mdwn
index 1bd556f50..e7e4af7c3 100644
--- a/doc/bugs/tagged__40____41___matching_wikilinks.mdwn
+++ b/doc/bugs/tagged__40____41___matching_wikilinks.mdwn
@@ -1,7 +1,10 @@
It may be that I'm simply misunderstanding something, but what is the rationale
for having `tagged()` also match normal wikilinks?
-> It simply hasn't been implemented yet -- see the answer in [[todo/tag_pagespec_function]]. Tags and wikilinks share the same underlying implementation, although ab reasonable expectation is that they are kept separate. --Ivan Z.
+> It simply hasn't been implemented yet -- see the answer in
+> [[todo/tag_pagespec_function]]. Tags and wikilinks share the same
+> underlying implementation, although a reasonable expectation is that
+> they are kept separate. --Ivan Z.
The following situation. I have `tagbase => 'tag'`. On some pages, scattered
over the whole wiki, I use `\[[!tag open_issue_gdb]]` to declare that this page
@@ -15,3 +18,16 @@ this is due to the wikilink being equal to a `\[[!tag ...]]`. What's the
rationale on this, or what am I doing wrong, and how to achieve what I want?
--[[tschwinge]]
+
+> What you are doing "wrong" is putting non-tag pages (i.e.
+> `/tag/open_issues_gdb.mdwn`) under your tagbase. The rationale for
+> implementing tag as it has been, I think, is one of simplicity and
+> conciseness. -- [[Jon]]
+
+>> No, he has no pages under tagbase that aren't tags. This bug
+>> is valid. [[todo/matching_different_kinds_of_links]] is probably
+>> how it will eventually be solved. --[[Joey]]
+
+> And this is an illustration why a clean work-around (without changing the software) is not possible: while thinking about [[todo/matching_different_kinds_of_links]], I thought one could work around the problem by simply explicitly including the kind of the relation into the link target (like the tagbase in tags), and by having a separate page without the "tagbase" to link to when one wants simply to refer to the tag without tagging. But this won't work: one has to at least once refer to the real tag page if one wants to talk about it, and this reference will count as tagging (unwanted). --Ivan Z.
+
+> But well, perhaps there is a workaround without introducing different kinds of links. One could modify the [[tag plugin|plugins/tag]] so that it adds 2 links to a page: for tagging -- `tagbase/TAG`, and for navigation -- `tagdescription/TAG` (displayed at the bottom). Then the `tagdescription/TAG` page would hold whatever list one wishes (with `tagged(TAG)` in the pagespec), and whenever one wants to merely refer to the tag, one should link to `tagdescription/TAG`--this link won't count as tagging. So, `tagbase/TAG` would become completely auxiliary (internal) link targets for ikiwiki, the users would edit or link to only `tagdescription/TAG`. --Ivan Z.
diff --git a/doc/bugs/unwanted_discussion_links_on_discussion_pages.mdwn b/doc/bugs/unwanted_discussion_links_on_discussion_pages.mdwn
new file mode 100644
index 000000000..c74a094ce
--- /dev/null
+++ b/doc/bugs/unwanted_discussion_links_on_discussion_pages.mdwn
@@ -0,0 +1,36 @@
+Background: some po translations (amongst which `fr.po`) translate "discussion" to an upper-cased word (in French: "Discussion").
+By the way, this is desired e.g. in German, where such a noun has to be written with an upper-cased "D", but I cannot see
+the logic behind the added "D" in French.
+
+Anyway, this gettext-translated word is used to name the discussion pages, as `$discussionlink` in `Render.pm` is
+built from `gettext("discussion")`. In the same piece of code, the regexp that tests whether the page
+being rendered is a discussion page is case-sensitive.
+
+On the other hand, new discussion pages are created with a name built from `gettext("Discussion")` (please note the upper-cased
+"D"). Such a new page name seems to be automagically downcased.
+
+This leads to newly created discussion pages not being recognized as discussion pages by the
+`$page !~ /.*\/\Q$discussionlink\E$/` regexp, so that they end up with an unwanted discussion link.
+
+A simple fix that seems to work is to make this regexp case-insensitive:
+
+ git diff IkiWiki/Render.pm
+ diff --git a/IkiWiki/Render.pm b/IkiWiki/Render.pm
+ index adae9f0..093c25b 100644
+ --- a/IkiWiki/Render.pm
+ +++ b/IkiWiki/Render.pm
+ @@ -77,7 +77,7 @@ sub genpage ($$) {
+ }
+ if ($config{discussion}) {
+ my $discussionlink=gettext("discussion");
+ - if ($page !~ /.*\/\Q$discussionlink\E$/ &&
+ + if ($page !~ /.*\/\Q$discussionlink\E$/i &&
+ (length $config{cgiurl} ||
+ exists $links{$page."/".$discussionlink})) {
+ $template->param(discussionlink => htmllink($page, $page, gettext("Discussion"), noimageinline => 1, forcesubpage => 1));
+
+But the best way would be to avoid assuming implicitly that translators will translate "discussion" and "Discussion" the same way.
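+
+For instance (just a sketch, not the change that was actually applied), `Render.pm` could normalize
+the case itself when building the link name, so the translator's choice of case no longer matters:
+
+    my $discussionlink=lc(gettext("Discussion"));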
+
+> [[done]] --[[Joey]]
+
+[[!tag patch]]
diff --git a/doc/download.mdwn b/doc/download.mdwn
index 507b250a5..448bdeeb9 100644
--- a/doc/download.mdwn
+++ b/doc/download.mdwn
@@ -32,6 +32,8 @@ FreeBSD has ikiwiki in its
Gentoo has an [ebuild](http://bugs.gentoo.org/show_bug.cgi?id=144453) in its bug database.
+The [openSUSE Build Service](http://software.opensuse.org/search?baseproject=ALL&p=1&q=ikiwiki) has packages for openSUSE.
+
IkiWiki can be installed [from macports](http://www.macports.org/ports.php?by=name&substr=ikiwiki)
by running `sudo port install ikiwiki`.
diff --git a/doc/git.mdwn b/doc/git.mdwn
index 8aa7250a7..9e28f1464 100644
--- a/doc/git.mdwn
+++ b/doc/git.mdwn
@@ -39,6 +39,9 @@ into [[Joey]]'s working tree. This is recommended. :-)
* [[jelmer]] `git://git.samba.org/jelmer/ikiwiki.git`
* [[hendry]] `git://webconverger.org/git/ikiwiki`
* [[jon]] `git://github.com/jmtd/ikiwiki.git`
+* [[ikipostal|DavidBremner]] `git://pivot.cs.unb.ca/git/ikipostal.git`
+* [[ikimailbox|DavidBremner]] `git://pivot.cs.unb.ca/git/ikimailbox.git`
+* [[ikiplugins|DavidBremner]] `git://pivot.cs.unb.ca/git/ikiplugins.git`
## branches
diff --git a/doc/ikiwiki-transition.mdwn b/doc/ikiwiki-transition.mdwn
index 6177f5a46..3d81d659f 100644
--- a/doc/ikiwiki-transition.mdwn
+++ b/doc/ikiwiki-transition.mdwn
@@ -61,7 +61,7 @@ If this is not done explicitly, a user's plaintext password will be
automatically converted to a hash when a user logs in for the first time
after upgrade to ikiwiki 2.48.
-# deduplinks your.setup|srcdir
+# deduplinks your.setup
In the past, bugs in ikiwiki have allowed duplicate link information
to be stored in its indexdb. This mode removes such duplicate information,
diff --git a/doc/ikiwiki/directive/aggregate.mdwn b/doc/ikiwiki/directive/aggregate.mdwn
index 70174f440..ddfcd40b7 100644
--- a/doc/ikiwiki/directive/aggregate.mdwn
+++ b/doc/ikiwiki/directive/aggregate.mdwn
@@ -19,6 +19,11 @@ more aggregated feeds. For example:
\[[!inline pages="internal(example/*)"]]
+Note the use of `internal()` in the [[ikiwiki/PageSpec]] to match
+aggregated pages. By default, aggregated pages are internal pages,
+which prevents them from showing up directly in the wiki, and so this
+special [[PageSpec]] is needed to match them.
+
## usage
Here are descriptions of all the supported parameters to the `aggregate`
diff --git a/doc/ikiwiki/directive/comment.mdwn b/doc/ikiwiki/directive/comment.mdwn
new file mode 100644
index 000000000..21386dfc3
--- /dev/null
+++ b/doc/ikiwiki/directive/comment.mdwn
@@ -0,0 +1,38 @@
+The `comment` directive is supplied by the
+[[!iki plugins/comments desc=comments]] plugin, and is used to add a comment
+to a page. Typically, the directive is the only thing on a comment page,
+and is filled out by the comment plugin when a user posts a comment.
+
+Example:
+
+ \[[!comment format=mdwn
+ username="foo"
+ subject="Bar"
+ date="2009-06-02T19:05:01Z"
+ content="""
+ Blah blah.
+ """
+ ]]
+
+## usage
+
+The only required parameter is `content`, the others just add or override
+metadata of the comment.
+
+* `content` - Text to display for the comment.
+ Note that [[directives|ikiwiki/directive]]
+ may not be allowed, depending on the configuration
+ of the comment plugin.
+* `format` - Specifies the markup used for the content.
+* `subject` - Subject for the comment.
+* `date` - Date the comment was posted. Can be entered in
+  nearly any format, since it's parsed by [[!cpan TimeDate]].
+* `username` - Used to record the username (or OpenID)
+ of a logged in commenter.
+* `ip` - Can be used to record the IP address of a commenter,
+ if they posted anonymously.
+* `claimedauthor` - Records the name that the user entered,
+  if anonymous commenters are allowed to enter their (unverified)
+ name.
+
+[[!meta robots="noindex, follow"]]
diff --git a/doc/ikiwiki/directive/format.mdwn b/doc/ikiwiki/directive/format.mdwn
index 94cf1b04f..23830e9cd 100644
--- a/doc/ikiwiki/directive/format.mdwn
+++ b/doc/ikiwiki/directive/format.mdwn
@@ -18,4 +18,12 @@ some other format:
4
"""]]
+Note that if the highlight plugin is enabled, this directive can also be
+used to display syntax highlighted code. Many languages and formats are
+supported. For example:
+
 \[[!format perl """
+ print "hello, world\n";
+ """]]
+
[[!meta robots="noindex, follow"]]
diff --git a/doc/ikiwiki/openid.mdwn b/doc/ikiwiki/openid.mdwn
index 5c91dfb58..a79655284 100644
--- a/doc/ikiwiki/openid.mdwn
+++ b/doc/ikiwiki/openid.mdwn
@@ -16,7 +16,7 @@ To sign up for an OpenID, visit one of the following identity providers:
* [Videntity](http://videntity.org/)
* [LiveJournal](http://www.livejournal.com/openid/)
* [TrustBearer](https://openid.trustbearer.com/)
-* or any of the [many others out there](http://openiddirectory.com/openid-providers-c-1.html) (but not [Yahoo](http://openid.yahoo.com) [[yet|plugins/openid/discussion/#Yahoo_unsupported]]).
+* or any of the [many others out there](http://openiddirectory.com/openid-providers-c-1.html).
Your OpenID is the URL that you are given when you sign up.
[[!if test="enabled(openid)" then="""
diff --git a/doc/ikiwiki/wikilink.mdwn b/doc/ikiwiki/wikilink.mdwn
index 371c2528f..f561d5850 100644
--- a/doc/ikiwiki/wikilink.mdwn
+++ b/doc/ikiwiki/wikilink.mdwn
@@ -21,7 +21,7 @@ name as the link text. For example `\[[foo_bar|SandBox]]` links to the SandBox
page, but the link will appear like this: [[foo_bar|SandBox]].
To link to an anchor inside a page, you can use something like
-`\[[WikiLink#foo]]`
+`\[[WikiLink#foo]]` .
## Directives and WikiLinks
diff --git a/doc/ikiwiki/wikilink/discussion.mdwn b/doc/ikiwiki/wikilink/discussion.mdwn
index 0677ff7de..b146c9447 100644
--- a/doc/ikiwiki/wikilink/discussion.mdwn
+++ b/doc/ikiwiki/wikilink/discussion.mdwn
@@ -1,6 +1,6 @@
-# Creating an anchor in Markdown
+# Creating an [[anchor]] in Markdown
-Is it a native Markdown "tag" for creating an anchor? Unfortunately,
+Is it a native Markdown "tag" for creating an [[anchor]]? Unfortunately,
I haven't any information about it at
[Markdown syntax](http://daringfireball.net/projects/markdown/syntax) page.
diff --git a/doc/ikiwikiusers.mdwn b/doc/ikiwikiusers.mdwn
index 989f05dfc..fdae9c047 100644
--- a/doc/ikiwikiusers.mdwn
+++ b/doc/ikiwikiusers.mdwn
@@ -38,7 +38,6 @@ Projects & Organizations
* [Chaos Computer Club Düsseldorf](https://www.chaosdorf.de)
* [monkeysphere](http://web.monkeysphere.info/)
* [The Walden Effect](http://www.waldeneffect.org/)
-* The [Fortran Wiki](http://fortranwiki.org/)
* [Monotone](http://monotone.ca/wiki/FrontPage/)
* The support pages for [Trinity Centre for High Performance Computing](http://www.tchpc.tcd.ie/support/)
* [St Hugh of Lincoln Catholic Primary School in Surrey](http://www.sthugh-of-lincoln.surrey.sch.uk/)
diff --git a/doc/news/version_3.12.mdwn b/doc/news/version_3.12.mdwn
index 722316233..1e1862bb0 100644
--- a/doc/news/version_3.12.mdwn
+++ b/doc/news/version_3.12.mdwn
@@ -1,4 +1,4 @@
-You may want to run `ikiwiki-transition deduplinks /path/to/srcdir`
+You may want to run `ikiwiki-transition deduplinks my.setup`
after upgrading to this version of ikiwiki. This command will
optimise your wiki's saved state, removing duplicate information
that can slow ikiwiki down.
diff --git a/doc/news/version_3.13.mdwn b/doc/news/version_3.13.mdwn
new file mode 100644
index 000000000..0c8f7ab8b
--- /dev/null
+++ b/doc/news/version_3.13.mdwn
@@ -0,0 +1,24 @@
+News for ikiwiki 3.13:
+
+ The `ikiwiki-transition deduplinks` command introduced in the
+ last release was buggy. If you followed the NEWS file instructions
+ and ran it, you should run `ikiwiki -setup` to rebuild your wiki
+ to fix the problem.
+
+ikiwiki 3.13 released with [[!toggle text="these changes"]]
+[[!toggleable text="""
+ * ikiwiki-transition: If passed a nonexistent srcdir, or one not
+ containing .ikiwiki, abort with an error rather than creating it.
+ * Allow underlaydir to be overridden without messing up inclusion
+ of other underlays via add\_underlay.
+ * More friendly display of markdown, textile in edit form selector
+ (jmtd)
+ * Allow curly braces to be used in pagespecs, and avoid a whole class
+ of potential security problems, by avoiding performing any string
+ interpolation on user-supplied data when translating pagespecs.
+ * ikiwiki-transition: Allow setup files to be passed to all subcommands
+ that need a srcdir.
+ * ikiwiki-transition: deduplinks was broken and threw away all
+ metadata stored by plugins in the index. Fix this bug.
+ * listdirectives: Avoid listing \_comment directives and generally
+ assume any directive starting with \_ is likewise internal."""]] \ No newline at end of file
diff --git a/doc/news/version_3.14.mdwn b/doc/news/version_3.14.mdwn
new file mode 100644
index 000000000..83c2b9188
--- /dev/null
+++ b/doc/news/version_3.14.mdwn
@@ -0,0 +1,13 @@
+ikiwiki 3.14 released with [[!toggle text="these changes"]]
+[[!toggleable text="""
+ * highlight: New plugin supporting syntax highlighting of pretty much
+ anything.
+ * debian/control: Add suggests for libhighlight-perl, although
+ that package is not yet created by Debian's highlight source package.
+ (See #529869)
+ * format: Provide a htmlizefallback hook that other plugins
+ can use to handle formats that are not suitable for general-purpose
+ htmlize hooks. Used by highlight.
+ * Fix test suite to not rely on an installed copy of ikiwiki after
+ underlaydir change. Closes: #[530502](http://bugs.debian.org/530502)
+ * Danish translation update. Closes: #[530877](http://bugs.debian.org/530877)"""]] \ No newline at end of file
diff --git a/doc/plugins/comments.mdwn b/doc/plugins/comments.mdwn
index c13a6daa6..7e2232411 100644
--- a/doc/plugins/comments.mdwn
+++ b/doc/plugins/comments.mdwn
@@ -19,7 +19,7 @@ users can only post comments.
Individual comments are stored as internal-use pages named something like
`page/comment_1`, `page/comment_2`, etc. These pages internally use a
-`\[[!_comment]]` [[ikiwiki/directive]].
+[[comment_directive|ikiwiki/directive/comment]].
There are some global options for the setup file:
diff --git a/doc/plugins/contrib.mdwn b/doc/plugins/contrib.mdwn
index e22b13f71..a03e6a95d 100644
--- a/doc/plugins/contrib.mdwn
+++ b/doc/plugins/contrib.mdwn
@@ -1,6 +1,5 @@
-Contributed [[plugins]]:
-
-(See [[install]] for installation help.)
+These plugins are provided by third parties and are not currently
+included in ikiwiki. See [[install]] for installation help.
[[!inline pages="plugins/contrib/* and !*/Discussion"
feedpages="created_after(plugins/contrib/navbar)" archive="yes"
diff --git a/doc/plugins/contrib/headinganchors.mdwn b/doc/plugins/contrib/headinganchors.mdwn
index c80cc0b49..becbf89a5 100644
--- a/doc/plugins/contrib/headinganchors.mdwn
+++ b/doc/plugins/contrib/headinganchors.mdwn
@@ -1,6 +1,6 @@
[[!template id=plugin name=headinganchors author="[[PaulWise]]"]]
-This is a simple plugin to add ids to all headings, based on their text. It
+This is a simple plugin to add ids (which will serve as [[anchor]]s) to all headings, based on their text. It
works as a postprocessing filter, allowing it to work on mdwn, wiki, html,
rst and any other format that produces html. The code is available here:
diff --git a/doc/plugins/contrib/headinganchors/discussion.mdwn b/doc/plugins/contrib/headinganchors/discussion.mdwn
new file mode 100644
index 000000000..91fe04a6d
--- /dev/null
+++ b/doc/plugins/contrib/headinganchors/discussion.mdwn
@@ -0,0 +1 @@
+Isn't this functionality a part of what [[plugins/toc]] needs and does? Then probably the [[plugins/toc]] plugin's code could be split into the part that implements the [[plugins/contrib/headinganchors]]'s functionality and the TOC generation itself. That will bring more order into the code and the set of available plugins. --Ivan Z.
diff --git a/doc/plugins/contrib/mailbox.mdwn b/doc/plugins/contrib/mailbox.mdwn
new file mode 100644
index 000000000..b7a9f81c7
--- /dev/null
+++ b/doc/plugins/contrib/mailbox.mdwn
@@ -0,0 +1,18 @@
+[[!template id=plugin name=mailbox author="[[DavidBremner]]"]]
+[[!tag type/format]]
+
+The `mailbox` plugin adds support to ikiwiki for
+rendering a mailbox file into a page displaying the mails
+in the mailbox. It supports mbox, maildir, and MH folders,
+does threading, and deals with MIME.
+
+One hitch I noticed was that it is not currently possible to treat a
+maildir or an MH directory as a page (i.e. just call it foo.mh and have it
+transformed to page foo). I'm not sure if this is possible and worthwhile
+to fix. It is certainly workable to use a [[!mailbox ]] directive.
+-- [[DavidBremner]]
+
+This plugin is not in ikiwiki yet, but can be downloaded
+from <http://pivot.cs.unb.ca/git/ikimailbox.git>
+
+
diff --git a/doc/plugins/contrib/mailbox/discussion.mdwn b/doc/plugins/contrib/mailbox/discussion.mdwn
new file mode 100644
index 000000000..00fb0c05f
--- /dev/null
+++ b/doc/plugins/contrib/mailbox/discussion.mdwn
@@ -0,0 +1,5 @@
+# The remote repo
+
+For some reason, `git fetch` from http://pivot.cs.unb.ca/git/ikimailbox.git/ didn't work very smoothly for me: it hung, and I had to restart it 3 times before the download was complete.
+
+I'm writing this just to let you know that there might be some problems with such connections to your http-server. --Ivan Z.
diff --git a/doc/plugins/contrib/po.mdwn b/doc/plugins/contrib/po.mdwn
index 5b33f6716..665e48343 100644
--- a/doc/plugins/contrib/po.mdwn
+++ b/doc/plugins/contrib/po.mdwn
@@ -153,6 +153,14 @@ Any thoughts on this?
>> basewiki, which seems like it should be pretty easy to do, and would be
>> a great demo! --[[Joey]]
>>
+>>> I have a complete translation of basewiki into Danish, and am working with
+>>> others on preparing one in German. For a completely translated user
+>>> experience, however, you will also need templates translated (there are a few
+>>> translatable strings there too). My not-yet-merged po4a Markdown improvements
+>>> (see [bug#530574](http://bugs.debian.org/530574)) correctly handles multiple
+>>> files in a single PO which might be relevant for template translation handling.
+>>> --[[JonasSmedegaard]]
+>>
>>> I've merged your changes into my own branch, and made great
>>> progress on the various todo items. Please note my repository
>>> location has changed a few days ago, my user page was updated
@@ -383,6 +391,9 @@ daring a timid "please pull"... or rather, please review again :)
>>> "discussion". Also, I consider `$config{cgi}` and `%links` (etc)
>>> documented parts of the plugin interface, which won't change; po could
>>> rely on them to avoid this minor problem. --[[Joey]]
+>>>>
+>>>> Done in my branch. --[[intrigeri]]
+>>>>
>
> * Is there any real reason not to allow removing a translation?
> I'm imagining a spammy translation, which an admin might not
@@ -423,3 +434,21 @@ daring a timid "please pull"... or rather, please review again :)
>> --[[intrigeri]]
>>
>>> Did you get a chance to? --[[Joey]]
+
+ * As discussed at [[todo/l10n]] the templates need to be translatable too. They
+   should be treated properly by po4a using the markdown option - at least with my
+   later patches in [bug#530574](http://bugs.debian.org/530574) applied.
+
+ * It seems to me that the po plugin (and possibly other parts of ikiwiki) wrongly
+ uses gettext. As I understand it, gettext (as used currently in ikiwiki) always
+   looks up a single language. That might make sense for a single-language site, but
+   multilingual sites should emit all strings targeted at the web output each in its own
+ language.
+
+ So generally the system language (used for e.g. compile warnings) should be separated
+ from both master language and slave languages.
+
+   Preferably the gettext subroutine could be extended to accept a locale as an optional
+   second parameter overriding the default locale (for messages like "N/A" as a
+   percentage in the po plugin); a rough sketch follows below. Alternatively (with the above mentioned template support)
+ all such strings could be externalized as templates that can then be localized.
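+
+   A rough sketch of such an extended wrapper (hypothetical, assuming
+   [[!cpan Locale::gettext]] and POSIX are loaded; ikiwiki's real `gettext`
+   wrapper differs):
+
+       sub gettext ($;$) {
+           my ($msgid, $locale)=@_;
+           return Locale::gettext::gettext($msgid) unless defined $locale;
+           # temporarily switch the message locale for this single lookup
+           my $old=POSIX::setlocale(POSIX::LC_MESSAGES());
+           POSIX::setlocale(POSIX::LC_MESSAGES(), $locale);
+           my $text=Locale::gettext::gettext($msgid);
+           POSIX::setlocale(POSIX::LC_MESSAGES(), $old);
+           return $text;
+       }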
diff --git a/doc/plugins/contrib/postal.mdwn b/doc/plugins/contrib/postal.mdwn
new file mode 100644
index 000000000..b2f875393
--- /dev/null
+++ b/doc/plugins/contrib/postal.mdwn
@@ -0,0 +1,35 @@
+[[!template id=plugin name=postal author="[[DavidBremner]]"]]
+[[!tag type/useful]]
+
+The `postal` plugin allows users to send mail to
+a special address to comment on a page. It uses the [[mailbox]]
+plugin to display their comments in the wiki.
+
+This plugin is not in ikiwiki yet, but can be downloaded
+from <http://pivot.cs.unb.ca/git/ikipostal.git>
+
+Details:
+
+ * Adds a mailto: url to each page matching some pagespec
+ (currently every page gets a comment footer)
+
+ * This mailto url goes to an address identifying the page (something like
+ user-iki-blog~I\_hate\_markdown@host.fqdn.tld).
+ [more details](http://www.cs.unb.ca/~bremner/blog/posts/encoding)
+
+ * on the mail receiving end, these messages are either deleted, or run through
+ a filter to be turned into blog posts. I have
+[written](http://pivot.cs.unb.ca/git/?p=ikipostal.git;a=blob_plain;f=filters/postal-accept.pl;hb=HEAD)
+ a filter that decodes the address and writes the message into an appropriate
+mailbox. The changes are then checked into version control; typically a hook then updates the html version of the wiki.
+ * work in progress can be
+
+ - [cloned](http://pivot.cs.unb.ca/git/ikipostal.git), or
+ - [browsed](http://pivot.cs.unb.ca/git/?p=ikipostal.git;a=summary)
+
+ * I would be interested in any ideas people have about security.
+
+The current version of this plugin is now running on my home page. See for example
+[a recent post in my blog](http://www.cs.unb.ca/~bremner/blog/posts/can-i-haz-a-distributed-rss/).
+Unfortunately although the [[mailbox|todo/mbox]] renderer supports threading, I haven't had
+a chance to implement comments on comments yet. --[[DavidBremner]]
diff --git a/doc/plugins/highlight.mdwn b/doc/plugins/highlight.mdwn
new file mode 100644
index 000000000..44ced80f7
--- /dev/null
+++ b/doc/plugins/highlight.mdwn
@@ -0,0 +1,75 @@
+[[!template id=plugin name=highlight author="[[Joey]]"]]
+[[!tag type/format]]
+
+This plugin allows ikiwiki to syntax highlight source code, using
+a fast syntax highlighter that supports over a hundred programming
+languages and file formats.
+
+## prerequisites
+
+You will need to install the perl bindings to the
+[highlight library](http://www.andre-simon.de/), which in Debian
+are in the [[!debpkg libhighlight-perl]] package.
+
+## embedding highlighted code
+
+To embed highlighted code on a page, you can use the
+[[format]] plugin.
+
+For example:
+
+ \[[!format c """
+ void main () {
+ printf("hello, world!");
+ }
+ """]]
+
+ \[[!format diff """
+ -bar
+ +foo
+ """]]
+
+You can do this for any extension or language name supported by
+the [highlight library](http://www.andre-simon.de/) -- basically anything
+you can think of should work.
+
+## highlighting entire source files
+
+To enable syntax highlighting of entire standalone source files, use the
+`tohighlight` setting in your setup file to control which files should be
+syntax highlighted. Here is a typical setting for it, enabling highlighting
+for files with the extensions .c, etc, and also for any files named
+"Makefile".
+
+ tohighlight => ".c .h .cpp .pl .py Makefile:make",
+
+It knows what language to use for most filename extensions (see
+`/etc/highlight/filetypes.conf` for a partial list), but if you want to
+bind an unusual filename extension, or any file without an extension
+(such as a Makefile), to a language, you can do so by appending a colon
+and the name of the language, as illustrated for Makefiles above.
+
+With the plugin configured this way, source files become full-fledged
+wiki pages, which means they can include [[WikiLinks|ikiwiki/wikilink]]
+and [[directives|ikiwiki/directive]] like any other page can, and are also
+affected by the [[smiley]] plugin, if it is enabled. This can be annoying
+if your code accidentally contains things that look like those.
+
+On the other hand, this also allows your syntax highlighted
+source code to contain markdown formatted comments and hyperlinks
+to other code files, like this:
+
+ /* \[[!format mdwn """
+ This comment will be formatted as *markdown*!
+
+ See \[[bar.h]].
+    """]] */
+
+Finally, bear in mind that this lets anyone who can edit a page in your
+wiki also edit source code files that are in your wiki. Use appropriate
+caution.
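+
+For example, if the [[lockedit|plugins/lockedit]] plugin is enabled, a setting
+along these lines in the setup file would keep the source files from being
+edited over the web (adjust the [[ikiwiki/PageSpec]] to your `tohighlight`
+list):
+
+    locked_pages => "*.c or *.h or *.cpp or *.pl or *.py or Makefile",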
+
+## colors
+
+The colors etc used for the syntax highlighting are entirely configurable
+by CSS. See ikiwiki's [[style.css]] for the defaults.
diff --git a/doc/plugins/more/discussion.mdwn b/doc/plugins/more/discussion.mdwn
new file mode 100644
index 000000000..f369d1e12
--- /dev/null
+++ b/doc/plugins/more/discussion.mdwn
@@ -0,0 +1,7 @@
+# Test:
+
+[[!more linktext="click for more" text="""
+This is the rest of my post. Not intended for people catching up on
+their blogs at 30,000 feet. Because I like to make things
+difficult.
+"""]]
diff --git a/doc/plugins/openid.mdwn b/doc/plugins/openid.mdwn
index d4aa18c7d..91fc7cddc 100644
--- a/doc/plugins/openid.mdwn
+++ b/doc/plugins/openid.mdwn
@@ -4,10 +4,12 @@
This plugin allows users to use their [OpenID](http://openid.net/) to log
into the wiki.
-The plugin needs the [[!cpan Net::OpenID::Consumer]] perl module. The
-[[!cpan LWPx::ParanoidAgent]] perl module is used if available, for added
-security. Finally, the [[!cpan Crypt::SSLeay]] perl module is needed to support
-users entering "https" OpenID urls.
+The plugin needs the [[!cpan Net::OpenID::Consumer]] perl module.
+Version 1.x is needed in order for OpenID v2 to work.
+
+The [[!cpan LWPx::ParanoidAgent]] perl module is used if available, for
+added security. Finally, the [[!cpan Crypt::SSLeay]] perl module is needed
+to support users entering "https" OpenID urls.
This plugin has a configuration option. You can set `--openidsignup`
to the url of a third-party site where users can sign up for an OpenID. If
diff --git a/doc/plugins/txt.mdwn b/doc/plugins/txt.mdwn
index 77d94d450..420898d09 100644
--- a/doc/plugins/txt.mdwn
+++ b/doc/plugins/txt.mdwn
@@ -8,7 +8,7 @@ Unlike other [[type/format]] plugins, no formatting of markup in
txt files is done; the file contents is displayed to the user as-is,
with html markup characters such as ">" escaped.
-The only exceptions are that [[WikiLinks|WikiLink]] and
+The only exceptions are that [[WikiLinks|ikiwiki/WikiLink]] and
[[directives|ikiwiki/directive]] are still expanded by
ikiwiki, and that, if the [[!cpan URI::Find]] perl module is installed, URLs
in the txt file are converted to hyperlinks.
diff --git a/doc/shortcuts.mdwn b/doc/shortcuts.mdwn
index c352f2452..14cd5ff2b 100644
--- a/doc/shortcuts.mdwn
+++ b/doc/shortcuts.mdwn
@@ -60,7 +60,7 @@ This page controls what shortcut links the wiki supports.
* [[!shortcut name=man url="http://linux.die.net/man/%s"]]
* [[!shortcut name=ohloh url="http://www.ohloh.net/projects/%s"]]
-To add a new shortcut, use the [[`shortcut`|ikiwiki/directive/shortcut]]
+To add a new shortcut, use the `shortcut`
[[ikiwiki/directive]]. In the url, "%s" is replaced with the
text passed to the named shortcut, after url-encoding it, and '%S' is
replaced with the raw, non-encoded text. The optional `desc` parameter
diff --git a/doc/style.css b/doc/style.css
index 74d968ddf..e6512aed8 100644
--- a/doc/style.css
+++ b/doc/style.css
@@ -389,3 +389,21 @@ span.color {
border: 1px solid #aaa;
padding: 3px;
}
+
+/* Used by the highlight plugin. */
+
+pre.hl { color:#000000; background-color:#ffffff; }
+.hl.num { color:#2928ff; }
+.hl.esc { color:#ff00ff; }
+.hl.str { color:#ff0000; }
+.hl.dstr { color:#818100; }
+.hl.slc { color:#838183; font-style:italic; }
+.hl.com { color:#838183; font-style:italic; }
+.hl.dir { color:#008200; }
+.hl.sym { color:#000000; }
+.hl.line { color:#555555; }
+.hl.mark { background-color:#ffffbb; }
+.hl.kwa { color:#000000; font-weight:bold; }
+.hl.kwb { color:#830000; }
+.hl.kwc { color:#000000; font-weight:bold; }
+.hl.kwd { color:#010181; }
diff --git a/doc/todo/Allow_disabling_edit_and_preferences_links.mdwn b/doc/todo/Allow_disabling_edit_and_preferences_links.mdwn
index a356c69df..5b9cc8742 100644
--- a/doc/todo/Allow_disabling_edit_and_preferences_links.mdwn
+++ b/doc/todo/Allow_disabling_edit_and_preferences_links.mdwn
@@ -52,3 +52,18 @@ Patch:
>>> is not controlled by any plugin. It would be nice if it were; I am
>>> trying to achieve a configuration where the only action supported
>>> via CGI is blog-style comments. --[Zack](http://zwol.livejournal.com/)
+
+>>> Like [[puck]], I'd like to keep search available but I want to disable all
+>>> login facilities and thus disable the "Preferences" link.
+>>>
+>>> After digging a little bit in the source code, my first attempt was to make
+>>> the "Preferences" link appear only if there is `sessioncgi` hooks
+>>> registered. But this will not work as the [[plugins/inline]] plugin also
+>>> defines it.
+>>>
+>>> Looking for `auth` hooks currently would not work as at least
+>>> [[plugins/passwordauth]] does not register one.
+>>>
+>>> Adding a new `canlogin` hook looks like overkill to me. [[Joey]], how
+>>> about making registration of the `auth` hook mandatory for all plugins
+>>> making sense of the "Preferences" link? --[[Lunar]]
diff --git a/doc/todo/automatic_use_of_syntax_plugin_on_source_code_files.mdwn b/doc/todo/automatic_use_of_syntax_plugin_on_source_code_files.mdwn
index cd5ff34de..71b4b88f0 100644
--- a/doc/todo/automatic_use_of_syntax_plugin_on_source_code_files.mdwn
+++ b/doc/todo/automatic_use_of_syntax_plugin_on_source_code_files.mdwn
@@ -12,3 +12,6 @@ this would allow the use of ikiwiki for [[!wikipedia literate programming]].
* I have started something along these lines see [[plugins/contrib/sourcehighlight]]. For some reason I started with source-highlight [[DavidBremner]]
* I wonder if this is similar to what you want: <http://iki.u32.net/setup/Highlight_Code_Plugin/>
+
+> The new [[plugins/highlight]] plugin is in ikiwiki core and supports
+> source code files natively. [[done]] --[[Joey]]
diff --git a/doc/todo/automatic_use_of_syntax_plugin_on_source_code_files/discussion.mdwn b/doc/todo/automatic_use_of_syntax_plugin_on_source_code_files/discussion.mdwn
index 8bc75420d..64bc21ee0 100644
--- a/doc/todo/automatic_use_of_syntax_plugin_on_source_code_files/discussion.mdwn
+++ b/doc/todo/automatic_use_of_syntax_plugin_on_source_code_files/discussion.mdwn
@@ -51,6 +51,7 @@ I hit a wall the following example (the last commit in the above repo).
</div>
>>> I don't know what is going wrong for you... source-highlight, Markdown or something else.
+>>>> It's a well-known bug in old versions of markdown. --[[Joey]]
>>> I do find it interesting the way the sourcecode `div` and the list get interleaved. That
>>> just looks like a Markdown thing though.
>>> In any case, I've updated the patch below to include most of your changes. -- [[Will]]
diff --git a/doc/todo/comment_by_mail.mdwn b/doc/todo/comment_by_mail.mdwn
index 6d3eeb044..87e57417e 100644
--- a/doc/todo/comment_by_mail.mdwn
+++ b/doc/todo/comment_by_mail.mdwn
@@ -1,53 +1,3 @@
I would like to allow comments on ikiwiki pages without CGI.
-I have in mind something like
- * Use a pagetemplate hook
- in a plugin (DONE)
- * add a mailto: url to each page matching some pagespec
- (currently every page gets a comment footer)
- * this mailto url goes to an address identifying the page (something like
- user-iki-blog~I\_hate\_markdown@host.fqdn.tld). (DONE)
- [more details](http://www.cs.unb.ca/~bremner/blog/posts/encoding)
-
- * on the mail receiving end, these messages are either deleted, or ran through
- a filter to be turned into blog posts. As a first step, I have
-[written](http://pivot.cs.unb.ca/git/?p=ikipostal.git;a=blob_plain;f=filters/postal-filer.pl;hb=010357a08e9)
-a filter that decodes the address and writes the message into an appropriate
-mailbox. I would be interested in any ideas people have about security.
-
- * the same plugin can check for comments on a particular page next time the wiki
- is generated, and add a link. (more or less done)
- > If the filter just checks in the posts into revision control, the
- > post-commit hook would handle updating the wiki to include those
- > posts as they come in. --[[Joey]]
- * work in progress can be
-
- - [cloned](http://pivot.cs.unb.ca/git/ikiperl.git), or
- - [browsed](http://pivot.cs.unb.ca/git/?p=ikipostal.git;a=summary)
-
-
-Any comments? Write them here or send them to [[DavidBremner]]
-
-> I don't want to derail this with too much blue-skying, but I was thinking
-> earlier that it would be nice if ikiwiki could do something sensible with
-> mailbox files, such as turning them into a (threaded?) blog display.
->
-> One reason I was thinking about that was just that it would be nice to
-> be able to use ikiwiki for mailing list archives. But another reason was
-> that it would be nice to solve the problem described in
-> [[discussion_page_as_blog]]. For that you really want a threaded system,
-> and mailbox file formats already have threading.
->
-> If that were done, it would tie into what you're working on in an
-> interesting way, since the incoming mail would only need to be committed to
-> the appropriate mailbox file, with ikiwiki then running to process it.
-> --[[Joey]]
->> It is an interesting idea. I like that it uses an arbitrary MUA
->> as a "moderation" interface. After I killed a debian BTS entry with
->> clumsy pseudoheader editing I think any
->> reference info should also be encoded into the address.
-
-The current version of this plugin is now running on my home page. See for example
-[a recent post in my blog](http://www.cs.unb.ca/~bremner/blog/posts/can-i-haz-a-distributed-rss/).
-Unfortunately although the [[mailbox|todo/mbox]] renderer supports threading, I haven't had
-a chance to implement comments on comments yet. [[DavidBremner]]
+> [[done]], see [[plugins/contrib/postal]]
diff --git a/doc/todo/latex.mdwn b/doc/todo/latex.mdwn
index 604c5e87f..4363003c1 100644
--- a/doc/todo/latex.mdwn
+++ b/doc/todo/latex.mdwn
@@ -7,10 +7,14 @@ render via [HeVeA](http://pauillac.inria.fr/~maranget/hevea/index.html),
similar. Useful for mathematics, as well as for stuff like the LaTeX version
of the ikiwiki [[/logo]].
+> [[users/JasonBlevins]] also has a plugin for including [[LaTeX]] expressions (by means of `itex2MML`) -- [[plugins/mdwn_itex]] (look at his page for the link). --Ivan Z.
+
----
ikiwiki could also support LaTeX as a document type, again rendering to HTML.
+> [[users/JasonBlevins]] also has a [[plugins/pandoc]] plugin (look at his page for the link): in principle, [Pandoc](http://johnmacfarlane.net/pandoc/) can read and write [[LaTeX]]. --Ivan Z.
+
----
Conversely, how about adding a plugin to support exporting to LaTeX?
@@ -25,6 +29,8 @@ Conversely, how about adding a plugin to support exporting to LaTeX?
>>>> Interesting, just yesterday I was playing with pandoc to make PDFs from my Markdown. Could someone advise me on how to embed these PDFs into ikiwiki? I need some guidance in implementing this. --[[JosephTurian]]
+>>>> [[users/JasonBlevins]] has a [[plugins/pandoc]] plugin (look at his page for the link). --Ivan Z.
+
----
[here](http://ng.l4x.org/gitweb/gitweb.cgi?p=ikiwiki.git/.git;a=blob;f=IkiWiki/Plugin/latex.pm) is a first stab at
diff --git a/doc/todo/matching_different_kinds_of_links.mdwn b/doc/todo/matching_different_kinds_of_links.mdwn
index b71d7cc5f..26c5a072b 100644
--- a/doc/todo/matching_different_kinds_of_links.mdwn
+++ b/doc/todo/matching_different_kinds_of_links.mdwn
@@ -35,3 +35,13 @@ Besides pagespecs, the `rel=` attribute could be used for styles. --Ivan Z.
> was not available, which is why I didn't make it differentiate from
> normal links.) Might be better to go ahead and add the variable to
> core though. --[[Joey]]
+
+Elsewhere on this wiki I saw some suggestions for the wiki syntax for specifying the relation name of a link. One more suggestion is [the syntax used in Semantic MediaWiki](http://en.wikipedia.org/wiki/Semantic_MediaWiki#Basic_usage), like this:
+
+<pre>
+... the capital city is \[[Has capital::Berlin]] ...
+</pre>
+
+So a part of the effect of [[`\[[!taglink TAG\]\]`|plugins/tag]] could be represented as something like `\[[tag::TAG]]` or (more understandable relation name in what concerns the direction) `\[[tagged::TAG]]`.
+
+I don't have any opinion on this syntax (whether it's good or not)...--Ivan Z.
diff --git a/doc/todo/mbox.mdwn b/doc/todo/mbox.mdwn
index f7744563c..a6af0c3c5 100644
--- a/doc/todo/mbox.mdwn
+++ b/doc/todo/mbox.mdwn
@@ -3,7 +3,7 @@ I'd like to be able to drop an unmodified RFC2822 email message into ikiwiki, an
> We're discussing doing just that (well, whole mailboxes, really) over in
> [[comment_by_mail]] --[[Joey]]
>> The
->> [mailbox](http://pivot.cs.unb.ca/git/?p=ikimailbox.git;a=summary)
+>> [[plugins/contrib/mailbox]]
>> plugin is roughly feature complete at this point. It can read mbox, maildir, and
>> MH folders, does threading, and deals with MIME (now with
>> pagespec based sanity checking). No doubt lots of things could be
@@ -15,5 +15,4 @@ I'd like to be able to drop an unmodified RFC2822 email message into ikiwiki, an
>> It is certainly workable
>>> to use a \[[!mailbox ]] directive. -- [[DavidBremner]]
-> Your gitweb doesn't tell me where I can git pull this from, which I'd
-> like to do ... --[[Joey]]
+[[done]]
diff --git a/doc/todo/section-numbering.mdwn b/doc/todo/section-numbering.mdwn
new file mode 100644
index 000000000..3a2d232a8
--- /dev/null
+++ b/doc/todo/section-numbering.mdwn
@@ -0,0 +1,7 @@
+[[!tag wishlist]]
+
+Optional automatic section numbering would help reading: otherwise, a reader (like me) gets lost in the structure of a long page.
+
+I guess it is implementable with complex CSS... but someone first has to write that CSS in any case, so this wish still has todo status. --Ivan Z.
+
+Another reason this is related to ikiwiki, and not just a CSS-authoring question, is that the style of the numbers (probably generated by CSS) should match the style of the numbers in ikiwiki's [[plugins/toc]]. --Ivan Z.
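+
+A rough sketch of what CSS-counter based numbering could look like (untested,
+and the selectors would need adjusting to the wiki's actual markup; it also
+would not automatically match the numbers produced by [[plugins/toc]]):
+
+    #content { counter-reset: sec1; }
+    #content h2 { counter-increment: sec1; counter-reset: sec2; }
+    #content h2:before { content: counter(sec1) ". "; }
+    #content h3 { counter-increment: sec2; }
+    #content h3:before { content: counter(sec1) "." counter(sec2) " "; }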
diff --git a/doc/todo/syntax_highlighting.mdwn b/doc/todo/syntax_highlighting.mdwn
index b5d083ba5..3d122829b 100644
--- a/doc/todo/syntax_highlighting.mdwn
+++ b/doc/todo/syntax_highlighting.mdwn
@@ -1,48 +1,76 @@
There's been a lot of work on contrib syntax highlighting plugins. One should be
picked and added to ikiwiki core.
-Ideally, it should support both converting whole source files into wiki
+We want to support both converting whole source files into wiki
pages, as well as doing syntax highlighting as a preprocessor directive
-(which is either passed the text, or reads it from a file).
+(which is either passed the text, or reads it from a file). But,
+the [[ikiwiki/directive/format]] directive makes this easy enough to
+do if the plugin only supports whole source files. So, syntax plugins
+do not really need their own preprocessor directive, unless it makes
+things easier for the user.
## The big list of possibilities
* [[plugins/contrib/highlightcode]] uses [[!cpan Syntax::Highlight::Engine::Kate]],
operates on whole source files only, has a few bugs (see
[here](http://u32.net/Highlight_Code_Plugin/), and needs to be updated to
- support [[bugs/multiple_pages_with_same_name]].
+ support [[bugs/multiple_pages_with_same_name]]. (Currently a 404 :-( )
* [[!cpan IkiWiki-Plugin-syntax]] only operates as a directive.
Interestingly, it supports multiple highlighting backends, including Kate
and Vim.
* [[plugins/contrib/syntax]] only operates as a directive
([[not_on_source_code_files|automatic_use_of_syntax_plugin_on_source_code_files]]),
and uses [[!cpan Text::VimColor]].
-* [[plugins/contrib/sourcehighlight]] uses src-highlight, and operates on
+* [[plugins/contrib/sourcehighlight]] uses source-highlight, and operates on
whole source files only. Needs to be updated to
support [[bugs/multiple_pages_with_same_name]].
* [[sourcecode|todo/automatic_use_of_syntax_plugin_on_source_code_files/discussion]]
- also uses src-highlight, and operates on whole source files.
+ also uses source-highlight, and operates on whole source files.
Updated to work with the fix for [[bugs/multiple_pages_with_same_name]]. Untested with files with no extension, e.g. `Makefile`.
-* [[users/jasonblevins]]'s code plugin uses src-highlight, and supports both
- while file and directive use.
+* [[users/jasonblevins]]'s code plugin uses source-highlight, and supports both
+ whole file and directive use.
-* [hlsimple](http://pivot.cs.unb.ca/git/?p=ikiplugins.git;a=blob_plain;f=IkiWiki/Plugin/hlsimple.pm;hb=HEAD) is a wrapper for the the perl module Syntax::Highlight::Engine::Simple. This is pure perl, pretty simple, uses css. It ought to be pretty fast (according to the author, and just because it is not external).
+* [hlsimple](http://pivot.cs.unb.ca/git/?p=ikiplugins.git;a=blob_plain;f=IkiWiki/Plugin/hlsimple.pm;hb=HEAD) is a wrapper for the perl module [[!cpan Syntax::Highlight::Engine::Simple]]. This is pure perl, pretty simple, uses css. It ought to be pretty fast (according to the author, and just because it is not external).
On the other hand, there are not many predefined languages yet. Defining language syntaxes is about as much
work as source-highlight, but in perl. I plan to package the base module for debian. Perhaps after the author
releases the 5 or 6 language definitions he has running on his web site, it might be suitable for inclusion in ikiwiki. [[DavidBremner]]
-## General problems
-
-* Using non-perl syntax highlighting backends is slow. I'd prefer either
- using a perl module, or a multiple-backend solution that can use a perl
- module as one option. (Or, if there's a great highlighter python module,
- we could use an external plugin..)
-* Currently no single plugin supports both modes of operation (directive
- and whole source file to page).
-
- > This is now fixed by the [[ikiwiki/directive/format]] directive for all
- > whole-source-file plugins, right?
-
+* [[plugins/highlight]] uses [highlight](http://www.andre-simon.de) via
+ its swig bindings. It optionally supports whole files, but also
+ integrates with the format directive to allow formatting of *any* of
+ highlight's supported formats. (For whole files, it uses either
+ keepextension or noextension, as appropriate for the type of file.)
+
+## General problems / requirements
+
+* Using non-perl syntax highlighting backends is slower. All things equal,
+ I'd prefer either using a perl module, or a multiple-backend solution that
+ can use a perl module as one option. (Or, if there's a great highlighter
+ python module, we could use an external plugin..)
+
+ Of course, some perl modules are also rather slow.. Kate, for example,
+ can only process about 33 lines of C code, or 14 lines of
+ debian/changelog per second. That's **30 times slower than markdown**!
+
+ By comparison, source-highlight can do about 5000 lines of C code per
+ second... And launching the program 100 times on an empty file takes about
+ 5 seconds, which isn't bad. And, it has a C++ library, which it
+ seems likely perl bindings could be written for, to eliminate
+ even that overhead.
+ > [highlight](http://www.andre-simon.de) has similar features to source-highlight, and swig bindings
+ > that should make it trivial in principle to call from perl. I like highlight a bit better because
+ > it has a pass-through feature that I find very useful. My memory is unfortunately a bit fuzzy as to how
+ > well the swig bindings work. [[DavidBremner]]
+
+* Engines that already support a wide variety of file types are of
+ course preferred. If the engine doesn't support a particular type
+ of file, it could fall back to doing something simple like
+ adding line numbers. (IkiWiki-Plugin-syntax does this.)
+* XHTML output.
+* Emitting html that uses CSS to control the display is preferred,
+ since it allows for easy user customization. (Engine::Simple does
+ this; Kate can be configured to do it; source-highlight can be
+ made to do it via the switches `--css /dev/null --no-doc`)
* Nothing seems to support
[[wiki-formatted_comments|wiki-formatted_comments_with_syntax_plugin]]
inside source files. Doing this probably means post-processing the
@@ -69,65 +97,24 @@ releases the 5 or 6 language definitions he has running on his web site, it migh
* The whole-file plugins all get confused if there is a `foo.c` and a `foo.h`.
This is trivially fixable now by passing the keepextension option when
- registering the htmlize hooks, though.
+ registering the htmlize hooks, though. There's also a noextension option
+ that should handle the case of source files with names that do not
+ contain an extension (ie, "Makefile") -- in this case you just register
+ the whole filename in the htmlize hook (see the sketch at the end of
+ this page).
* Whole-file plugins register a bunch of htmlize hooks. The wacky thing
about it is that, when creating a new page, you can then pick "c" or
- "h" or "pl" etc from the dropdown that normally has "mdwn" etc in it.
- Is this a bug, or a feature? (Even if a feature, plugins with many
- extensions make the dropdown unusable.. One way to deal with that is have
- a config setting that lists what extensions to offer highlighting for.
- Most people won't need/want the dozens some engines support.)
-* The per page highlighters can't handle creating wiki pages from
- "Makefile", or other files without a significant extension.
- Not clear how to fix this, as ikiwiki is very oriented toward file
- extensions. The workaround is to use a directive on a wiki page, pulling
- in the Makefile.
-
- > I wonder how hard it would be to make a patch whereby a file with
- > no `.` in the name, and a name that matches a filetype, and where
- > that filetype was registered `keepextension`, then the file is just
- > chosen as the appropriate type. This would allow `Makefile` to
- > work.
-
-like this:
-
- diff --git a/IkiWiki.pm b/IkiWiki.pm
- index 8d728c9..1bd46a9 100644
- --- a/IkiWiki.pm
- +++ b/IkiWiki.pm
- @@ -618,6 +618,8 @@ sub pagetype ($) {
-
- if ($page =~ /\.([^.]+)$/) {
- return $1 if exists $hooks{htmlize}{$1};
- + } elsif ($hooks{htmlize}{$page}{keepextension}) {
- + return $page;
- }
- return;
- }
-
-## format directive
-
-Rather than making syntax highlight plugins have to provide a preprocessor
-directive as well as handling whole source files, perhaps a generic format
-directive could be used:
-
- \[[!format pl """..."""]]
-
-That would run the text through the pl htmlizer, from the syntax hightligh
-plugin. OTOH, if "rst" were given, it would run the text through the rst
-htmlizer. So, more generic, allows mixing different types of markup on one
-page, as well as syntax highlighting. Does require specifying the type of
-format, instead of allowing it to be guessed (which some syntax highlighters
-can do). (This directive is now implemented..)
-
-Hmm, this would also allow comments inside source files to have mdwn
-embedded in them, without making the use of mdwn a special case, or needing
-to postprocess the syntax highlighter output to find comments.
-
- /* \[[!format mdwn """
+ "h" or "pl" etc from the dropdown that normally has "Markdown" etc in it.
+ Is this a bug, or a feature? Even if a feature, plugins with many
+ extensions make the dropdown unusable..
- This is a comment in my C file. You can use mdwn in here.
+ Perhaps the thing to do here is to use the new `longname` parameter to
+ the format hook, to give them all names that will group together at or
+ near the end of the list. Ie: "Syntax: perl", "Source code: c", etc.
- """]] */
+---
-Note that this assumes that directives are expanded in source files.
+I'm calling this [[done]] since I added the [[plugins/highlight]]
+plugin. There are some unresolved issues touched on here,
+but they either have their own bug reports, or are documented
+as semi-features in the plugin's docs. --[[Joey]]
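+
+---
+
+For illustration only, here is a minimal sketch of how a whole-file
+highlighting plugin might register its htmlize hooks, using the
+keepextension, noextension and longname options discussed above. This is
+a hypothetical plugin, not the actual [[plugins/highlight]] code:
+
+    package IkiWiki::Plugin::myhighlight;
+
+    use warnings;
+    use strict;
+    use IkiWiki 3.00;
+
+    sub import {
+        # extension-based type: foo.c and foo.h stay distinct pages
+        hook(type => "htmlize", id => "c", call => \&htmlize,
+            keepextension => 1, longname => "Source code: c");
+        # extensionless file: register the whole filename as the id
+        hook(type => "htmlize", id => "Makefile", call => \&htmlize,
+            noextension => 1, longname => "Source code: Makefile");
+    }
+
+    sub htmlize {
+        my %params=@_;
+        # a real plugin would hand $params{content} to a highlighting
+        # engine here; this stub just escapes it and wraps it in <pre>
+        my $text=$params{content};
+        $text=~s/&/&amp;/g;
+        $text=~s/</&lt;/g;
+        $text=~s/>/&gt;/g;
+        return "<pre>$text</pre>";
+    }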
diff --git a/doc/todo/syntax_highlighting/discussion.mdwn b/doc/todo/syntax_highlighting/discussion.mdwn
index 7a4095c65..27cb7084b 100644
--- a/doc/todo/syntax_highlighting/discussion.mdwn
+++ b/doc/todo/syntax_highlighting/discussion.mdwn
@@ -24,3 +24,5 @@ repository? --[[JasonBlevins]]
>> [[sourcecode|todo/automatic_use_of_syntax_plugin_on_source_code_files/discussion]]
>> plugin only adds the file extensions listed in the config. This shouldn't cause
>> massive drop-down menu pollution. -- [[Will]]
+
+>>> That seems to be the way to go! --[[Joey]]
diff --git a/doc/todo/target_filter_for_brokenlinks.mdwn b/doc/todo/target_filter_for_brokenlinks.mdwn
new file mode 100644
index 000000000..137277c21
--- /dev/null
+++ b/doc/todo/target_filter_for_brokenlinks.mdwn
@@ -0,0 +1,9 @@
+[[!tag wishlist]]
+
+Currently, [[plugins/brokenlinks]] supports filtering by the place where a broken wikilink is used.
+
+Filtering by the target of the broken link would also be useful, e.g.,
+
+ \[[!brokenlinks matching="tagbase/*"]]
+
+would list the tags not yet "filled out". --Ivan Z.
diff --git a/doc/todo/tracking_bugs_with_dependencies.mdwn b/doc/todo/tracking_bugs_with_dependencies.mdwn
index 8b36f1e59..3a761731b 100644
--- a/doc/todo/tracking_bugs_with_dependencies.mdwn
+++ b/doc/todo/tracking_bugs_with_dependencies.mdwn
@@ -196,21 +196,108 @@ account all comments above (which doesn't mean it is above reproach :) ). --[[W
> Very belated code review of last version of the patch:
>
> * `is_globlist` is no longer needed
+
+>> Good :)
+
> * I don't understand why the pagespec match regexp is changed
> from having flags `igx` to `ixgs`. Don't see why you
> want `.` to match `\n` in it, and don't see any `.` in the regexp
> anyway?
+
+>> Because you have to define all the named pagespecs in the pagespec, you sometimes end up with very long pagespecs. I found it useful to split them over multiple lines. That didn't work at one point and I added the 's' to make it work. I may have further altered the regex since then to make the 's' redundant. Remove it and see if multi-line pagespecs still work. :)
+
+>>> Well, I can tell you that multi-line pagespecs are supported w/o
+>>> your patch .. I use them all the time. The reason I find your
+>>> use of `/s` unlikely is because without it `\s` already matches
+>>> a newline. Only if you want to treat a newline as non-whitespace
+>>> is `/s` typically necessary. --[[Joey]]
+
> * Some changes of `@_` to `%params` in `pagespec_makeperl` do not
> make sense to me. I don't see where \%params is defined and populated,
> except with `\$params{specFunc}`.
+
+>> I'm not a perl hacker. This was a mighty battle for me to get going.
+>> There is probably some battlefield carnage from my early struggles
+>> learning perl left here. Part of this is that @_ / @params already
+>> existed as a way of passing in extra parameters. I didn't want to
+>> pollute that top level namespace - just add my own parameter (a hash)
+>> which contained the data I needed.
+
+>>> I think I understand how the various `%params`
+>>> (there's not just one) work in your code now, but it's really a mess.
+>>> Explaining it in words would take pages.. It could be fixed by,
+>>> in `pagespec_makeperl` something like:
+>>>
+>>> my %specFuncs;
+>>> push @_, specFuncs => \%specFuncs;
+>>>
+>>> With that you have the hash locally available for populating
+>>> inside `pagespec_makeperl`, and when the `match_*` functions
+>>> are called the same hash data will be available inside their
+>>> `@_` or `%params`. No need to change how the functions are called
+>>> or do any of the other hacks.
+>>>
+>>> Currently, specFuncs is populated by building up code
+>>> that recursively calls `pagespec_makeperl`, and is then
+>>> evaluated when the pagespec gets evaluated. My suggested
+>>> change to `%params` will break that, but that had to change
+>>> anyway.
+>>>
+>>> It probably has a security hole, and is certainly inviting
+>>> one, since the pagespec definition is matched by a loose regexp (`.*`)
+>>> and then subject to string interpolation before being evaluated
+>>> inside perl code. I recently changed ikiwiki to never interpolate
+>>> user-supplied strings when translating pagespecs, and that
+>>> needs to happen here too. The obvious way, it seems to me,
+>>> is to not generate perl code, but just directly run perl code that
+>>> populates specFuncs.
+
+>>>> I don't think this is as bad as you make out, but your addition of the
+>>>> data array will break with the recursion my patch adds in pagespec_makeperl.
+>>>> To fix that I'll need to pass a reference to that array into pagespec_makeperl.
+>>>> I think I can then do the same thing to $params{specFuncs}. -- [[Will]]
+
+>>>>> You're right -- I did not think the recursive case through.
+>>>>> --[[Joey]]
+
> * Seems that the only reason `match_glob` has to check for `~` is
> because when a named spec appears in a pagespec, it is translated
> to `match_glob("~foo")`. If, instead, `pagespec_makeperl` checked
> for named specs, it could convert them into `check_named_spec("foo")`
> and avoid that ugliness.
+
+>> Yeah - I wanted to make named specs syntactically different on my first pass. You are right in that this could be made a fallback - named specs always override pagenames.
+
> * The changes to `match_link` seem either unecessary, or incomplete.
> Shouldn't it check for named specs and call
> `check_named_spec_existential`?
+
+>> An earlier version did. Then I realised it wasn't actually needed in that case - match_link() already included a loop that was like a type of existential matching. Each time through the loop it would
+>> call match_glob(). match_glob() in turn will handle the named spec. I tested this version briefly and it seemed to work. I remember looking at this again later and wondering if I had mis-understood
+>> some of the logic in match_link(), which might mean there are cases where you would need an explicit call to check_named_spec_existential() - I never checked it properly after having that thought.
+
+>>> In the common case, `match_link` does not call `match_glob`,
+>>> because the link target it is being asked to check for is a single
+>>> page name, not a glob.
+
+>>>> A named pagespec should fall into the glob case. These two pagespecs should be the same:
+
+ link(a*)
+
+>>>> and
+
+ define(aStar, a*) and link(~aStar)
+
+>>>> In the first case, we want the pagespec to match any page that links to a page matching the glob.
+>>>> In the second case, we want the pagespec to match any page that links to a page matching the named spec.
+>>>> match_link() was already doing the existential part. The patches to this code were simply to remove the `lc()`
+>>>> call from the named pagespec name. Can that `lc` be removed entirely? -- [[Will]]
+
+>>>>> I think we could get rid of it. `bestlink` will lc it itself
+>>>>> if the uppercase version does not exist; `match_glob` matches
+>>>>> insensitively.
+>>>>> --[[Joey]]
+
> * Generally, the need to modify `match_*` functions so that they
> check for and handle named pagespecs seems suboptimal, if
> only because there might be others people may want to use named
@@ -221,120 +308,304 @@ account all comments above (which doesn't mean it is above reproach :) ). --[[W
> that is not a page name at all, and it could be weird
> if such a parameter were accidentially interpreted as a named
> pagespec. (But, that seems unlikely to happen.)
+
+>> Possibly. I'm not sure which I prefer between the current solution and that one. Each has advantages and disadvantages.
+>> It really isn't much code for the match functions to add a call to check_named_spec_existential().
+
+>>> But if a plugin adds its own match function, it has
+>>> to explicitly call that code to support named pagespecs.
+
+>>>> Yes, and it can do that in just three lines of code. But if we automatically check for named pagespecs all the time we
+>>>> potentially break any matching function that doesn't accept pages, or wants to use multiple arguments.
+
+>>>>> 3 lines of code, plus the functions called become part of the API,
+>>>>> don't forget about that..
+>>>>>
+>>>>> Yes, I think that is the tradeoff, the question is whether to export
+>>>>> the additional complexity needed for that flexibility.
+>>>>>
+>>>>> I'd be surprised if multiple argument pagespecs become necessary..
+>>>>> with the exception of this patch there has been no need for them yet.
+>>>>>
+>>>>> There are lots of pagespecs that take data other than pages,
+>>>>> indeed, that's really the common case. So far, none of them
+>>>>> seem likely to take data that starts with a `~`. Perhaps
+>>>>> the thing to do would be to check if `~foo` is a known,
+>>>>> named pagespec, and if not, just pass it through unchanged.
+>>>>> Then there's little room for ambiguity, and this also allows
+>>>>> pagespecs like `glob(~foo*)` to match the literal page `~foo`.
+>>>>> (It will make pagespec_merge even harder tho.. see below.)
+>>>>> --[[Joey]]
+
+>>>>>> I've already used multi-argument pagespec match functions in
+>>>>>> my data plugin. It is used for having different types of links. If
+>>>>>> you want to have multiple types of links, then the match function
+>>>>>> for them needs to take both the link name and the link type.
+>>>>>> I'm trying to think of a way we could have both - automatically
+>>>>>> handle the existential case unless the function indicates somehow
+>>>>>> that it'll do it itself. Any ideas? -- [[Will]]
+
> * I need to check if your trick to avoid infinite recursion
> works if there are two named specs that recursively
> call one-another. I suspect it does, but will test this
> myself..
->
+
+>> It worked for me. :)
+
+> * I also need to verify if memoizing the named pagespecs has
+> really guarded against very expensive pagespecs DOSing the wiki..
+
> --[[Joey]]
+>> There is one issue that I've been thinking about that I haven't raised anywhere (or checked myself), and that is how this all interacts with page dependencies.
+>> Firstly, I'm not sure anymore that the `pagespec_merge` function will continue to work in all cases.
+
+>>> The problem I can see there is that if two pagespecs
+>>> get merged and both use `~foo` but define it differently,
+>>> then the second definition might be used at a point when
+>>> it shouldn't (but I haven't verified that really happens).
+>>> That could certainly be a show-stopper. --[[Joey]]
+
+>>>> I think this can happen in the new closure based code. I don't think this could happen in the old code. -- [[Will]]
+
+>>>> Even if that works, this is a good argument for having a syntactic difference between named pagespecs and normal pages.
+>>>> If you're joining two pagespecs with 'or', you don't want a named pagespec in the first part overriding a page name in the
+>>>> second part. Oh, and I assume 'or' has the right operator precedence that "a and b or c" is "(a and b) or c", and not "a and (b or c)" -- [[Will]]
+
+>>>>> Looks like it's bracketed in the code anyway... -- [[Will]]
+
+>>>> Perhaps the thing to do is to have a `clear_defines()`
+>>>> function, then merging `A` and `B` yields `(A) or (clear_defines() and (B))`.
+>>>> That would deal with both the cases where `A` and `B` differently
+>>>> define `~foo` as well as with the case where `A` defines `~foo` while
+>>>> `B` uses it to refer to a literal page.
+>>>> --[[Joey]]
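+
+(For illustration only, a rough sketch of what such a merge could look
+like -- `clear_defines()` is hypothetical and does not currently exist:)
+
+    sub pagespec_merge ($$) {
+        my $a=shift;
+        my $b=shift;
+        return $a if $a eq $b;
+        # reset any named pagespecs defined in $a before evaluating $b,
+        # so a ~foo defined in $a cannot leak into (or shadow) $b
+        return "($a) or (clear_defines() and ($b))";
+    }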
+
+>>>>> I don't think this will work with the new patch, and I don't think it was needed with the old one.
+>>>>> Under the old patch, pagespec_makeperl() generated a string of unevaluated, self-contained, perl
+>>>>> code. When a new named pagespec was defined, a recursive call was made to get the perl code
+>>>>> for the pagespec, and then that code was used to add something like `$params{specFuncs}->{name} = sub {recursive code} and `
+>>>>> to the result of the calling function. This means that at pagespec testing time, when this code is executed, the
+>>>>> specFuncs hash is built up as the pagespec is checked. In the case of the 'or' used above, later redefinitions of
+>>>>> a named pagespec would have redefined the specFunc at the right time. It should have just worked. However...
+
+>>>>> Since my original patch, you started using closures for security reasons (and I can see the case for that). Unfortunately this
+>>>>> means that the generated perl code is no longer self-contained - it needs to be evaluated in the same closure it was generated
+>>>>> so that it has access to the data array. To make this work with the recursive call I had two options: a) make the data array a
+>>>>> reference that I pass around through the pagespec_makeperl() functions and have available when the code is finally evaluated
+>>>>> in pagespec_translate(), or b) make sure that each pagespec is evaluated in its correct closure and a perl function is returned, not a
+>>>>> string containing unevaluated perl code.
+
+>>>>> I went with option b). I did it in such a way that the hash of specfuncs is built up at translation time, not at execution time. This
+>>>>> means that with the new code you can call specfuncs that get defined out of order:
+
+ ~test and define(~test, blah)
+
+>>>>> but it also means that using a simple 'or' to join two pagespecs won't work. If you do something like this:
+
+ ~test and define(~test, foo) and define(~test, baz)
+
+>>>>> then the last definition (baz) takes precedence.
+>>>>> In the process of writing this I think I've come up with a way to change this back to the way it was, still using closures. -- [[Will]]
+
+>> Secondly, it seems that there are two types of dependency, and ikiwiki
+>> currently only handles one of them. The first type is "Rebuild this
+>> page when any of these other pages changes" - ikiwiki handles this.
+>> The second type is "rebuild this page when the set of pages referred to by
+>> this pagespec changes" - ikiwiki doesn't seem to handle this. I
+>> suspect that named pagespecs would make that second type of dependency
+>> more important. I'll try to come up with a good example. -- [[Will]]
+
+>>> Hrm, I was going to build an example of this with backlinks, but it
+>>> looks like that is handled as a special case at the moment (line 458 of
+>>> render.pm). I'll see if I can break
+>>> things another way. Fixing this properly would allow removal of that special case. -- [[Will]]
+
+>>>> I can't quite understand the distinction you're trying to draw
+>>>> between the two types of dependencies. Backlinks are a very special
+>>>> case though and I'll be surprised if they fit well into pagespecs.
+>>>> --[[Joey]]
+
+>>>>> The issue is that the existential pagespec matching allows you to build things that have similar
+>>>>> problems to backlinks.
+>>>>> e.g. the following inline:
+
+ \[[!inline pages="define(~done, link(done)) and link(~done)" archive=yes]]
+
+>>>>> includes any page that links to a page that links to done. Now imagine I add a new link to 'done' on
+>>>>> some random page somewhere - a page which some other page links to, but which didn't previously get included. The set of pages accepted by the pagespec, and hence the set of
+>>>>> pages inlined, will change. But, there is no dependency anywhere on the page that I altered, so
+>>>>> ikiwiki will not rebuild the page with the inline in it. What is happening is that the page that I altered affects
+>>>>> the set of pages matched by the pagespec without itself being matched by the pagespec, and hence without being included in the dependency list.
+
+>>>>> To make this work well, I think you need to recognise two types of dependencies for each page (and no
+>>>>> special cases for particular types of links, eg backlinks). The first type of dependency says, "The content of
+>>>>> this page depends upon the content of these other pages". The `add_depends()` in the shortcuts
+>>>>> plugin is of this form: any time the shortcuts page is edited, any page with a shortcut on it
+>>>>> is rebuilt. The inline plugin also needs to add dependencies of this form to detect when the inlined
+>>>>> content changes. By contrast, the map plugin does not need a dependency of this form, because it
+>>>>> doesn't actually care about the content of any pages, just which pages it needs to include (which we'll handle next).
+
+>>>>> The second type of dependency says, "The content of this page depends upon the exact set of pages matched
+>>>>> by this pagespec". The first type of dependency was about the content of some pages, the second type is about
+>>>>> which pages get matched by a pagespec. This is the type of dependency tracking that the map plugin needs.
+>>>>> If the set of pages matched by the map pagespec changes, then the page with the map on it needs to be rebuilt to show a different list of pages.
+>>>>> Inline needs this type of dependency as well as the previous type - this type handles a change in which pages
+>>>>> are inlined, the previous type handles a change in the content of any of those pages. Shortcut does not need this type of
+>>>>> dependency. Most of the places that use `add_depends()` seem to need this type of dependency rather than the first type.
+
+>>>>>> Note that inline and map currently achieve the second type of dependency by
+>>>>>> explicitly calling `add_depends` for each page they display.
+>>>>>> If any of those pages are removed, the regular pagespec would not
+>>>>>> match them -- since they're gone. However, the explicit dependency
+>>>>>> on them does cause them to match. It's an ugly corner I'd like to
+>>>>>> get rid of. --[[Joey]]
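+
+(For illustration, roughly what that explicit per-page dependency looks
+like inside the inline plugin -- simplified, not the exact code:)
+
+    foreach my $displayed (@list) {
+        # explicitly depend on each displayed page, so that removing
+        # one of them still triggers a rebuild of the inlining page
+        add_depends($params{page}, $displayed);
+    }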
+
+>>>>> Implementation Details: The first type of dependency can be handled very similarly to the current
+>>>>> dependency system. You just need to keep a list of pages that the content depends upon. You could
+>>>>> keep that list as a pagespec, but if you do this you might want to check that the pagespec doesn't change,
+>>>>> possibly by adding a dependency of the second type along with the dependency of the first type.
+
+>>>>>> An example of the current system not tracking enough data is
+>>>>>> where A inlines B which inlines C. A change to C will cause B to
+>>>>>> rebuild, but A will not "notice" that B has implicitly changed.
+>>>>>> That example suggests it might be fixable without explicitly storing
+>>>>>> data, by causing a rebuild of B to be treated as a change to B.
+>>>>>> --[[Joey]]
+
+>>>>> The second type of dependency is a little more tricky. For each page, we'd need a list of pagespecs that
+>>>>> the page depended on, and for each pagespec you'd want to store the list of pages that currently match it.
+>>>>> On refresh, you'd need to check each pagespec to see if the set of pages that match it has changed, and if
+>>>>> that set has changed, then rebuild the dependent page(s). Oh, and for this second type of dependency, I
+>>>>> don't think you can merge pagespecs. If I wanted to know if either "\*" or "link(done)" changes, then just checking
+>>>>> to see if the set of pages matched by "\* or link(done)" changes doesn't work.
+
+>>>>> The current system works because even though you usually want dependencies of the second type, the set of pages
+>>>>> referred to by a pagespec can only change if one of those pages itself changes. i.e. A dependency check of the
+>>>>> first type will catch a dependency change of the second type with current pagespecs.
+>>>>> This doesn't work with backlinks, and it doesn't work with existential matching. Backlinks are currently special-cased. I don't know
+>>>>> how to special-case existential matching - I suspect you're better off just getting the dependency tracking right.
+
+>>>>> I also tried to come up with other possible solutions: e.g. can we find the dependencies for a pagespec? That
+>>>>> would be the set of pages where a change on one of those pages could lead to a change in the set of pages matched by the pagespec.
+>>>>> For old-style pagespecs without backlinks, the dependency set for a pagespec is the same as the set of pages the pagespec matches.
+>>>>> Unfortunately, with existential matching, the set of pages that each
+>>>>> pagespec depends upon can quickly become "*", which is not very useful. -- [[Will]]
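+
+(For illustration only, a rough sketch of the second type of dependency
+check described above -- the data structure and function names here are
+hypothetical, not existing ikiwiki code:)
+
+    # $pagespec_deps{$page}{$spec} holds the set of pages that matched
+    # $spec when $page was last rendered
+    my %pagespec_deps;
+
+    sub pagespec_set_changed ($$) {
+        my $spec=shift;
+        my $oldset=shift;   # hash ref of previously matched pages
+        my %now=map { $_ => 1 }
+            grep { pagespec_match($_, $spec) } keys %pagesources;
+        return 1 if scalar keys %now != scalar keys %$oldset;
+        foreach my $p (keys %now) {
+            return 1 unless exists $oldset->{$p};
+        }
+        return 0;   # the same set of pages still matches
+    }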
+
+Patch updated to use closures rather than inline generated code for named pagespecs. Also includes some new use of ErrorReason where appropriate. -- [[Will]]
+
+> * Perl really doesn't need forward declarations, honest!
+
+>> It complained (warning, not error) when I didn't use the forward declaration. :(
+
+> * I have doubts about memoizing the anonymous sub created by
+> `pagespec_translate`.
+
+>> This is there explicitly to make sure that runtime is polynomial and not exponential.
+
+> * Think where you wrote `+{}` you can just write `{}`
+
+>> Possibly :) -- [[Will]]
+
----
diff --git a/IkiWiki.pm b/IkiWiki.pm
- index 4e4da11..8b3cdfe 100644
+ index 061a1c6..1e78a63 100644
--- a/IkiWiki.pm
+++ b/IkiWiki.pm
- @@ -1550,7 +1550,16 @@ sub globlist_to_pagespec ($) {
-
- sub is_globlist ($) {
- my $s=shift;
- - return ( $s =~ /[^\s]+\s+([^\s]+)/ && $1 ne "and" && $1 ne "or" );
- + return ! ($s =~ /
- + (^\s*
- + [^\s(]+ # single item
- + (\( # possibly with parens after it
- + ([^)]* # with stuff inside those parens
- + (\([^)]*\))*)* # maybe even nested parens
- + \))?\s*$
- + ) |
- + (\s and \s) | (\s or \s) # or we find 'and' or 'or' somewhere
- + /xs);
- }
-
- sub safequote ($) {
- @@ -1631,7 +1640,7 @@ sub pagespec_merge ($$) {
+ @@ -1774,8 +1774,12 @@ sub pagespec_merge ($$) {
return "($a) or ($b)";
}
-sub pagespec_translate ($) {
- +sub pagespec_makeperl ($) {
+ +# is perl really so dumb it requires a forward declaration for recursive calls?
+ +sub pagespec_translate ($$);
+ +
+ +sub pagespec_translate ($$) {
my $spec=shift;
+ + my $specFuncsRef=shift;
- # Support for old-style GlobLists.
- @@ -1650,12 +1659,14 @@ sub pagespec_translate ($) {
+ # Convert spec to perl code.
+ my $code="";
+ @@ -1789,7 +1793,9 @@ sub pagespec_translate ($) {
|
\) # )
|
- \w+\([^\)]*\) # command(params)
- + define\(\s*~\w+\s*,((\([^()]*\)) | ([^()]+))+\) # define(~specName, spec) - spec can contain parens 1 deep
+ + define\(\s*~\w+\s*,((\([^()]*\)) | ([^()]+))+\) # define(~specName, spec) - spec can contain parens 1 deep
+ |
+ \w+\([^()]*\) # command(params) - params cannot contain parens
|
[^\s()]+ # any other text
)
- \s* # ignore whitespace
- - }igx) {
- + }igxs) {
- my $word=$1;
- if (lc $word eq 'and') {
- $code.=' &&';
- @@ -1666,16 +1677,23 @@ sub pagespec_translate ($) {
+ @@ -1805,10 +1811,19 @@ sub pagespec_translate ($) {
elsif ($word eq "(" || $word eq ")" || $word eq "!") {
$code.=' '.$word;
}
- elsif ($word =~ /^(\w+)\((.*)\)$/) {
- + elsif ($word =~ /^define\(\s*~(\w+)\s*,(.*)\)$/s) {
- + $code .= " (\$params{specFuncs}->{$1}="; # (exists \$params{specFuncs}) &&
- + $code .= "memoize(";
- + $code .= &pagespec_makeperl($2);
- + $code .= ")";
- + $code .= ") ";
+ + elsif ($word =~ /^define\(\s*(~\w+)\s*,(.*)\)$/s) {
+ + my $name = $1;
+ + my $subSpec = $2;
+ + my $newSpecFunc = pagespec_translate($subSpec, $specFuncsRef);
+ + return if $@ || ! defined $newSpecFunc;
+ + $specFuncsRef->{$name} = $newSpecFunc;
+ + push @data, qq{Created named pagespec "$name"};
+ + $code.="IkiWiki::SuccessReason->new(\$data[$#data])";
+ }
+ elsif ($word =~ /^(\w+)\((.*)\)$/s) {
if (exists $IkiWiki::PageSpec::{"match_$1"}) {
- - $code.="IkiWiki::PageSpec::match_$1(\$page, ".safequote($2).", \@_)";
- + $code.="IkiWiki::PageSpec::match_$1(\$page, ".safequote($2).", \%params)";
+ push @data, $2;
+ - $code.="IkiWiki::PageSpec::match_$1(\$page, \$data[$#data], \@_)";
+ + $code.="IkiWiki::PageSpec::match_$1(\$page, \$data[$#data], \@_, specFuncs => \$specFuncsRef)";
}
else {
- $code.=' 0';
- }
+ push @data, qq{unknown function in pagespec "$word"};
+ @@ -1817,7 +1832,7 @@ sub pagespec_translate ($) {
}
else {
- - $code.=" IkiWiki::PageSpec::match_glob(\$page, ".safequote($word).", \@_)";
- + $code.=" IkiWiki::PageSpec::match_glob(\$page, ".safequote($word).", \%params)";
+ push @data, $word;
+ - $code.=" IkiWiki::PageSpec::match_glob(\$page, \$data[$#data], \@_)";
+ + $code.=" IkiWiki::PageSpec::match_glob(\$page, \$data[$#data], \@_, specFuncs => \$specFuncsRef)";
}
}
- @@ -1683,8 +1701,18 @@ sub pagespec_translate ($) {
- $code=0;
+ @@ -1826,7 +1841,7 @@ sub pagespec_translate ($) {
}
- + return 'sub { my $page=shift; my %params = @_; '.$code.' }';
- +}
- +
- +sub pagespec_translate ($) {
- + my $spec=shift;
- +
- + my $code = pagespec_makeperl($spec);
- +
- + # print STDERR "Spec '$spec' generated code '$code'\n";
- +
no warnings;
- return eval 'sub { my $page=shift; '.$code.' }';
- + return eval $code;
+ + return eval 'memoize (sub { my $page=shift; '.$code.' })';
}
sub pagespec_match ($$;@) {
- @@ -1699,7 +1727,7 @@ sub pagespec_match ($$;@) {
+ @@ -1839,7 +1854,7 @@ sub pagespec_match ($$;@) {
+ unshift @params, 'location';
+ }
- my $sub=pagespec_translate($spec);
- return IkiWiki::FailReason->new("syntax error in pagespec \"$spec\"") if $@;
- - return $sub->($page, @params);
- + return $sub->($page, @params, specFuncs => {});
- }
+ - my $sub=pagespec_translate($spec);
+ + my $sub=pagespec_translate($spec, +{});
+ return IkiWiki::ErrorReason->new("syntax error in pagespec \"$spec\"")
+ if $@ || ! defined $sub;
+ return $sub->($page, @params);
+ @@ -1850,7 +1865,7 @@ sub pagespec_match_list ($$;@) {
+ my $spec=shift;
+ my @params=@_;
+ - my $sub=pagespec_translate($spec);
+ + my $sub=pagespec_translate($spec, +{});
+ error "syntax error in pagespec \"$spec\""
+ if $@ || ! defined $sub;
+
+ @@ -1872,7 +1887,7 @@ sub pagespec_match_list ($$;@) {
sub pagespec_valid ($) {
- @@ -1748,11 +1776,78 @@ sub new {
+ my $spec=shift;
+
+ - my $sub=pagespec_translate($spec);
+ + my $sub=pagespec_translate($spec, +{});
+ return ! $@;
+ }
+
+ @@ -1919,6 +1934,68 @@ sub new {
package IkiWiki::PageSpec;
@@ -342,15 +613,14 @@ account all comments above (which doesn't mean it is above reproach :) ). --[[W
+ my $page=shift;
+ my $specName=shift;
+ my %params=@_;
- +
- + error("Unable to find specFuncs in params to check_named_spec()!") unless exists $params{specFuncs};
+ +
+ + return IkiWiki::ErrorReason->new("Unable to find specFuncs in params to check_named_spec()!")
+ + unless exists $params{specFuncs};
+
+ my $specFuncsRef=$params{specFuncs};
- +
- + return IkiWiki::FailReason->new("Named page spec '$specName' is not valid")
+ +
+ + return IkiWiki::ErrorReason->new("Named page spec '$specName' is not valid")
+ unless (substr($specName, 0, 1) eq '~');
- +
- + $specName = substr($specName, 1);
+
+ if (exists $specFuncsRef->{$specName}) {
+ # remove the named spec from the spec refs
@@ -361,7 +631,7 @@ account all comments above (which doesn't mean it is above reproach :) ). --[[W
+ $specFuncsRef->{$specName} = $sub;
+ return $result;
+ } else {
- + return IkiWiki::FailReason->new("Page spec '$specName' does not exist");
+ + return IkiWiki::ErrorReason->new("Page spec '$specName' does not exist");
+ }
+}
+
@@ -370,14 +640,14 @@ account all comments above (which doesn't mean it is above reproach :) ). --[[W
+ my $specName=shift;
+ my $funcref=shift;
+ my %params=@_;
- +
- + error("Unable to find specFuncs in params to check_named_spec_existential()!") unless exists $params{specFuncs};
+ +
+ + return IkiWiki::ErrorReason->new("Unable to find specFuncs in params to check_named_spec_existential()!")
+ + unless exists $params{specFuncs};
+ my $specFuncsRef=$params{specFuncs};
+
- + return IkiWiki::FailReason->new("Named page spec '$specName' is not valid")
+ + return IkiWiki::ErrorReason->new("Named page spec '$specName' is not valid")
+ unless (substr($specName, 0, 1) eq '~');
- + $specName = substr($specName, 1);
- +
+ +
+ if (exists $specFuncsRef->{$specName}) {
+ # remove the named spec from the spec refs
+ # when we recurse to avoid infinite recursion
@@ -389,7 +659,7 @@ account all comments above (which doesn't mean it is above reproach :) ). --[[W
+ my $tempResult = $funcref->($page, $nextpage, %params);
+ if ($tempResult) {
+ $specFuncsRef->{$specName} = $sub;
- + return $tempResult;
+ + return IkiWiki::SuccessReason->new("Existential check of '$specName' matches because $tempResult");
+ }
+ }
+ }
@@ -397,12 +667,14 @@ account all comments above (which doesn't mean it is above reproach :) ). --[[W
+ $specFuncsRef->{$specName} = $sub;
+ return IkiWiki::FailReason->new("No page in spec '$specName' was successfully matched");
+ } else {
- + return IkiWiki::FailReason->new("Named page spec '$specName' does not exist");
+ + return IkiWiki::ErrorReason->new("Named page spec '$specName' does not exist");
+ }
+}
+
- sub match_glob ($$;@) {
- my $page=shift;
+ sub derel ($$) {
+ my $path=shift;
+ my $from=shift;
+ @@ -1937,6 +2014,10 @@ sub match_glob ($$;@) {
my $glob=shift;
my %params=@_;
@@ -410,30 +682,31 @@ account all comments above (which doesn't mean it is above reproach :) ). --[[W
+ return check_named_spec($page, $glob, %params);
+ }
+
- my $from=exists $params{location} ? $params{location} : '';
-
- # relative matching
- @@ -1782,11 +1877,12 @@ sub match_internal ($$;@) {
+ $glob=derel($glob, $params{location});
+
+ my $regexp=IkiWiki::glob2re($glob);
+ @@ -1959,8 +2040,9 @@ sub match_internal ($$;@) {
sub match_link ($$;@) {
my $page=shift;
- my $link=lc(shift);
- + my $fulllink=shift;
+ + my $fullLink=shift;
my %params=@_;
- + my $link=lc($fulllink);
+ + my $link=lc($fullLink);
+ $link=derel($link, $params{location});
my $from=exists $params{location} ? $params{location} : '';
- -
- +
- # relative matching
- if ($link =~ m!^\.! && defined $from) {
- $from=~s#/?[^/]+$##;
- @@ -1804,19 +1900,32 @@ sub match_link ($$;@) {
+ @@ -1975,25 +2057,37 @@ sub match_link ($$;@) {
}
else {
return IkiWiki::SuccessReason->new("$page links to page $p matching $link")
- if match_glob($p, $link, %params);
- + if match_glob($p, $fulllink, %params);
+ + if match_glob($p, $fullLink, %params);
+ $p=~s/^\///;
+ $link=~s/^\///;
+ return IkiWiki::SuccessReason->new("$page links to page $p matching $link")
+ - if match_glob($p, $link, %params);
+ + if match_glob($p, $fullLink, %params);
}
}
return IkiWiki::FailReason->new("$page does not link to $link");
@@ -455,23 +728,24 @@ account all comments above (which doesn't mean it is above reproach :) ). --[[W
sub match_created_before ($$;@) {
my $page=shift;
my $testpage=shift;
- + my @params=@_;
+ my %params=@_;
+ -
+
+ if (substr($testpage, 0, 1) eq '~') {
- + return check_named_spec_existential($page, $testpage, \&match_created_before, @params);
+ + return check_named_spec_existential($page, $testpage, \&match_created_before, %params);
+ }
+ +
+ $testpage=derel($testpage, $params{location});
if (exists $IkiWiki::pagectime{$testpage}) {
- if ($IkiWiki::pagectime{$page} < $IkiWiki::pagectime{$testpage}) {
- @@ -1834,6 +1943,11 @@ sub match_created_before ($$;@) {
- sub match_created_after ($$;@) {
- my $page=shift;
+ @@ -2014,6 +2108,10 @@ sub match_created_after ($$;@) {
my $testpage=shift;
- + my @params=@_;
- +
+ my %params=@_;
+
+ if (substr($testpage, 0, 1) eq '~') {
- + return check_named_spec_existential($page, $testpage, \&match_created_after, @params);
+ + return check_named_spec_existential($page, $testpage, \&match_created_after, %params);
+ }
+ +
+ $testpage=derel($testpage, $params{location});
if (exists $IkiWiki::pagectime{$testpage}) {
- if ($IkiWiki::pagectime{$page} > $IkiWiki::pagectime{$testpage}) {
diff --git a/doc/todo/wiki-formatted_comments_with_syntax_plugin.mdwn b/doc/todo/wiki-formatted_comments_with_syntax_plugin.mdwn
index a5244c9ef..7a4a295d4 100644
--- a/doc/todo/wiki-formatted_comments_with_syntax_plugin.mdwn
+++ b/doc/todo/wiki-formatted_comments_with_syntax_plugin.mdwn
@@ -2,3 +2,8 @@
wiki syntax within the comments of code pretty-printed with the
[[plugins/contrib/syntax]] plugin. This would allow the use of links and
formatting in comments.
+
+> You can do this using the [[plugins/highlight]] plugin, but you have
+> to explicitly put a format directive in the comment to do it. Thus,
+> I'm leaving this open for now.. ideally, comments would be detected,
+> and formatted as markdown. --[[Joey]]
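+
+For example (sketch only), an explicit format directive inside a C
+comment might look like this:
+
+    /* \[[!format mdwn """
+    This is a comment in my C file. You can use mdwn in here.
+    """]] */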
diff --git a/doc/users/jasonblevins.mdwn b/doc/users/jasonblevins.mdwn
index 0030a30d3..b50e4844a 100644
--- a/doc/users/jasonblevins.mdwn
+++ b/doc/users/jasonblevins.mdwn
@@ -2,12 +2,12 @@
I'm currently hosting a private ikiwiki for keeping research notes
which, with some patches and a plugin (below), will
-convert inline LaTeX expressions to MathML. I'm working towards a
+convert inline [[todo/LaTeX]] expressions to [[MathML]]. I'm working towards a
patchset and instructions for others to do the same.
I've setup a test ikiwiki [here](http://xbeta.org/colab/) where I've
started keeping a few notes on my progress. There is an example of
-inline SVG on the homepage (note that the logo scales along with the
+inline [[todo/SVG]] on the homepage (note that the logo scales along with the
font size). There are a few example mathematical expressions in the
[sandbox](http://xbeta.org/colab/sandbox/). The MathML is generated
automatically from inline LaTeX expressions using an experimental
@@ -32,23 +32,23 @@ These plugins are experimental. Use them at your own risk. Read the
perldoc documentation for more details. Patches and suggestions are
welcome.
- * [mdwn_itex][] - Works with the `mdwn` plugin to convert inline LaTeX
- expressions to MathML using `itex2MML`.
+ * [mdwn_itex][] - Works with the [[`mdwn`|plugins/mdwn]] plugin to convert inline [[todo/LaTeX]]
+ expressions to [[MathML]] using `itex2MML`.
* [h1title][] - If present, use the leading level 1 Markdown header to
set the page title and remove it from the page body.
- * [code][] - Whole file and inline code snippet syntax highlighting
+ * [code][] - Whole file and inline code snippet [[todo/syntax highlighting]]
via GNU Source-highlight. The list of supported file extensions is
configurable. There is also some preliminary [documentation][code-doc].
See the [FortranWiki](http://fortranwiki.org) for examples.
- * [metamail][] - a plugin for loading metadata from email-style
+ * [metamail][] - a plugin for loading metadata from [[email]]-style
headers at top of a file (e.g., `title: Page Title` or
`date: November 2, 2008 11:14 EST`).
- * [pandoc][] - Markdown page processing via [Pandoc](http://johnmacfarlane.net/pandoc/) (a Haskell library for converting from one markup format to another). LaTeX and
- reStructuredText are optional.
+ * [pandoc][] - [[ikiwiki/Markdown]] page processing via [Pandoc](http://johnmacfarlane.net/pandoc/) (a Haskell library for converting from one markup format to another). [[todo/LaTeX]] and
+ [[reStructuredText|plugins/rst]] are optional.
* [path][] - Provides path-specific template conditionals such as
`IS_HOMEPAGE` and `IN_DIR_SUBDIR`.
diff --git a/doc/wikitemplates.mdwn b/doc/wikitemplates.mdwn
index f365cd5aa..6c0480cea 100644
--- a/doc/wikitemplates.mdwn
+++ b/doc/wikitemplates.mdwn
@@ -36,6 +36,8 @@ located in /usr/share/ikiwiki/templates by default.
[[plugins/comments]] plugin.
* `commentmoderation.tmpl` - This template is used to produce the comment
moderation form.
+* `recentchanges.tmpl` - This template is used for listing a change
+ on the RecentChanges page.
The [[plugins/pagetemplate]] plugin can allow individual pages to use a
different template than `page.tmpl`.