Diffstat (limited to 'doc')
644 files changed, 15277 insertions, 2471 deletions
diff --git a/doc/banned_users.mdwn b/doc/banned_users.mdwn index d2bec90f0..c44f8c587 100644 --- a/doc/banned_users.mdwn +++ b/doc/banned_users.mdwn @@ -1,4 +1,10 @@ -Banned users can be configured in the setup file. +Banned users can be configured in the setup file via the `banned_users` +setting. This is a list of user names, or [[PageSpecs|ikiwiki/PageSpec]] +to ban. Using a PageSpec is useful to block an IP address. + +For example: + + banned_users => ['evilspammer', 'ip(192.168.1.1)'], If a banned user attempts to use the ikiwiki CGI, they will receive a 403 Forbidden webpage indicating they are banned. diff --git a/doc/basewiki/index.mdwn b/doc/basewiki/index.mdwn index 05834e079..4187c1162 100644 --- a/doc/basewiki/index.mdwn +++ b/doc/basewiki/index.mdwn @@ -4,4 +4,4 @@ All wikis are supposed to have a [[SandBox]], so this one does too. ---- -This wiki is powered by [ikiwiki](http://ikiwiki.info/). +This wiki is powered by [[ikiwiki]]. diff --git a/doc/bugs/2.45_Compilation_error.mdwn b/doc/bugs/2.45_Compilation_error.mdwn index c69c2fc25..63147b656 100644 --- a/doc/bugs/2.45_Compilation_error.mdwn +++ b/doc/bugs/2.45_Compilation_error.mdwn @@ -189,3 +189,10 @@ Would you suggest I try rebuilding perl without this patch? Debian has a huge pe it's not straightforward for me to see if they do something similar to Arch. > I think Debian has a similar patch. + +--- + +[[done]] -- apparently this was a problem due to a distribution's +customisation to perl, or something. Seems to late now to track down what, +unfortunatly. And ikiwiki's Makefile no longer uses the "-libdir" switch +that seemed to trigger the bug. --[[Joey]] diff --git a/doc/bugs/404_plugin_and_lighttpd.mdwn b/doc/bugs/404_plugin_and_lighttpd.mdwn new file mode 100644 index 000000000..e478d98c3 --- /dev/null +++ b/doc/bugs/404_plugin_and_lighttpd.mdwn @@ -0,0 +1,35 @@ +Lighttpd apparently sets REDIRECT_STATUS=200 for the server.error-handler-404 page. This breaks the [[plugins/404]] plugin which checks this variable for 404 before processing the URI. It also doesn't seem to set REDIRECT_URL. + +I was able to fix my server to check the REQUEST_URI for ikiwiki.cgi and to continue processing if it was not found, passing $ENV{SEVER_NAME} . $ENV{REQUEST_URI} as the first parameter to cgi_page_from_404. However, my perl is terrible and I just made it work rather than figuring out exactly what to do to get it to work on both lighttpd and apache. + +This is with lighttpd 1.4.19 on Debian. + +> /cgi-bin/ikiwiki.cgi?do=goto also provides redirection in the same way, +> if that's any help? You might need to set the lighttpd 404 handler to +> that, then compose REDIRECT_URL from other variables if necessary. +> +> I originally wrote the plugin for Apache; [[weakish]] contributed the +> lighttpd docs and might know more about how to make it work there. +> --[[smcv]] + +>> As I said, I got it working for me, but somebody who knows perl should probably look at it with the aim of making it work for everyone. +>> I considered having lighttpd construct a proper url for the 404 redirect itself, but I don't know if it can do something like that or not. +>> For what it's worth, here's the change I made to the module: + + sub cgi ($) { + my $cgi=shift; + if ($ENV{REQUEST_URI} !~ /ikiwiki\.cgi/) { + my $page = cgi_page_from_404( + Encode::decode_utf8($ENV{SERVER_NAME} . 
$ENV{REQUEST_URI}), + $config{url}, $config{usedirs}); + IkiWiki::Plugin::goto::cgi_goto($cgi, $page); + } + + # if (exists $ENV{REDIRECT_STATUS} && + # $ENV{REDIRECT_STATUS} eq '404') { + # my $page = cgi_page_from_404( + # Encode::decode_utf8($ENV{REDIRECT_URL}), + # $config{url}, $config{usedirs}); + # IkiWiki::Plugin::goto::cgi_goto($cgi, $page); + # } + } diff --git a/doc/bugs/Another_UTF-8_problem.mdwn b/doc/bugs/Another_UTF-8_problem.mdwn index 031576f00..d67ed2fa0 100644 --- a/doc/bugs/Another_UTF-8_problem.mdwn +++ b/doc/bugs/Another_UTF-8_problem.mdwn @@ -11,3 +11,6 @@ with my pretty standard Ubuntu gutsy Firefox installation? --[[tschwinge]] > removed that line to fix it. --[[Joey]] [[!tag done]] + +Now we test it for Cyrillic and Western letters: +Протестируем кириллицу и ещё «_другие_» буквы: grüne Öl & hôtel — 3² × 2° --Shoorick diff --git a/doc/bugs/Building_a_sidebar_does_not_regenerate_the_subpages.mdwn b/doc/bugs/Building_a_sidebar_does_not_regenerate_the_subpages.mdwn index 596719a8b..419292930 100644 --- a/doc/bugs/Building_a_sidebar_does_not_regenerate_the_subpages.mdwn +++ b/doc/bugs/Building_a_sidebar_does_not_regenerate_the_subpages.mdwn @@ -6,4 +6,3 @@ If sandbox/page.mdwn has been generated and sandbox/sidebar.mdwn is created, the # adding a new sidebar page. So adding such a page # currently requires a wiki rebuild. add_depends($page, $sidebar_page); - diff --git a/doc/bugs/Cannot_inline_pages_with_apostrophes_in_title.mdwn b/doc/bugs/Cannot_inline_pages_with_apostrophes_in_title.mdwn index 7daf52f2a..3e1fe823e 100644 --- a/doc/bugs/Cannot_inline_pages_with_apostrophes_in_title.mdwn +++ b/doc/bugs/Cannot_inline_pages_with_apostrophes_in_title.mdwn @@ -3,3 +3,5 @@ page produces nothing. It looks like the inline plugin is failing to do the translation from apostrophe to `_39_` that other parts of the system do, so although one can make wikilinks to such pages and have them detected as existing (for instance, by the conditional plugin), inline looks in the wrong place and doesn't see the page. > I can't reproduce that (btw, an apostrophe would be `__39__`) --[[Joey]] + +[[done]] diff --git a/doc/bugs/Comments_dissapeared.mdwn b/doc/bugs/Comments_dissapeared.mdwn new file mode 100644 index 000000000..787f18c98 --- /dev/null +++ b/doc/bugs/Comments_dissapeared.mdwn @@ -0,0 +1,69 @@ +Although I have comments enabled and I have been using them successfully for ages now, I've come to notice that they have stopped working in the last week or two. + +I am running version 3.20100312 with the following configuration: + +<http://static.natalian.org/2010-03-27/natalian.txt> + +In my (HTML5 modified page.tmpl) it doesn't seem to enter the "TMPL_IF COMMENTS" block anymore. I tried the stock page.tmpl and they didn't seem to work either, so the variable name hasn't changed has it? + +Any other ideas? With thanks, + + comments_pagespec => 'archives/* and !*/Discussion', + +> Your setup file only allows comments to pages under archives. That +> seems unlikely to be right, so I guess it is causing your problem. +> --[[Joey]] + +That's the only place where I want comments. <http://natalian.org/archives/> +Has the pagespec changed? Is it `archives/*/*` or something like that? + +It worked just fine with this configuration. I swear I have not modified it. :) -- [[Kai Hendry]] + +> No changes that I can think of. 'archives/*' will match *all* pages under +> archives. Anyway, I can see in your site's rss feed that comments are +> enabled for posts, since they have comments tags there. 
And +> in fact I see comments on eg +> <http://natalian.org/archives/2010/03/25/BBC_News_complaints/>. +> +> So I suspect you have simply not rebuilt your wiki after making some +> change that fixed the comments, and so only newer pages are getting them. +> --[[Joey]] + +I have tried rebuilding on my squeeze system and still comments don't appear. Any clues how to debug this? +<http://natalian.org/comments/> + +I was worried is was due to a time skew problem I was experiencing on my VPS in the last month, though the time is right now and still comments do not appear on blog posts like <http://natalian.org/archives/2010/03/25/BBC_News_complaints/> + +# Debugging templates + +`sudo apt-get install libhtml-template-compiled-perl` + + hendry@webconverger templates$ cat test-template.perl + #!/usr/bin/perl + use HTML::Template::Compiled; + local $HTML::Template::Compiled::DEBUG = 1; + my $htc = HTML::Template::Compiled->new( + filename => "$ARGV[0]", + ); + eval { + print $htc->output; + }; + if ($@) { + # reports as text + my $msg = $htc->debug_code; + # reports as a html table + my $msg_html = $htc->debug_code('html'); + } + hendry@webconverger templates$ ./test-template.perl page.tmpl + Missing closing tag for 'IF' atend of page.tmpl line 159 + + +I think the problem was before that it was `<TMPL_IF COMMENTS>` and now it is `<TMPL_IF NAME="COMMENTS">` ? + + + +# Solved + +A merge with the templates in master with my [html5](http://git.webconverger.org/?p=ikiwiki;a=shortlog;h=refs/heads/html5) branch looks like it has solved the problem. Also see [[bugs/html5_support]]. + +[[bugs/done]] diff --git a/doc/bugs/Error:_Your_login_session_has_expired._.mdwn b/doc/bugs/Error:_Your_login_session_has_expired._.mdwn index 046d6e10d..b993cd8e7 100644 --- a/doc/bugs/Error:_Your_login_session_has_expired._.mdwn +++ b/doc/bugs/Error:_Your_login_session_has_expired._.mdwn @@ -41,4 +41,6 @@ Whilst trying to edit http://hugh.vm.bytemark.co.uk/ikiwiki.cgi via OpenID. Any Thanks for you excellent analysis. The bug was due to old pre-3.0 **templates** laying about. After deleting them, ikiwiki defaults to its own templates. Clever. :-) +Great, this saved me big time! It is a google 1st hit. I had the same with accidentally using old templates. Thanks! --[[cstamas]] + [[bugs/done]] diff --git a/doc/bugs/Exception:_Unknown_function___96__this__39___.mdwn b/doc/bugs/Exception:_Unknown_function___96__this__39___.mdwn new file mode 100644 index 000000000..189ba740f --- /dev/null +++ b/doc/bugs/Exception:_Unknown_function___96__this__39___.mdwn @@ -0,0 +1,70 @@ +I'm very excited to try out ikiwiki, since it should fit my purposes extremely well, but I'm having trouble with the search plugin. I'm pretty sure that right after I installed ikiwiki and needed dependencies, the search plugin was working fine. However, now when I try to use search, I get "Exception: Unknown function `this'" error on a blank page. I'm not sure how I should go about debugging this issue - my server's (I use Lighttpd 1.4.22) error log has no mention of the exception and there's nothing in /var/log/syslog either. + +What might be causing this exception and how I might go about debugging exceptions? + +> Appears to be coming from your xapian omega cgi binary. If you +> run `strings /usr/lib/cgi-bin/omega/omega` you can see it has +> "Exception: " in it, and I have found some similar (but not identical) +> error messages from xapian in a web search. 
+> +> I don´t know what to suggest, other than upgrade/downgrade/reinstall +> xapian-omega, and contacting the xapian developers for debugging. +> You could try rebuilding your wiki in case it is somehow +> caused by a problem with the xapian database. Failing everything, you +> could switch to [[google_search_plugin|plugins/google]]. --[[Joey]] + +>> Thanks, Joey. With your help I was able to figure out what was wrong. It's a fun little bug (or feature): the title of my wiki had string `$this` in title and that's what was causing the omega binary to choke. My wiki's title was inserted without escaping into the query template used by omega. Omega treated `$this` in the title as a function name and threw an exception because no such function was defined. To avoid this behavior, I used an html entity in the title, so `$this` became `$this`. I don't think that the wiki title should be inserted into the template without escaping - it can produce an error that's not trivial to debug. If users want to modify the html in the title, they should be editing respective templates, not typing html in the wiki title input. What do you think? --[[dkobozev]] + +>>> Sounds like a bug in omega, and one that probably would affect other +>>> users of omega too. Ikiwiki could work around it by pre-escaping +>>> data before passing it to xapian. I have not quite managed to reproduce it though; +>>> tried setting a page title to '$this' and 'foo $this'. +>>> That's with version 1.0.18 of omega. +>>> --[[Joey]] + +>>>> I tried it with both omega 1.0.13 and omega 1.0.18 and the issue is present in both. If I view the contents of {$srcdir}/.ikiwiki/xapian/templates/query, I can see that the wiki title is inserted verbatim and there are calls to `$setmap`, `$set` and `$def` etc in the template. --[[dkobozev]] + +>>>>> I don't see how that's relevant. It would help if you showed me +>>>>> exactly something that could be inserted into a page to cause the +>>>>> problem. --[[Joey]] + +>>>>>> Correct me if I'm wrong: ikiwiki generates an Omega template from its own templates, such as searchquery.tmpl and puts it into {$srcdir}/.ikiwiki/xapian/templates/query. Omega has its own template syntax, where function names are prefixed with dollar signs (`$`). So, when I call my wiki `$foobar`, ikiwiki generates an Omega template that looks like this snippet: + + <div id="container"> + <div class="pageheader"> + <div class="header"> + <span> + <a href="http://example.com">$foobar</ a>/search + </span> + </div> + </div> <!-- .pageheader --> + + <div id="content"> + $setmap{prefix,title,S} + $setmap{prefix,link,XLINK} + $set{thousand,$.}$set{decimal,.}$setmap{BN,,Any Country,uk,England,fr,France} + ${ + $def{PREV, + $if{$ne{$topdoc,0},<INPUT TYPE=image NAME="<" ALT="<" + SRC="/images/xapian-omega/prev.png" BORDER=0 HEIGHT=30 WIDTH=30>, + <IMG ALT="" SRC="/images/xapian-omega/prevoff.png" HEIGHT=30 WIDTH=30>} + +>>>>>> So `$foobar` clashes with Omega's template tags. Does this help? + +>>>>>>> Ahh. I had somehow gotten it into my head that you were talking +>>>>>>> about the title of a single page, not of the whole wiki. But +>>>>>>> you were clear all along it was the wiki title. Sorry for +>>>>>>> misunderstanding. I've put in a complete fix for this problem. +>>>>>>> if this was in [[bugs]], I'd close it. :) --[[Joey]] + +>>>>>>>> Rather than escaping `$` as an HTML entity, it would be more natural +>>>>>>>> to escape it as `$$` (since you are escaping it for Omega, not for +>>>>>>>> the web browser. 
+>>>>>>>> +>>>>>>>> Also if ikiwiki can put arbitrary text inside the parameters of an +>>>>>>>> OmegaScript command, you should also escape `{`, `}` and `,` as +>>>>>>>> `$(`, `$)` and `$.`. It's only necessary to do so inside the +>>>>>>>> parameters of a command, but it will work and be easier to escape +>>>>>>>> them in any substituted text. --OllyBetts + +[[done]] diff --git a/doc/bugs/External_links_with_Creole.mdwn b/doc/bugs/External_links_with_Creole.mdwn new file mode 100644 index 000000000..3d800b04e --- /dev/null +++ b/doc/bugs/External_links_with_Creole.mdwn @@ -0,0 +1,3 @@ +When using Creole for markup, creating an external link appears to be impossible. Neither \[[Outside URL|http://example.com]] nor <<http://example.com>> nor \[Outside URL]\(http://example.com) work. The first gets rendered as a broken WikiLink, the second get eaten and the last is not parsed in anyway so you end up with that exact text in your page. + +I'd have made this as a Creole page as a practical demonstration, but that doesn't seem possible here. Here's a page with an example: <https://www.icanttype.org//demo/CreoleExternalLinks> diff --git a/doc/bugs/Git:_changed_behavior_w.r.t._timestamps.mdwn b/doc/bugs/Git:_changed_behavior_w.r.t._timestamps.mdwn new file mode 100644 index 000000000..164e62075 --- /dev/null +++ b/doc/bugs/Git:_changed_behavior_w.r.t._timestamps.mdwn @@ -0,0 +1,214 @@ +After some months, I just updated my local ikiwiki sources, and rebuilt +the Hurd web pages, <http://git.savannah.gnu.org/cgit/hurd/web.git/>. + +I was confused, having switched to the new automatic (thanks!) --gettime +mechanism, why on some pages the timestamps had changed compared to my +previous use of --getctime and setting files' mtimes (using a script) +according to the last Git commit. For example: + +community/weblogs/ArneBab/2008-08-02-gnu_hurd_and_x.html + +old: + + Last edited <span class="date">2008-09-11 18:11:53 UTC</span> + <!-- Created <span class="date">2008-09-11 17:47:08 UTC</span> --> + +new: + + Last edited <span class="date">2008-09-11 18:12:22 UTC</span> + <!-- Created <span class="date">2008-09-11 17:47:50 UTC</span> --> + + +I had a look at what git.pm is doing, and began to manually replay / +investigate: + + $ git log --pretty=fuller --name-only --relative -- community/weblogs/ArneBab/2008-08-02-gnu_hurd_and_x.mdwn + commit 8f1b97bfe45b2f173e3a7d55dee226a9e289a695 + Author: arnebab <arne_bab@web.de> + AuthorDate: Thu Sep 11 20:11:53 2008 +0200 + Commit: arnebab <arne_bab@web.de> + CommitDate: Thu Sep 11 20:11:53 2008 +0200 + + Added a link to the X.org guide in this wiki. + + community/weblogs/ArneBab/2008-08-02-gnu_hurd_and_x.mdwn + + commit 3ef8b7d80d80572c436c4c60c71879bc74409816 + Author: arnebab <arne_bab@web.de> + AuthorDate: Thu Sep 11 19:47:08 2008 +0200 + Commit: arnebab <arne_bab@web.de> + CommitDate: Thu Sep 11 19:47:08 2008 +0200 + + Minor update on the enty trying to get X working -> 'watch this place for updates' + + community/weblogs/ArneBab/2008-08-02-gnu_hurd_and_x.mdwn + +OK, these are my old dates. 
+ + $ git log --pretty=format:%ci --name-only --relative -- community/weblogs/ArneBab/2008-08-02-gnu_hurd_and_x.mdwn + 2008-09-11 20:11:53 +0200 + community/weblogs/ArneBab/2008-08-02-gnu_hurd_and_x.mdwn + + 2008-09-11 19:47:08 +0200 + community/weblogs/ArneBab/2008-08-02-gnu_hurd_and_x.mdwn + + $ git log --pretty=format:%ct --name-only --relative -- community/weblogs/ArneBab/2008-08-02-gnu_hurd_and_x.mdwn + 1221156713 + community/weblogs/ArneBab/2008-08-02-gnu_hurd_and_x.mdwn + + 1221155228 + community/weblogs/ArneBab/2008-08-02-gnu_hurd_and_x.mdwn + + $ date -d @1221156713 + Thu Sep 11 18:11:53 UTC 2008 + $ date -d @1221155228 + Thu Sep 11 17:47:08 UTC 2008 + +That's all consistent. + + +But: + + $ perl -le 'use Storable; my $index=Storable::retrieve("indexdb"); use Data::Dumper; print Dumper $index' + [...] + 'community/weblogs/ArneBab/2008-08-02-gnu_hurd_and_x.mdwn' => { + 'ctime' => '1221155270', + 'dest' => [ + 'community/weblogs/ArneBab/2008-08-02-gnu_hurd_and_x.html' + ], + 'typedlinks' => { + 'tag' => {} + }, + 'mtime' => 1221156742, + 'depends_simple' => { + 'sidebar' => 1 + }, + 'links' => [ + 'community/weblogs/ArneBab/2008-08-02-gnu_hurd_and_x/discussion', + 'Hurd/DebianXorg' + ], + 'state' => { + [...] + + $ date -d @1221156742 + Thu Sep 11 18:12:22 UTC 2008 + $ date -d @1221155270 + Thu Sep 11 17:47:50 UTC 2008 + +That's different, and it matches what the new ikiwiki writes into the +HTML file. + + +Back to Git again, this time without specifying the file: + + $ git log --pretty=format:%ct --name-only --relative + [...] + 1221255713 + 1221255655 + unsorted/PortingIssues.mdwn + + 1221156742 [Thu Sep 11 18:12:22 UTC 2008] + 1221156713 [Thu Sep 11 18:11:53 UTC 2008] + community/weblogs/ArneBab/2008-08-02-gnu_hurd_and_x.mdwn + + 1221156267 + 1221156235 + index.mdwn + + 1221156122 + 1221156091 + index.mdwn + + 1221155942 + 1221155910 + index.mdwn + + 1221155270 [Thu Sep 11 17:47:50 UTC 2008] + 1221155228 [Thu Sep 11 17:47:08 UTC 2008] + community/weblogs/ArneBab/2008-08-02-gnu_hurd_and_x.mdwn + + 1221154986 + community/gsoc.mdwn + community/gsoc/project_ideas.mdwn + + 1221147244 + whatsnew.html + [...] + +Aha! + +... and some more detail: + + $ git log --pretty=fuller --name-only --relative + [...] + commit e4e89e1683012c879012522105a3471a00714613 + Author: Samuel Thibault <samuel.thibault@ens-lyon.org> + AuthorDate: Fri Sep 12 23:40:55 2008 +0200 + Commit: Samuel Thibault <samuel.thibault@ens-lyon.org> + CommitDate: Fri Sep 12 23:40:55 2008 +0200 + + MSG_NOSIGNAL and IPV6_PKTINFO got fixed + + unsorted/PortingIssues.mdwn + + commit c389fae98dff86527be62f895ff7272e4ab1932c + Merge: 0339e3e 8f1b97b + Author: GNU Hurd wiki engine <web-hurd@gnu.org> + AuthorDate: Thu Sep 11 18:12:22 2008 +0000 + Commit: GNU Hurd wiki engine <web-hurd@gnu.org> + CommitDate: Thu Sep 11 18:12:22 2008 +0000 + + Merge branch 'master' of wiki@192.168.10.50:wiki + + commit 8f1b97bfe45b2f173e3a7d55dee226a9e289a695 + Author: arnebab <arne_bab@web.de> + AuthorDate: Thu Sep 11 20:11:53 2008 +0200 + Commit: arnebab <arne_bab@web.de> + CommitDate: Thu Sep 11 20:11:53 2008 +0200 + + Added a link to the X.org guide in this wiki. + + community/weblogs/ArneBab/2008-08-02-gnu_hurd_and_x.mdwn + [...] + +So, merges are involved there. + +What (the new) ikiwiki code does, is use the timestamp when the merge was +done instead of the timestamp when the commit was done. Is this +intentional? Otherwise I could supply a patch. 
+ +--[[tschwinge]] + +> In order to be nice and fast, the git backend runs git log once +> and records data for all files. Rather than looking at the log for a +> given file. So amoung other things, it does not follow renames. +> +> AFAICS, git log only shows merges modifying files if it was a conflicted +> merge. As the file is then actually modified to resolve the merge +> I think it makes sense to count the merge as the last modification in +> that case. --[[Joey]] + +>> That'd be reasonable, but `git log` will also show merges that are not +>> conflicting (as in my case). + +>>> Actually when displaying a merge, `git log --stat` only lists files that +>>> were actually modified in a new way as part of the merge resolution. +>>> Ie, if the merge resolution only joins together some of the parent +>>> hunks, the file is not listed as having been modified. +>>> +>>> So, no, ikiwiki's use of git log will not show files modified in +>>> non-conflicting merges. +>>> --[[Joey]] + +>> Yet, I'm not totally disagreeing with your choice. With this `git +>> log` invocation, you're not able to tell from its output whether a +>> conflict was resolved or not. + +>> Also, it's a bit like the *should we use the **author timestamp** or +>> **commit timestamp*** discussion. Your code will always use the +>> latest timestamp. + +>> I guess I'll get my head wrapped around that, and it's fine, so this is +>> [[done]]. + +>> --[[tschwinge]] diff --git a/doc/bugs/HTML_for_parentlinks_makes_theming_hard.mdwn b/doc/bugs/HTML_for_parentlinks_makes_theming_hard.mdwn new file mode 100644 index 000000000..0cbef403d --- /dev/null +++ b/doc/bugs/HTML_for_parentlinks_makes_theming_hard.mdwn @@ -0,0 +1,45 @@ +I'm trying to make a pretty theme for ikiwiki and I'm making progress (or at least I think I am :-). However I've noticed an issue when it comes to theming. On the front page the wiki name is put inside the "title" span and on all the other pages, it's put in the "parentlinks" span. See here: + +From [my dev home page](http://adam.shand.net/iki-dev/): + +<code> +<div class="header"> +<span> +<span class="parentlinks"> + +</span> +<span class="title"> +adam.shand.net/iki-dev +</span> +</span><!--.header--> + +</div> +</code> + +From a sub-page of [my dev home page](http://adam.shand.net/iki-dev/recipes/navajo_fry_bread/): + +<code> +<div class="header"> +<span> +<span class="parentlinks"> + +<a href="../">adam.shand.net/iki-dev</a>/ + +</span> +<span class="title"> +recipes +</span> +</span><!--.header--> + +</div> +</code> + +I understand the logic behind doing this (on the front page it is the title as well as the name of the wiki) however if you want to do something different with the title of a page vs. the name of the wiki it makes things pretty tricky. + +I'll just modify the templates for my own site but I thought I'd report it as a bug in the hopes that it will be useful to others. + +Cheers, +Adam. + +---- +> I just noticed that it's also different on the comments, preferences and edit pages. I'll come up with a diff and see what you guys think. -- Adam. diff --git a/doc/bugs/Highlight_extension_uses_hard_coded_paths.mdwn b/doc/bugs/Highlight_extension_uses_hard_coded_paths.mdwn new file mode 100644 index 000000000..1b9cb2e2d --- /dev/null +++ b/doc/bugs/Highlight_extension_uses_hard_coded_paths.mdwn @@ -0,0 +1 @@ +The [[plugins/highlight]] plugin hard codes some paths up the top of the plugin. 
This means that you need to edit the ikiwiki source if you have highlight installed in a non-standard location (e.g. if you have done a user-level install of the highlight package). diff --git a/doc/bugs/Inline_doesn__39__t_wikilink_to_pages.mdwn b/doc/bugs/Inline_doesn__39__t_wikilink_to_pages.mdwn index 32f9f1245..13b80b436 100644 --- a/doc/bugs/Inline_doesn__39__t_wikilink_to_pages.mdwn +++ b/doc/bugs/Inline_doesn__39__t_wikilink_to_pages.mdwn @@ -2,10 +2,6 @@ It seems that the [[ikiwiki/directive/inline]] directive doesn't generate wikili \[[!inline pages="bugs/* and !*/discussion and backlink(bugs)" feeds=no postform=no archive=yes show="10"]] -But here it is: - -[[!inline pages="bugs/* and !*/discussion and backlink(bugs)" feeds=no postform=no archive=yes show="10"]] - and note that it only included the 'normal' wikilinks (and also note that this page is not marked done even though the done page is inlined). One might also wonder if inline would make this page link to any internal links on those inlined pages too, but I think that would be overkill. diff --git a/doc/bugs/Links_to_missing_pages_should_always_be_styled.mdwn b/doc/bugs/Links_to_missing_pages_should_always_be_styled.mdwn new file mode 100644 index 000000000..73213209a --- /dev/null +++ b/doc/bugs/Links_to_missing_pages_should_always_be_styled.mdwn @@ -0,0 +1,5 @@ +When the CGI URL is not defined, links to missing pages appear as plain, unstyled text. I think the 'createlink' span should always wrap this text, even when the actual question mark linking to the CGI for the create action is missing. This ensures consistent styling regardless of whether the CGI is available or not (and is thus useful for example when the same wiki has clones with the CGI link and clones without). + +A proposed patch is available [on my ikiwiki clone](http://git.oblomov.eu/ikiwiki/patch/290d1b498f00f63e6d41218ddb76d87e68ed5081) + +[[!tag patch cgi done]] diff --git a/doc/bugs/Problems_with_graphviz.pm_plug-in.mdwn b/doc/bugs/Problems_with_graphviz.pm_plug-in.mdwn index c9f698158..bc80125ad 100644 --- a/doc/bugs/Problems_with_graphviz.pm_plug-in.mdwn +++ b/doc/bugs/Problems_with_graphviz.pm_plug-in.mdwn @@ -9,31 +9,12 @@ The graphviz.pm plug-in currently attempts to read PNG data in UTF-8 mode, which It also generates image URLs relative to the page being rendered, which means the URLs wont work when previewing a graph from the CGI script. +(preview bug split to [[Problems_with_graphviz.pm_plug-in_previews]]) + >> Here is an updated patch againt ikiwiki-2.5: >>> [[Applied|done]], thanks. --[[Joey]] - --- IkiWiki/Plugin/graphviz.pm.orig 2007-07-27 11:35:05.000000000 +0200 - +++ IkiWiki/Plugin/graphviz.pm 2007-07-27 11:36:02.000000000 +0200 - @@ -69,7 +69,12 @@ sub render_graph (\%) { - } - } - - - return "<img src=\"".urlto($dest, $params{page})."\" />\n"; - + if ($params{preview}) { - + return "<img src=\"".urlto($dest, "")."\" />\n"; - + } - + else { - + return "<img src=\"".urlto($dest, $params{page})."\" />\n"; - + } - } - - sub graph (@) { - - ->> --[[HenrikBrixAndersen]] - - The patch below fixes these two issues. 
--- graphviz.pm.orig Thu Jun 7 15:45:16 2007 diff --git a/doc/bugs/Problems_with_graphviz.pm_plug-in_previews.mdwn b/doc/bugs/Problems_with_graphviz.pm_plug-in_previews.mdwn new file mode 100644 index 000000000..c77bbeeaf --- /dev/null +++ b/doc/bugs/Problems_with_graphviz.pm_plug-in_previews.mdwn @@ -0,0 +1,54 @@ +(split from [[Problems_with_graphviz.pm_plug-in]]) + +[graphviz] generates image URLs relative to the page being rendered, which means the URLs wont work when previewing a graph from the CGI script. + +>> Here is an updated patch againt ikiwiki-2.5: + +>>> Applied, thanks. --[[Joey]] + + --- IkiWiki/Plugin/graphviz.pm.orig 2007-07-27 11:35:05.000000000 +0200 + +++ IkiWiki/Plugin/graphviz.pm 2007-07-27 11:36:02.000000000 +0200 + @@ -69,7 +69,12 @@ sub render_graph (\%) { + } + } + + - return "<img src=\"".urlto($dest, $params{page})."\" />\n"; + + if ($params{preview}) { + + return "<img src=\"".urlto($dest, "")."\" />\n"; + + } + + else { + + return "<img src=\"".urlto($dest, $params{page})."\" />\n"; + + } + } + + sub graph (@) { + + +>> --[[HenrikBrixAndersen]] + +>>> Despite this patch I am still experiencing the problem. Normal page source for a graph contains: + + <div id="content"> + <p><img src="./graph-c9fd2a197322feb417bdedbca5e99f5aa65b3f06.png" /></p> + + </div> + +>>> preview contains + + <div id="preview"> + <p><img src="./demo/diagrams/graph-c9fd2a197322feb417bdedbca5e99f5aa65b3f06.png" /></p> + + </div> + +>>> I don't quite understand why, this makes sense from the CGI path (in my +>>> case from the root of the site). The browsers appear to be trying to fetch +>>> `/demo/diagrams/demo/diagrams/graph-c9fd2a197322feb417bdedbca5e99f5aa65b3f06.png` +>>> (i.e., prepending the required relpath twice). -- [[Jon]] + +>>>> Yeah, that patch may have been right once, but it's wrong now; +>>>> preview mode uses `<base>` to make urls work the same as they would +>>>> when viewing the html page. +>>>> +>>>> Perhaps this was not noticed for a while while because it only +>>>> shows up if previewing an *unchanged* graph on a page that has already +>>>> been built before. Fixed now. [[done]] --[[Joey]] diff --git a/doc/bugs/SSI_include_stripped_from_mdwn.mdwn b/doc/bugs/SSI_include_stripped_from_mdwn.mdwn index 5519e45c6..270da86d3 100644 --- a/doc/bugs/SSI_include_stripped_from_mdwn.mdwn +++ b/doc/bugs/SSI_include_stripped_from_mdwn.mdwn @@ -10,7 +10,7 @@ If I have a <--#include virtual="foo" --> in some file, it gets stripped, > Anyway, it makes sense for the htmlscrubber to strip server-side > includes because otherwise your wiki could be attacked > by them being added to it. If you want to use both the htmlscrubber and -> SSI together, I'd suggest you modify the [[wikitemplates]] +> SSI together, I'd suggest you modify the [[templates]] > and put the SSI on there. > > Ie, `page.tmpl` has a diff --git a/doc/bugs/Search_summary_includes_text_from_navigational_elements.mdwn b/doc/bugs/Search_summary_includes_text_from_navigational_elements.mdwn new file mode 100644 index 000000000..b774c4531 --- /dev/null +++ b/doc/bugs/Search_summary_includes_text_from_navigational_elements.mdwn @@ -0,0 +1,22 @@ +Each listed result for a search will show some example text from the beginning of the linked page. It strips out HTML elements, but if there's any navigational text items, they will stay. + +For example, each search result on ikiwiki.info shows "(title) ikiwiki/ (title) Edit RecentChanges History Preferences Discussion" at the start of its results. 
+ +A way to name some CSS ids that should be removed in search results within the ikiwiki setup file would work. Here's something similar that a friend proposed: + +http://leaf.dragonflybsd.org/mailarchive/users/2009-11/msg00077.html + +(bin attachment on that page is actually a .diff.) + +> So I was looking at this and I relized that while the search plugin used +> to use the format hook, and so there was no way to avoid it seeing all +> the gunk around the page body, it was changed a while ago for different +> reasons to use its own hook, postscan. So there's really no reason not +> to move postscan so it runs before said gunk is added to the page. +> (Aside from a small risk of breaking other third-party plugins that +> somehow use postscan.) +> +> I've implemented that in git, and it drops the navigation elements nicely. +> It's perhaps less general than allowing specific divs to be skipped from +> search, but it seems good enough. Please thank the dragonfly guys for their +> work on this. [[done]] --[[Joey]] diff --git a/doc/bugs/Tab_delimited_tables_don__39__t_work.mdwn b/doc/bugs/Tab_delimited_tables_don__39__t_work.mdwn new file mode 100644 index 000000000..39d57a4fe --- /dev/null +++ b/doc/bugs/Tab_delimited_tables_don__39__t_work.mdwn @@ -0,0 +1,22 @@ +Table directive should support tab-delimited data, especially important since this is the format you will get if copy/pasting from an HTML table or spreadsheet (Gnumeric, OO Calc, Excel). Test case which fails: + +[[!table format=dsv delimiter="\t" data=""" +1 2 +2 4 +"""]] + +> They do work, but C-style backslash escapes aren't recognised, +> so the syntax `delimiter="\t"` (as in your test case) looks +> for the literal string `\t`. Replacing `\t` with a literal +> tab character makes it work - here's a test (I changed the data +> to make the table layout more obvious): +> +> [[!table format=dsv delimiter=" " data=""" +left 2 +2 right +alpha beta +"""]] +> +> So, I think this can be considered [[not_a_bug|done]]? --[[smcv]] + +>> I've clarified the documentation. --[[smcv]] diff --git a/doc/bugs/__34__First_post__34___deletion_does_not_refresh_front_page.mdwn b/doc/bugs/__34__First_post__34___deletion_does_not_refresh_front_page.mdwn new file mode 100644 index 000000000..2367335a7 --- /dev/null +++ b/doc/bugs/__34__First_post__34___deletion_does_not_refresh_front_page.mdwn @@ -0,0 +1,6 @@ +When I created an ikiwiki site (on Branchable) using the blog template, it added a "First post", which was fine. +Deleting that post removed it, but the front page did not get the re-generated, so it was still there. +--[[liw]] + +> This is a bug involving the `page()` pagespec. Deleted +> pages matching this pagespec are not noticed. --[[Joey]] [[done]] diff --git a/doc/bugs/absolute_sizes_in_default_CSS.mdwn b/doc/bugs/absolute_sizes_in_default_CSS.mdwn new file mode 100644 index 000000000..bb3c0c7a0 --- /dev/null +++ b/doc/bugs/absolute_sizes_in_default_CSS.mdwn @@ -0,0 +1,39 @@ +While toying around with some font sizes on my persona ikiwiki I discovered that some font sizes in the default CSS are fixed rather than relative. 
Here's a git patch that replaces them with relative font sizes (assuming the default 12pt/16px base font size recommended by the W3C): + +[[done]] --[[Joey]] + +<pre> +From 01c14db255bbb727d8dd1e72c3f6f2f25f07e757 Mon Sep 17 00:00:00 2001 +From: Giuseppe Bilotta <giuseppe.bilotta@gmail.com> +Date: Tue, 17 Aug 2010 00:48:24 +0200 +Subject: [PATCH] Use relative font-sizes + +--- + doc/style.css | 4 ++-- + 1 files changed, 2 insertions(+), 2 deletions(-) + +diff --git a/doc/style.css b/doc/style.css +index 66d962b..fa4b2a3 100644 +--- a/doc/style.css ++++ b/doc/style.css +@@ -14,7 +14,7 @@ nav { + + .header { + margin: 0; +- font-size: 22px; ++ font-size: 140%; + font-weight: bold; + line-height: 1em; + display: block; +@@ -22,7 +22,7 @@ nav { + + .inlineheader .author { + margin: 0; +- font-size: 18px; ++ font-size: 112%; + font-weight: bold; + display: block; + } +-- +1.7.2.rc0.231.gc73d +</pre> diff --git a/doc/bugs/align_doesn__39__t_always_work_with_img_plugin_.mdwn b/doc/bugs/align_doesn__39__t_always_work_with_img_plugin_.mdwn new file mode 100644 index 000000000..e986bdc82 --- /dev/null +++ b/doc/bugs/align_doesn__39__t_always_work_with_img_plugin_.mdwn @@ -0,0 +1,7 @@ +Using the img plugin to inline an image, the "align" parameter doesn't work as expected if you also include a "caption". + +As best as I can tell this is because the "caption" parameter works by wrapping the image inside a table which means that the "align" parameter is aligning within the table cell rather then the page itself. + +-- AdamShand + +> I agree, this is annoying... and [[done]]! --[[Joey]] diff --git a/doc/bugs/anonok_vs._httpauth.mdwn b/doc/bugs/anonok_vs._httpauth.mdwn new file mode 100644 index 000000000..bff37e18b --- /dev/null +++ b/doc/bugs/anonok_vs._httpauth.mdwn @@ -0,0 +1,118 @@ +I've got a wiki where editing requires [[plugins/httpauth]] (with +`cgiauthurl` working nicely). I now want to let the general public +edit Discussion subpages, so I enabled [[plugins/anonok]] and set +`anonok_pagespec` to `'*/Discussion'`, but HTTP auth is still being +required for those. + +(Actually, what I'll really want to do is probably [[plugins/lockedit]] +and a whitelist of OpenIDs in `locked_pages`...) + +--[[schmonz]] + +> The only way I can see to support this combination is for httpauth with +> cgiauthurl to work more like other actual login types. Which would mean +> that on editing a page that needs authentication, ikiwiki would redirect +> them to the Signin page, which would then have a link they could follow +> to bounce through the cgiauthurl and actually sign in. This would be +> significantly different than the regular httpauth process, in which the +> user signs in in passing. --[[Joey]] + +>> My primary userbase has grown accustomed to the seamlessness of +>> httpauth with SPNEGO, so I'd rather not reintroduce a seam into +>> their web-editing experience in order to let relatively few outsiders +>> edit relatively few pages. When is the decision made about whether +>> the current page can be edited by the current user (if any)? What +>> if there were a way to require particular auth plugins for particular +>> PageSpecs? --[[schmonz]] + +>>> The decision about whether a user can edit a page is made by plugins +>>> such as signinedit and lockedit, that also use canedit hooks to redirect +>>> the user to a signin page if necessary. 
+>>> +>>> A tweak on my earlier suggestion would be to have httpauth notice when the +>>> Signin page is being built and immediatly redirect to the cgiauthurl +>>> before the page can be shown to the user. This would, though, not play +>>> well with other authentication methods like openid, since the user +>>> would never see the Signin form. --[[Joey]] + +>>>> Would I be able to do what I want with a local plugin that +>>>> abuses canedit (and auth) to reach in and call the appropriate +>>>> plugin's auth method -- e.g., if the page matches */Discussion, +>>>> call `openid:auth()`, else `httpauth:auth()`? --[[schmonz]] + +>>>>> That seems it would be +>>>>> annoying for httpauth users (who were not currently authed), +>>>>> as they would then see the openid signin form when going to edit a +>>>>> Discussion page. +>>>>> --[[Joey]] + +>>>>>> I finally see the problem, I think. When you initially +>>>>>> suggested "a link they could follow to bounce through the +>>>>>> cgiauthurl", presumably this could _be_ the Edit link for +>>>>>> non-Discussion pages, so that the typical case of an httpauth +>>>>>> user editing an editable-only-by-httpauth page doesn't visibly +>>>>>> change. And then the Edit link for Discussion subpages could do +>>>>>> as you suggest, adding one click for the httpauth user, who won't +>>>>>> often need to edit those subpages. --[[schmonz]] + +>> On reflection, I've stopped being bothered by the +>> redirect-to-signin-page approach. (It only needs to happen once per +>> browser session, anyway.) Can we try that? --[[schmonz]] + +Here is an attempt. With this httpauth will only redirect to the +`cgiauth_url` when a page is edited, and it will defer to other plugins +like anonok first. I have not tested this. --[[Joey]] + +<pre> +diff --git a/IkiWiki/Plugin/httpauth.pm b/IkiWiki/Plugin/httpauth.pm +index 127c321..a18f8ca 100644 +--- a/IkiWiki/Plugin/httpauth.pm ++++ b/IkiWiki/Plugin/httpauth.pm +@@ -9,6 +9,8 @@ use IkiWiki 3.00; + sub import { + hook(type => "getsetup", id => "httpauth", call => \&getsetup); + hook(type => "auth", id => "httpauth", call => \&auth); ++ hook(type => "canedit", id => "httpauth", call => \&canedit, ++ last => 1); + } + + sub getsetup () { +@@ -33,9 +35,21 @@ sub auth ($$) { + if (defined $cgi->remote_user()) { + $session->param("name", $cgi->remote_user()); + } +- elsif (defined $config{cgiauthurl}) { +- IkiWiki::redirect($cgi, $config{cgiauthurl}.'?'.$cgi->query_string()); +- exit; ++} ++ ++sub canedit ($$$) { ++ my $page=shift; ++ my $cgi=shift; ++ my $session=shift; ++ ++ if (! defined $cgi->remote_user() && defined $config{cgiauthurl}) { ++ return sub { ++ IkiWiki::redirect($cgi, $config{cgiauthurl}.'?'.$cgi->query_string()); ++ exit; ++ }; ++ } ++ else { ++ return undef; + } + } + +</pre> + +> With `anonok` enabled, this works for anonymous editing of an +> existing Discussion page. auth is still needed to create one. --[[schmonz]] + +>> Refreshed above patch to fix that. --[[Joey]] + +>> Remaining issue: This patch will work with anonok, but not openid or +>> passwordauth, both of which want to display a login page at the same +>> time that httpauth is redirecting to the cgiauthurl. As mentioned above, +>> the only way to deal with that would be to add a link to the signin page +>> that does the httpauth signin. --[[Joey]] + +>>> That's dealt with in final version. 
[[done]] --[[Joey]] diff --git a/doc/bugs/attachment_upload_does_not_work_for_windows_clients.mdwn b/doc/bugs/attachment_upload_does_not_work_for_windows_clients.mdwn new file mode 100644 index 000000000..4e8c7bdcf --- /dev/null +++ b/doc/bugs/attachment_upload_does_not_work_for_windows_clients.mdwn @@ -0,0 +1,34 @@ +It seems as if windows clients (IE) submit filenames with backslash as directory separator. +(no surprise :-). + +But the attachment plugin translates these backslashes to underscore, making the +whole path a filename. + +> As far as I can see, that just means that the file will be saved with +> a filename something like `c:__92__My_Documents__92__somefile`. +> I don't see any "does not work" here. Error message? +> +> Still, I don't mind adding a special case, though obviously not in +> `basename`. [[done]] --[[Joey]] + +>> Well, it's probably something else also, I get **bad attachment filename**. +>> Now, that could really be a bad filename, problem is that it wasn't. I even +>> tried applying the **wiki_file_prune_regexps** one by one to see what was +>> causing it. No problem there. The strange thing is that the error shows up +>> when using firefox on windows too. But the backslash hack fixes at least the +>> incorrect filename from IE (firefox on windows gave me the correct filename. +>> I'll do some more digging... :-) /jh + +This little hack fixed the backslash problem, although I wonder if that +really is the problem? +(Everything works perfectly from linux clients of course. :-) + + sub basename ($) { + my $file=shift; + + $file=~s!.*/+!!; + $file=~s!.*\\+!!; + return $file; + } + +Should probably be `$file=~s!.*[/\\]+!!` :-) diff --git a/doc/bugs/barfs_on_recentchange_entry_for_a_change_removing_an_invalid_pagespec.mdwn b/doc/bugs/barfs_on_recentchange_entry_for_a_change_removing_an_invalid_pagespec.mdwn index 42e6b9e27..c3cbff43e 100644 --- a/doc/bugs/barfs_on_recentchange_entry_for_a_change_removing_an_invalid_pagespec.mdwn +++ b/doc/bugs/barfs_on_recentchange_entry_for_a_change_removing_an_invalid_pagespec.mdwn @@ -39,3 +39,6 @@ a year ago in September 2007. > ikiwiki. (Doesn't quite seem to be version 2.53.x either) Try with a current > version, and see if you can send me a source tree that can reproduce the > problem? --[[Joey]] + +Did not hear back, so calling this [[done]], unless I hear differently. +--[[Joey]] diff --git a/doc/bugs/bestlink_change_update_issue.mdwn b/doc/bugs/bestlink_change_update_issue.mdwn index 5ce4a93d2..c26e40d10 100644 --- a/doc/bugs/bestlink_change_update_issue.mdwn +++ b/doc/bugs/bestlink_change_update_issue.mdwn @@ -1,13 +1,32 @@ * Has bugs updating things if the bestlink of a page changes due to adding/removing a page. For example, if Foo/Bar links to "Baz", which is Foo/Baz, and Foo/Bar/Baz gets added, it will update the links in Foo/Bar - to point to it, but will forget to update the linkbacks in Foo/Baz. + to point to it, but will forget to update the backlinks in Foo/Baz. -* And if Foo/Bar/Baz is then removed, it forgets to update Foo/Bar to link - back to Foo/Baz. + The buggy code is in `refresh()`, when it determines what + links, on what pages, have changed. It only looks at + changed/added/deleted pages when doing this. But when Foo/Bar/Baz + is added, Foo/Bar is not changed -- so the change it its + backlinks is not noticed. -As of 1.33, this is still true. The buggy code is the %linkchanged -calculation in refresh(), which doesn't detect that the link has changed in -this case. 
+ To fix this, it needs to consider, when rebuilding Foo/Bar for the changed + links, what oldlinks Foo/Bar had. If one of the oldlinks linked to + Foo/Baz, and not links to Foo/Bar/Baz, it could then rebuild Foo/Baz. -Still true in 1.43 although the code is much different now.. + Problem is that in order to do that, it needs to be able to tell that + the oldlinks linked to Foo/Baz. Which would mean either calculating + all links before the scan phase, or keeping a copy of the backlinks + from the last build, and using that. The first option would be a lot + of work for this minor issue.. it might be less expensive to just rebuild + *all* pages that Foo/Bar links to. + + Keeping a copy of the backlinks has some merit. It could also be + incrementally updated. + + This old bug still exists as of 031d1bf5046ab77c796477a19967e7c0c512c417. + +* And if Foo/Bar/Baz is then removed, Foo/Bar gets a broken link, + instead of changing back to linking to Foo/Baz. + + This part was finally fixed by commit + f1ddf4bd98821a597d8fa1532092f09d3d9b5483. diff --git a/doc/bugs/bestlink_returns_deleted_pages.mdwn b/doc/bugs/bestlink_returns_deleted_pages.mdwn new file mode 100644 index 000000000..874f18ead --- /dev/null +++ b/doc/bugs/bestlink_returns_deleted_pages.mdwn @@ -0,0 +1,75 @@ +To reproduce: + +1. Add the backlinkbug plugin below to ikiwiki. +2. Create a page named test.mdwn somewhere in the wiki. +3. Refresh ikiwiki in verbose mode. Pages whose bestlink is the test.mwdn page will be printed to the terminal. +4. Delete test.mdwn. +5. Refresh ikiwiki in verbose mode again. The same pages will be printed to the terminal again. +6. Refresh ikiwiki in verbose mode another time. Now no pages will be printed. + +bestlink() checks %links (and %pagecase) to confirm the existance of the page. +However, find_del_files() does not remove the deleted page from %links (and %pagecase). + +Since find_del_files removes the deleted page from %pagesources and %destsources, +won't it make sense for bestlink() to check %pagesources first? --[[harishcm]] + +> This same problem turned out to also be the root of half of ikiwiki's +> second-oldest bug, [[bestlink_change_update_issue]]. +> +> Fixing it is really a bit involved, see commit +> f1ddf4bd98821a597d8fa1532092f09d3d9b5483. The fix I committed fixes +> bestlink to not return deleted pages, but only *after* the needsbuild and +> scan hooks are called. So I was able to fix it for every case except the +> one you gave! Sorry for that. To fix it during beedsbuild and scan, +> a much more involved approach would be needed. AFAICS, no existing plugin +> in ikiwiki uses bestlink in needsbuild or scan though. +> +> If the other half of [[bestlink_change_update_issue]] is fixed, +> maybe by keeping a copy of the old backlinks info, then that fix could be +> applied here too. --[[Joey]] + +>> Cool that was fast! Well at least half the bug is solved :) For now I'll +>> probably try using a workaround if using bestlink within the needsbuild +>> or scan hooks. Maybe by testing if pagemtime equals zero. --[[harishcm]] + +>>> Yeah, and bestlink could also do that. However, it feels nasty to have +>>> it need to look at pagemtime. --[[Joey]] + +---- + + #!/usr/bin/perl + # Plugin to reproduce bestlink returning deleted pages. + # Run with ikiwiki in verbose mode. 
+ + package IkiWiki::Plugin::bestlinkbug; + + use warnings; + use strict; + use IkiWiki 3.00; + + sub import { + hook(type => "getsetup", id => "bestlinkbug", call => \&getsetup); + hook(type => "needsbuild", id => "bestlinkbug", call => \&needsbuild); + } + + sub getsetup () { + return + plugin => { + safe => 1, + rebuild => 0, + }, + } + + sub needsbuild (@) { + my $needsbuild=shift; + + foreach my $page (keys %pagestate) { + my $testpage=bestlink($page, "test") || next; + + debug("$page"); + } + } + + 1 + + diff --git a/doc/bugs/broken_parentlinks.mdwn b/doc/bugs/broken_parentlinks.mdwn index caf1eeb0e..556d89b65 100644 --- a/doc/bugs/broken_parentlinks.mdwn +++ b/doc/bugs/broken_parentlinks.mdwn @@ -10,7 +10,7 @@ a dead link for every subpage. This is a bug, but fixing it is very tricky. Consider what would happen if example.mdwn were created: example/page.html and the rest of example/* -would need to be updated to change the parentlink from a bare work to a +would need to be updated to change the parentlink from a bare word to a link to the new page. Now if example.mdwn were removed again, they'd need to be updated again. So example/* depends on example. But it's even more tricky, because if example.mdwn is modified, we _don't_ want to rebuild @@ -19,6 +19,10 @@ example/*! ikiwiki doesn't have a way to represent this dependency and can't get one without a lot of new complex code being added. +> Note that this code has now been added. In new terms, example/* has a +> presence dependency on example. So this bug is theoretically fixable now. +> --[[Joey]] + For now the best thing to do is to make sure that you always create example if you create example/foo. Which is probably a good idea anyway.. @@ -27,3 +31,20 @@ example if you create example/foo. Which is probably a good idea anyway.. Note that this bug does not exist if the wiki is built with the "usedirs" option, since in that case, the parent link will link to a subdirectory, that will just be missing the index.html file, but still nicely usable. +--[[Joey]] + +---- + +<http://www.gnu.org/software/hurd/hurd/translator/writing.html> does not exist. +Then, on +<http://www.gnu.org/software/hurd/hurd/translator/writing/example.html>, in the +*parentlinks* line, *writing* links to the top-level *index* file. It should +rather not link anywhere at all. --[[tschwinge]] + +> So, the bug has changed behavior a bit. Rather than a broken link, we get +> a link to the toplevel page. This, FWIW, is because the template now +> uses this for each parentlink: + + <a href="<TMPL_VAR URL>"><TMPL_VAR PAGE></a>/ + +> Best workaround is still to enable usedirs. --[[Joey]] diff --git a/doc/bugs/brokenlinks_accumulates_duplicate_items.mdwn b/doc/bugs/brokenlinks_accumulates_duplicate_items.mdwn new file mode 100644 index 000000000..7fe92f7a9 --- /dev/null +++ b/doc/bugs/brokenlinks_accumulates_duplicate_items.mdwn @@ -0,0 +1,27 @@ +After several runs of ikiwiki --refresh, the page I use with the [[plugin/brokenlinks]] directive on it accumulates multiple repeated lists of pages on the RHS. For example: + + * ?freebies from free kilowatts, free kilowatts, free kilowatts, free kilowatts, free kilowatts + +In this case the page "free kilowatts" has one link to "freebies" (it's tagged freebies). + +I think this may just be links caused by tags, actually. + +ikiwiki version 3.14159265. + +-- [[Jon]] + +> Is it possible that you upgraded from a version older than 3.12, +> and have not rebuilt your wiki since, but just refreshed? 
And did not run +> `ikiwiki-transition deduplinks`? If so, suggest you rebuild the wiki, +> or run that, either would probably fix the problem. +> +> If you can get to the problem after rebuilding with the current ikiwiki, +> and then refreshing a few times, I guess I will need a copy of the wiki +> source and the `.ikiwiki` directory to reproduce this. --[[Joey]] + +>> Hi Joey, thanks for your response. I've reproduced it post rebuild and after having ran ikiwiki-transition and many refreshes (both resulting from content changes and otherwise) unfortunately, with ikiwiki 3.14159265 (different machine to above report, though). I will contact you privately to provide a git URL and a copy of my .ikiwiki. -- [[Jon]] + +>>> Found the bug that was causing duplicates to get in, and fixed it. +>>> [[done]] --[[Joey]] + +>>>> Good work Joey, thanks! -- [[Jon]] diff --git a/doc/bugs/build_fails_oddly_when_older_ikiwiki_is_installed.mdwn b/doc/bugs/build_fails_oddly_when_older_ikiwiki_is_installed.mdwn new file mode 100644 index 000000000..7b252031b --- /dev/null +++ b/doc/bugs/build_fails_oddly_when_older_ikiwiki_is_installed.mdwn @@ -0,0 +1,31 @@ +I got this failure when trying to build ikiwiki version 3.20100403: + + $ perl Makefile.PL INSTALL_BASE=/opt/ikiwiki PREFIX= + Writing Makefile for IkiWiki + $ make + +*...snip...* + + ./pm_filter /opt/ikiwiki 3.20100403 /opt/ikiwiki/lib/perl5 < ikiwiki.in > ikiwiki.out + chmod +x ikiwiki.out + ./pm_filter /opt/ikiwiki 3.20100403 /opt/ikiwiki/lib/perl5 < ikiwiki-transition.in > ikiwiki-transition.out + chmod +x ikiwiki-transition.out + ./pm_filter /opt/ikiwiki 3.20100403 /opt/ikiwiki/lib/perl5 < ikiwiki-calendar.in > ikiwiki-calendar.out + chmod +x ikiwiki-calendar.out + HOME=/home/me /usr/bin/perl -Iblib/lib ikiwiki.out -libdir . -dumpsetup ikiwiki.setup + Use of uninitialized value $IkiWiki::Setup::config{"setuptype"} in concatenation (.) or string at IkiWiki/Setup.pm line 53. + Can't locate IkiWiki/Setup/.pm in @INC (@INC contains: . /opt/ikiwiki/lib/perl5/i486-linux-gnu-thread-multi /opt/ikiwiki/lib/perl5 blib/lib /etc/perl /usr/local/lib/perl/5.10.1 /usr/local/share/perl/5.10.1 /usr/lib/perl5 /usr/share/perl5 /usr/lib/perl/5.10 /usr/share/perl/5.10 /usr/local/lib/site_perl .) at (eval 35) line 3. + + make: *** [ikiwiki.setup] Error 2 + +Note that I had been trying to upgrade with an installed ikiwiki 3.20091114 +already in place under /opt/ikiwiki. The build does not fail for me +if I first remove the old ikiwiki installation, nor does it fail with +3.20100403 or newer installed at /opt/ikiwiki. Hence this is not +really a critical bug, although it's somewhat perplexing to me why it +ought to make a difference. + +> So, using INSTALL_BASE causes a 'use lib' to be hardcoded into the `.out` +> files; which overrides the -libdir and the -I, and so the old version +> of IkiWiki.pm is used. +> [[fixed|done]] --[[Joey]] diff --git a/doc/bugs/bzr_2.0_breaks_bzr_plugin.mdwn b/doc/bugs/bzr_2.0_breaks_bzr_plugin.mdwn new file mode 100644 index 000000000..39500af20 --- /dev/null +++ b/doc/bugs/bzr_2.0_breaks_bzr_plugin.mdwn @@ -0,0 +1,87 @@ +Version 2.0 of bzr seems to break the bzr plugin. + +I traced this to the bzr_log method in the plugin, and patching that seems to fix it. The plugin just needs to parse the input little bit differently. +--liw + +> Patch applied, [[done]] (but, it would be good if it could be tested with +> an older bzr, and it's a pity bzr's human-targeted log has to be parsed, +> I assume there is no machine-targeted version?) 
--[[Joey]] + + From fb897114124e627fd3acf5af8e784c9a77419a81 Mon Sep 17 00:00:00 2001 + From: Lars Wirzenius <liw@liw.fi> + Date: Sun, 4 Apr 2010 21:05:07 +1200 + Subject: [PATCH] Fix bzr plugin to work with bzr 2.0. + + The output of "bzr log" seems to have changed a bit, so we change the + parsing accordingly. This has not been tested with earlier versions of + bzr. + + Several problems seemed to occur, all in the bzr_log subroutine: + + 1. The @infos list would contain an empty hash, which would confuse the + rest of the program. + 2. This was because bzr_log would push an empty anonymous hash to the + list whenever it thought a new record would start. + 3. However, a new record marker (now?) also happens at th end of bzr log + output. + 4. Now we collect the record to a hash that gets pushed to the list only + if it is not empty. + 5. Also, sometimes bzr log outputs "revno: 1234 [merge]", so we catch only + the revision number. + 6. Finally, there may be non-headers at the of the output, so we ignore + those. + --- + IkiWiki/Plugin/bzr.pm | 23 ++++++++++++++++------- + 1 files changed, 16 insertions(+), 7 deletions(-) + + diff --git a/IkiWiki/Plugin/bzr.pm b/IkiWiki/Plugin/bzr.pm + index 1ffdc23..e813331 100644 + --- a/IkiWiki/Plugin/bzr.pm + +++ b/IkiWiki/Plugin/bzr.pm + @@ -73,28 +73,37 @@ sub bzr_log ($) { + my @infos = (); + my $key = undef; + + + my $hash = {}; + while (<$out>) { + my $line = $_; + my ($value); + if ($line =~ /^message:/) { + $key = "message"; + - $infos[$#infos]{$key} = ""; + + $$hash{$key} = ""; + } + elsif ($line =~ /^(modified|added|renamed|renamed and modified|removed):/) { + $key = "files"; + - unless (defined($infos[$#infos]{$key})) { $infos[$#infos]{$key} = ""; } + + unless (defined($$hash{$key})) { $$hash{$key} = ""; } + } + elsif (defined($key) and $line =~ /^ (.*)/) { + - $infos[$#infos]{$key} .= "$1\n"; + + $$hash{$key} .= "$1\n"; + } + elsif ($line eq "------------------------------------------------------------\n") { + + if (keys %$hash) { + + push (@infos, $hash); + + } + + $hash = {}; + $key = undef; + - push (@infos, {}); + } + - else { + + elsif ($line =~ /: /) { + chomp $line; + - ($key, $value) = split /: +/, $line, 2; + - $infos[$#infos]{$key} = $value; + + if ($line =~ /^revno: (\d+)/) { + + $key = "revno"; + + $value = $1; + + } else { + + ($key, $value) = split /: +/, $line, 2; + + } + + $$hash{$key} = $value; + } + } + close $out; + -- + 1.7.0 diff --git a/doc/bugs/cgi_does_not_use_templatedir_overlay.mdwn b/doc/bugs/cgi_does_not_use_templatedir_overlay.mdwn index eb300e7c4..d86a3ac3e 100644 --- a/doc/bugs/cgi_does_not_use_templatedir_overlay.mdwn +++ b/doc/bugs/cgi_does_not_use_templatedir_overlay.mdwn @@ -20,3 +20,7 @@ However, when I make a change via the CGI (which has been created by the last se > problems; it could check if the templatedir exists, and check that it's > readable. This would add some extra system calls to every ikiwiki run, > and I'm not convinced it's worth it. --[[Joey]] + +>> Closing this bug since I never heard back that it was not one +>> of the above two problems, and I consider both problems local +>> configuration errors. --[[Joey]] [[done]] diff --git a/doc/bugs/clearenv_not_present_at_FreeBSD_.mdwn b/doc/bugs/clearenv_not_present_at_FreeBSD_.mdwn new file mode 100644 index 000000000..f38c86e03 --- /dev/null +++ b/doc/bugs/clearenv_not_present_at_FreeBSD_.mdwn @@ -0,0 +1,5 @@ +When build wrapper on FreeBSD system, is error occured with clearenv reference. 
clearenv() does not exist on FreeBSD; use the workaround environ[0]=NULL; +P.S. new git installation, FreeBSD 7.x + +> `#include <stupid-standards.h>` fixed with nasty ifdefs to handle tcc w/o +> breaking everything else. [[done]] --[[Joey]] diff --git a/doc/bugs/clearenv_not_present_at_FreeBSD_/discussion.mdwn b/doc/bugs/clearenv_not_present_at_FreeBSD_/discussion.mdwn new file mode 100644 index 000000000..713198b61 --- /dev/null +++ b/doc/bugs/clearenv_not_present_at_FreeBSD_/discussion.mdwn @@ -0,0 +1 @@ +Mmmm... I see. But it does not set up under FreeBSD without some magic manual passes. diff --git a/doc/bugs/comments_not_searchable.mdwn b/doc/bugs/comments_not_searchable.mdwn new file mode 100644 index 000000000..6fda89bd2 --- /dev/null +++ b/doc/bugs/comments_not_searchable.mdwn @@ -0,0 +1,19 @@ +The text of comments (and other internal pages) does not get indexed by the +search plugin. + +Search indexes content passed to the postscan hook. +Comments are inlined, but inline's speed hack avoids adding inlined +content to the page until the format hook. + +And hmm, that's somewhat desirable, because we don't want searches +to find content that is inlined onto another page. + +That suggests that the fix could be to call the postscan hook +for internal pages. + +However, the search postscan hook tells xapian the page url, +and uses `urlto($page)` to do it. And that won't work for +an internal page. Guess it could be modified to tell xapian the +permalink. --[[Joey]] + +> [[done]] --[[Joey]] diff --git a/doc/bugs/comments_preview_unsafe_with_allowdirectives.mdwn b/doc/bugs/comments_preview_unsafe_with_allowdirectives.mdwn new file mode 100644 index 000000000..7f9fb67e9 --- /dev/null +++ b/doc/bugs/comments_preview_unsafe_with_allowdirectives.mdwn @@ -0,0 +1,8 @@ +If `comments_allowdirectives` is set, previewing a comment can run +directives that create files. (Eg, img.) Unlike editpage, it does not +keep track of those files and expire them. So the files will linger in +destdir forever. + +Probably when the user then tries to save the comment, ikiwiki will refuse +to overwrite the unknown file, and will crash. +--[[Joey]] diff --git a/doc/bugs/conflicts.mdwn b/doc/bugs/conflicts.mdwn new file mode 100644 index 000000000..bef0f54cd --- /dev/null +++ b/doc/bugs/conflicts.mdwn @@ -0,0 +1,32 @@ +The `conflicts` testcase has 4 failing test cases. The underlying problem +is that there are multiple possible source files that can create the same +destination files. + +1. `foo.mdwn` is in srcdir, rendered to destdir. Then + it is removed, and `foo` is added, which will be rendered + raw to destdir. Since the `foo/` directory still exists, + it fails. +1. `foo` is added to srcdir, rendered raw to destdir. + Then it is removed from srcdir, and `foo.mdwn` is added. + The `foo` file is still present in the destdir, and mkdir + of the directory `foo/` fails. +1. `foo.mdwn` renders to `foo/index.html`. Then `foo/index.html` + is added to the srcdir, using rawhtml. It renders to the same + thing. +1. `foo/index.html` in srcdir is rendered to same thing in destdir + using rawhtml. Then `foo.mdwn` is added; renders same thing. + +Note that another case, that of page `foo.mdwn` and page `foo.txt`, that +both render to `foo/index.html`, used to cause problems, but no longer +crashes ikiwiki. It now only complains in this situation, and which +file "wins" is undefined. The fix for this relied on both pages being +named `foo`; but in the above cases, the source files have different +pagenames.
+ +One approach: Beef up checking in `will_render` to detect when the same +destination file is rendered by multiple pages. Or when one page renders +a file that is a parent directory of the rendered file of another page. +It could warn, rather than erroring. The last page rendered would "win"; +generating the destdir file. + +[[done]] diff --git a/doc/bugs/debbug_shortcut_should_expand_differently.mdwn b/doc/bugs/debbug_shortcut_should_expand_differently.mdwn index d34c40244..b93b20a32 100644 --- a/doc/bugs/debbug_shortcut_should_expand_differently.mdwn +++ b/doc/bugs/debbug_shortcut_should_expand_differently.mdwn @@ -9,3 +9,9 @@ instead of There are problems with code. bug #123456 is a good example of... Thanks, --[[madduck]] + +> Tschwinge changed it to expand to "Debian bug #xxxx". Which happens to +> sidestep the start of sentence problem. I think it makes sense to be +> explicit about whose bug it is, in general -- but you can always edit the +> shortcuts page for your own wiki to use something shorter and more +> implicit. --[[Joey]] [[done]] diff --git a/doc/bugs/defintion_lists_appear_to_be_disabled.mdwn b/doc/bugs/defintion_lists_appear_to_be_disabled.mdwn new file mode 100644 index 000000000..6dac9c8b8 --- /dev/null +++ b/doc/bugs/defintion_lists_appear_to_be_disabled.mdwn @@ -0,0 +1,54 @@ +Adding text of the format + + Apple + : Pomaceous fruit of plants of the genus Malus in + the family Rosaceae. + : An american computer company. + + Orange + : The fruit of an evergreen tree of the genus Citrus. + +Does not result in expected HTML as described in the [MultiMarkdown Syntax Guide](http://fletcherpenney.net/multimarkdown/users_guide/multimarkdown_syntax_guide/): + +Should be + + <dl xmlns="http://www.w3.org/1999/xhtml"> + <dt>Apple</dt> + <dd> + <p>Pomaceous fruit of plants of the genus Malus in + the family Rosaceae.</p> + </dd> + <dd> + <p>An american computer company.</p> + </dd> + <dt>Orange</dt> + <dd> + <p>The fruit of an evergreen tree of the genus Citrus.</p> + </dd> + </dl> + +But instead it gives: + + <p>Apple + : Pomaceous fruit of plants of the genus Malus in + the family Rosaceae. + : An american computer company.</p> + + <p>Orange + : The fruit of an evergreen tree of the genus Citrus.</p> + +> ikiwiki's markdown support does not include support for multimarkdown by +> default. If you want to enable that, you can turn on the `multimarkdown` +> option in the setup file. --[[Joey]] + +>> Sorry, I should have indicated, I have multimarkdown enabled: + + # mdwn plugin + # enable multimarkdown features? + multimarkdown => 1, + +>>Other features appear to be working, tables and footnotes for instance. See current install: <http://wiki.infosoph.org> + +>>> Ok, in that case it's a bug in the perl module. Forwarded to +>>> <http://github.com/bobtfish/text-markdown/issues#issue/6> --[[Joey]] +>>> [[!tag done]] diff --git a/doc/bugs/deletion_warnings.mdwn b/doc/bugs/deletion_warnings.mdwn new file mode 100644 index 000000000..668626b49 --- /dev/null +++ b/doc/bugs/deletion_warnings.mdwn @@ -0,0 +1,89 @@ +Seen while deleting a blog's calendar pages: + +--[[Joey]] + +[[done]] -- the new `page()` pagespec needed to check if there was a source +file for the page, and was leaking undef. + +<pre> + 427250f..ff6c054 master -> origin/master +Use of uninitialized value $file in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 688. +Use of uninitialized value $file in substitution (s///) at /usr/share/perl5/IkiWiki.pm line 668. 
+Use of uninitialized value $base in exists at /usr/share/perl5/IkiWiki.pm line 692. +Use of uninitialized value $file in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 688. +Use of uninitialized value $file in substitution (s///) at /usr/share/perl5/IkiWiki.pm line 668. +Use of uninitialized value $base in exists at /usr/share/perl5/IkiWiki.pm line 692. +Use of uninitialized value $file in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 688. +Use of uninitialized value $file in substitution (s///) at /usr/share/perl5/IkiWiki.pm line 668. +Use of uninitialized value $base in exists at /usr/share/perl5/IkiWiki.pm line 692. +Use of uninitialized value $file in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 688. +Use of uninitialized value $file in substitution (s///) at /usr/share/perl5/IkiWiki.pm line 668. +Use of uninitialized value $base in exists at /usr/share/perl5/IkiWiki.pm line 692. +Use of uninitialized value $file in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 688. +Use of uninitialized value $file in substitution (s///) at /usr/share/perl5/IkiWiki.pm line 668. +Use of uninitialized value $base in exists at /usr/share/perl5/IkiWiki.pm line 692. +Use of uninitialized value $file in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 688. +Use of uninitialized value $file in substitution (s///) at /usr/share/perl5/IkiWiki.pm line 668. +Use of uninitialized value $base in exists at /usr/share/perl5/IkiWiki.pm line 692. +Use of uninitialized value $file in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 688. +Use of uninitialized value $file in substitution (s///) at /usr/share/perl5/IkiWiki.pm line 668. +Use of uninitialized value $base in exists at /usr/share/perl5/IkiWiki.pm line 692. +Use of uninitialized value $file in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 688. +Use of uninitialized value $file in substitution (s///) at /usr/share/perl5/IkiWiki.pm line 668. +Use of uninitialized value $base in exists at /usr/share/perl5/IkiWiki.pm line 692. +Use of uninitialized value $file in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 688. +Use of uninitialized value $file in substitution (s///) at /usr/share/perl5/IkiWiki.pm line 668. +Use of uninitialized value $base in exists at /usr/share/perl5/IkiWiki.pm line 692. +Use of uninitialized value $file in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 688. +Use of uninitialized value $file in substitution (s///) at /usr/share/perl5/IkiWiki.pm line 668. +Use of uninitialized value $base in exists at /usr/share/perl5/IkiWiki.pm line 692. +Use of uninitialized value $file in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 688. +Use of uninitialized value $file in substitution (s///) at /usr/share/perl5/IkiWiki.pm line 668. +Use of uninitialized value $base in exists at /usr/share/perl5/IkiWiki.pm line 692. +Use of uninitialized value $file in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 688. +Use of uninitialized value $file in substitution (s///) at /usr/share/perl5/IkiWiki.pm line 668. +Use of uninitialized value $base in exists at /usr/share/perl5/IkiWiki.pm line 692. +Use of uninitialized value $file in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 688. +Use of uninitialized value $file in substitution (s///) at /usr/share/perl5/IkiWiki.pm line 668. +Use of uninitialized value $base in exists at /usr/share/perl5/IkiWiki.pm line 692. +Use of uninitialized value $file in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 688. 
+Use of uninitialized value $file in substitution (s///) at /usr/share/perl5/IkiWiki.pm line 668. +Use of uninitialized value $base in exists at /usr/share/perl5/IkiWiki.pm line 692. +Use of uninitialized value $file in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 688. +Use of uninitialized value $file in substitution (s///) at /usr/share/perl5/IkiWiki.pm line 668. +Use of uninitialized value $base in exists at /usr/share/perl5/IkiWiki.pm line 692. +Use of uninitialized value $file in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 688. +Use of uninitialized value $file in substitution (s///) at /usr/share/perl5/IkiWiki.pm line 668. +Use of uninitialized value $base in exists at /usr/share/perl5/IkiWiki.pm line 692. +Use of uninitialized value $file in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 688. +Use of uninitialized value $file in substitution (s///) at /usr/share/perl5/IkiWiki.pm line 668. +Use of uninitialized value $base in exists at /usr/share/perl5/IkiWiki.pm line 692. +Use of uninitialized value $file in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 688. +Use of uninitialized value $file in substitution (s///) at /usr/share/perl5/IkiWiki.pm line 668. +Use of uninitialized value $base in exists at /usr/share/perl5/IkiWiki.pm line 692. +Use of uninitialized value $file in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 688. +Use of uninitialized value $file in substitution (s///) at /usr/share/perl5/IkiWiki.pm line 668. +Use of uninitialized value $base in exists at /usr/share/perl5/IkiWiki.pm line 692. +Use of uninitialized value $file in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 688. +Use of uninitialized value $file in substitution (s///) at /usr/share/perl5/IkiWiki.pm line 668. +Use of uninitialized value $base in exists at /usr/share/perl5/IkiWiki.pm line 692. +Use of uninitialized value $file in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 688. +Use of uninitialized value $file in substitution (s///) at /usr/share/perl5/IkiWiki.pm line 668. +Use of uninitialized value $base in exists at /usr/share/perl5/IkiWiki.pm line 692. +Use of uninitialized value $file in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 688. +Use of uninitialized value $file in substitution (s///) at /usr/share/perl5/IkiWiki.pm line 668. +Use of uninitialized value $base in exists at /usr/share/perl5/IkiWiki.pm line 692. +Use of uninitialized value $file in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 688. +Use of uninitialized value $file in substitution (s///) at /usr/share/perl5/IkiWiki.pm line 668. +Use of uninitialized value $base in exists at /usr/share/perl5/IkiWiki.pm line 692. +Use of uninitialized value $file in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 688. +Use of uninitialized value $file in substitution (s///) at /usr/share/perl5/IkiWiki.pm line 668. +Use of uninitialized value $base in exists at /usr/share/perl5/IkiWiki.pm line 692. +Use of uninitialized value $file in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 688. +Use of uninitialized value $file in substitution (s///) at /usr/share/perl5/IkiWiki.pm line 668. +Use of uninitialized value $base in exists at /usr/share/perl5/IkiWiki.pm line 692. +Use of uninitialized value $file in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 688. +Use of uninitialized value $file in substitution (s///) at /usr/share/perl5/IkiWiki.pm line 668. +Use of uninitialized value $base in exists at /usr/share/perl5/IkiWiki.pm line 692. 
+</pre> + diff --git a/doc/bugs/depends_simple_mixup.mdwn b/doc/bugs/depends_simple_mixup.mdwn new file mode 100644 index 000000000..a5910d02e --- /dev/null +++ b/doc/bugs/depends_simple_mixup.mdwn @@ -0,0 +1,88 @@ +The [[bugs]] page, at least before I commit this, has a bug at the top that +has been modified to link to done, and ikiwiki's dependency calculations +failed to notice and update the bugs page. Looking at the indexdb, I saw +that the page was not included in the `depends_simple` of the bugs page. + +I was able to replicate the problem locally by starting off with the page +marked done (when it did appear in the bugs page `depends_simple` +(appropriatly as a link dependency, since a change to the page removing the +link would make it match)), then removing the done link. + +At that point, it vanished from `depends_simple`. Presumably because +the main (pagespec) depends for the bugs page now matched it, as a content +dependency. But, it seems to me it should still be listed in +`depends_simple` here. This, I think, is the cause of the bug. + +Then re-add the done link, and the dependency calc code breaks down, +not noticing that bugs dependeded on the page and needs to be updated. + +Ok.. Turns out this was not a problem with the actual influences +calculation or dependency calculation code. Whew! `match_link` +just didn't set the influence correctly when failing. fixed + +--[[Joey]] + +--- + +Update: Reopening this because the fix for it rather sucks. + +I made `match_link` return on failure an influence of +type DEPEND_LINKS. So, a tag page that inlines `tagged(foo)` +gets a `depends_simple` built up that contains link dependencies for +*every* page in the wiki. A very bloaty way to represent the dependency! + +Per [[todo/dependency_types]], `link(done)` only needs to list in +`depends_simple` the pages that currently match. If a page is modified +to add the link, the regular dependency calculation code notices that +a new page matches. If a page that had the link is modified to remove it, +the `depends_simple` lets ikiwiki remember that the now non-matching page +matched before. + +Where that fell down was `!link(done)`. A page matching that was not added +to `depends_simple`, because the `link(done)` did not match it. If the page +is modified to add the link, the regular dependency calculation code +didn't notice, since the pagespec no longer matched. + +In this case, `depends_simple` needs to contain all pages +that do *not* match `link(done)`, but before my change, it contained +all pages that *do* match. After my change, it contained all pages. + +---- + +So, seems what is needed is a way for influence info to be manipulated by +the boolean operations that are applied. One way would be to have two +sets of influences be returned, one for successful matches, and one for +failed matches. Normally, these would be the same. For successful +`match_link`, the successful influence would be the page. +For failed `match_link`, the failed influence would be the page. + +Then, when NOTting a `*Reason`, swap the two sets of influences. +When ANDing/ORing, combine the individual sets. Querying the object for +influences should return only the successful influences. + +---- + +Would it be possible to avoid the complication of maintianing two sets of +influence info? + +Well, notice that the influence of `pagespec_match($page, "link(done)")` +is $page. Iff the match succeeds. + +Also, the influence of `pagespec_match($page, "!link(done)")` is +$page. Iff the (overall) match succeeds. 
+ +Does that hold for all cases? If so, the code that populates +`depends_simple` could just test if the pagespec was successful, and +if not, avoid adding $page influences, while still adding any other, +non-$page influences. + +---- + +Hmm, commit f2b3d1341447cbf29189ab490daae418fbe5d02d seems +thoroughly wrong. So, what about influence info for other matches +like `!author(foo)` etc? Currently, none is returned, but it should +be a content influence. (Backlink influence data seems ok.) + +---- + +[[done]] again! diff --git a/doc/bugs/disable_sub-discussion_pages.mdwn b/doc/bugs/disable_sub-discussion_pages.mdwn index 5e9c8c9f9..39d9ba528 100644 --- a/doc/bugs/disable_sub-discussion_pages.mdwn +++ b/doc/bugs/disable_sub-discussion_pages.mdwn @@ -6,6 +6,12 @@ I do want discussion subpage, but I don't want to have, for example: discussion/ > Discussion pages should clearly be a special case that don't get Discussion > links put at the top... aaand.. [[bugs/done]]! --[[Joey]] +>> This bug appears to have returned. For example, +>> [[plugins/contrib/unixauth/discussion]] has a Discussion link. -- [[schmonz]] + +>>> Lots of case issues this time. Audited for and fixed them all. [[done]] +>>> --[[Joey]] + >>> Joey, I've just seen that you closed that bug in ikiwiki 1.37, but it seems >>> you fixed it only for English "discussion" page. The bug still occurs >>> for the international "discussion" pages. I have backported ikiwiki 1.40 diff --git a/doc/bugs/discussion_of_what__63__.mdwn b/doc/bugs/discussion_of_what__63__.mdwn index 2f469fed9..763e599bf 100644 --- a/doc/bugs/discussion_of_what__63__.mdwn +++ b/doc/bugs/discussion_of_what__63__.mdwn @@ -1,3 +1,7 @@ When searching in ikiwiki, sometimes discussion pages turn up. However, they are only titled "discussion". In order to know what topic they are discussing, you have to look at the URL. Shouldn't they be titled "foo/discussion" or "discussion of foo" or something? Thanks, --[[perolofsson]] + +> This bug was filed when ikiwiki still used hyperestraier. +> Now that it uses xapian, the search results include the full +> page name, which seems sufficient to call this [[done]] --[[Joey]] diff --git a/doc/bugs/external_links_inside_headings_don__39__t_work.mdwn b/doc/bugs/external_links_inside_headings_don__39__t_work.mdwn index 5bab283fd..51d6ad475 100644 --- a/doc/bugs/external_links_inside_headings_don__39__t_work.mdwn +++ b/doc/bugs/external_links_inside_headings_don__39__t_work.mdwn @@ -21,4 +21,4 @@ It works fine with h2 and deeper. The square brackets also appear in the output > [[done]] --[[Joey]] ->> It works here but it definitely does *not* work on my wiki; but on further experimentation, I believe my problem is being caused by JasonBlevins' [h1title](http://code.jblevins.org/ikiwiki/plugins.git/plain/h1title.pm) plugin. +>> It works here but it definitely does *not* work on my wiki; but on further experimentation, I believe my problem is being caused by JasonBlevins' [h1title](http://jblevins.org/git/ikiwiki/plugins.git/plain/h1title.pm) plugin.
diff --git a/doc/bugs/filecheck_failing_to_find_files.mdwn b/doc/bugs/filecheck_failing_to_find_files.mdwn new file mode 100644 index 000000000..6501508e4 --- /dev/null +++ b/doc/bugs/filecheck_failing_to_find_files.mdwn @@ -0,0 +1,65 @@ +Using the attachment plugin, when filecheck was checking the mime-type of the attachment before allowing the attachment to be removed, it was returning with an error saying that the mime-type of the file was "unknown" (when the mime-type definitely was known!) + +It turns out that the filecheck plugin couldn't find the file, because it was merely using the $pagesources hash, rather than finding the absolute path of the file in question. + +> I don't understand why the file was not in `%pagesources`. Do you? +> --[[Joey]] + +>> The file *was* in `%pagesources`, but what returns from that is the filename relative to the `srcdir` directory; for example, `foo/bar.gif`. +>> When File::MimeInfo::Magic::magic is given that, it can't find the file. +>> But if it is given `/path/to/srcdir/foo/bar.gif` instead, then it *can* find the file, and returns the mime-type correctly. +>> --[[KathrynAndersen]] + +>>> Ok, so it's not removal specific, can in fact be triggered by using +>>> testpagespec (or really anything besides attachment, which passes +>>> the filename parameter). Nor is it limited to mimetype, all the tests in +>>> filecheck have the problem. --[[Joey]] + +>>>> Alas, not fixed. It seems I was mistaken in some of my assumptions. +>>>> It still happens when attempting to remove attachments. +>>>> With your fix, the `IkiWiki::srcfile` function is only called when the filename is not passed in, but it appears that in the case of removing attachments, the filename IS passed in, but it is the relative filename as mentioned above. Thus, the file is still not found, and the mime-type comes back as unknown. +>>>> The reason my patch worked is because, rather than checking whether a filename was passed in before applying IkiWiki::srcfile to the filename, it checks whether the file can be found, and if it cannot be found, then it applies IkiWiki::srcfile to the filename. +>>>> --[[KathrynAndersen]] + +>>>>> Can you test if this patch fixes that? --[[Joey]] + +>>>>>> Yes, it works! --[[KathrynAndersen]] + +applied && [[done]] + +<pre> +diff --git a/IkiWiki/Plugin/remove.pm b/IkiWiki/Plugin/remove.pm +index f59d026..0fc180f 100644 +--- a/IkiWiki/Plugin/remove.pm ++++ b/IkiWiki/Plugin/remove.pm +@@ -49,7 +49,7 @@ sub check_canremove ($$$) { + # This is sorta overkill, but better safe than sorry. + if (! 
defined pagetype($pagesources{$page})) { + if (IkiWiki::Plugin::attachment->can("check_canattach")) { +- IkiWiki::Plugin::attachment::check_canattach($session, $page, $file); ++ IkiWiki::Plugin::attachment::check_canattach($session, $page, "$config{srcdir}/$file"); + } + else { + error("removal of attachments is not allowed"); +diff --git a/IkiWiki/Plugin/rename.pm b/IkiWiki/Plugin/rename.pm +index 3908443..1a9da63 100644 +--- a/IkiWiki/Plugin/rename.pm ++++ b/IkiWiki/Plugin/rename.pm +@@ -50,7 +50,7 @@ sub check_canrename ($$$$$$) { + IkiWiki::check_canedit($src, $q, $session); + if ($attachment) { + if (IkiWiki::Plugin::attachment->can("check_canattach")) { +- IkiWiki::Plugin::attachment::check_canattach($session, $src, $srcfile); ++ IkiWiki::Plugin::attachment::check_canattach($session, $src, "$config{srcdir}/$srcfile"); + } + else { + error("renaming of attachments is not allowed"); +@@ -85,7 +85,7 @@ sub check_canrename ($$$$$$) { + if ($attachment) { + # Note that $srcfile is used here, not $destfile, + # because it wants the current file, to check it. +- IkiWiki::Plugin::attachment::check_canattach($session, $dest, $srcfile); ++ IkiWiki::Plugin::attachment::check_canattach($session, $dest, "$config{srcdir}/$srcfile"); + } + } +</pre> diff --git a/doc/bugs/firefox_doesn__39__t_want_to_load_updated_pages_at_ikiwiki.info.mdwn b/doc/bugs/firefox_doesn__39__t_want_to_load_updated_pages_at_ikiwiki.info.mdwn index 8cb47f864..558eb90c8 100644 --- a/doc/bugs/firefox_doesn__39__t_want_to_load_updated_pages_at_ikiwiki.info.mdwn +++ b/doc/bugs/firefox_doesn__39__t_want_to_load_updated_pages_at_ikiwiki.info.mdwn @@ -3,3 +3,12 @@ I'm using firefox-3.0.8-alt0.M41.1 (Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1 Only explicitly pressing "reload" helps. Is it a bug? I haven't been noticing such problems usually on other sites. --Ivan Z. + +This remains to be true now, with Epiphany 2.26.3 (Mozilla/5.0 (X11; U; Linux i686; en; rv:1.9.1.4pre) Gecko/20080528 Epiphany/2.22 Firefox/3.5). --Ivan Z. + +> In the most recent ikiwiki release, I added a Cache-Control hack +> explicitly to work around firefox's broken over-caching. +> +> (When I tested epiphany and chromium, neither had firefox's problem.) +> +> [[!tag done]] diff --git a/doc/bugs/git_utf8.mdwn b/doc/bugs/git_utf8.mdwn index dc8df1dab..39903d51c 100644 --- a/doc/bugs/git_utf8.mdwn +++ b/doc/bugs/git_utf8.mdwn @@ -6,3 +6,7 @@ appearing. Probably git's output needs to be force encoded to utf-8. --[[Joey]] + +> I did that in 4ac0b2953131d7a53562ab8918c8e5a49952d8ac , [[done]] +> --[[Joey]] + diff --git a/doc/bugs/gitremotes_script_picks_up_tags_from_anywhere.mdwn b/doc/bugs/gitremotes_script_picks_up_tags_from_anywhere.mdwn new file mode 100644 index 000000000..9bd8938c5 --- /dev/null +++ b/doc/bugs/gitremotes_script_picks_up_tags_from_anywhere.mdwn @@ -0,0 +1,22 @@ +[[!tag patch]] +[[!template id=gitbranch branch=smcv/ready/no-tags author="[[smcv]]"]] + +The `gitremotes` script picks up tags from any repository, including those +used for local .debs that were never actually present in Debian: + + smcv@reptile% git tag | grep -c nmu + 52 + +This can be avoided with the `tagopt = --no-tags` option in .git/config; +see <http://git.pseudorandom.co.uk/smcv/ikiwiki.git?a=shortlog;h=refs/heads/ready/no-tags> + +> *done* thanks. Also cleared propigated tags out of origin. +> +> Hmm, in testing I still see tags get pulled the first time a remote +> is added. If those are then locally deleted, it doesn't pull them again +> with the `--no-tags`. 
+> --[[Joey]] + +>> Oh, I see why. Try the same branch again... --[[smcv]] + +>>> [[done]] --[[Joey]] diff --git a/doc/bugs/html5_support.mdwn b/doc/bugs/html5_support.mdwn index 239474275..ba67d532b 100644 --- a/doc/bugs/html5_support.mdwn +++ b/doc/bugs/html5_support.mdwn @@ -9,10 +9,59 @@ HTML5](http://www.w3.org/TR/html5-diff/). * [ikiwiki instance with HTML5 templates](http://natalian.org) * [HTML5 outliner tool](http://gsnedders.html5.org/outliner/) -- to check you have the structure of your markup correct +> Kai, thanks enormously for working on this. I switched a page to +> the html5 doctype today, and was rather pleasantly surprised that it +> validated, except for the new Cache-Control meta tag. Now I see you're +> well ahead of me. --[[Joey]] +> +> So, how should ikiwiki support html5? There are basically 3 approaches: +> +> 1. Allow users to add html5 tags to their existing xhtml pages. +> What has been done so far can be extended. Basically works +> in browsers, if you don't care about standards. A good prerequisite +> for anything else, anyway. +> 2. Have both an html5 and an xhtml mode, allow user to select. +> 3. Switch to html5 in eg, ikiwiki 4; users have to deal with +> any custom markup on their pages/templates that breaks then. +> +> The second option seems fairly tractable from what I see here and in +> your branch. You made only relatively minor changes to 10 templates. +> It would probably not be too dreadful to put them in ifdefs. I've made a +> small start at doing that. +> +> I've made ikiwiki use the time element and all the new semantic elements +> in html5 mode. +> +> Other ideas: +> +> * Use details tag instead of the javascript in the toggle plugin. +> (Need to wait on browser support probably.) +> * Use figure and figcaption for captions in img. However, I have not +> managed to style it to look as good as the current table+caption +> approach. +> +> --[[Joey]] + # htmlscrubber.pm needs to not scrub new HTML5 elements * [new elements](http://www.w3.org/TR/html5-diff/#new-elements) +> Many added now. +> +> Things I left out, too hard to understand today: +> Attributes contenteditable, +> data-\*, draggable, role, aria-\*. +> Tags command, keygen, output. +> +> Clearly unsafe: embed. +> +> Apparently cannot be used w/o javascript: menu. +> +> I have not added the new `ping` attribute, because parsing a +> space-separated list of urls to avoid javascript injection is annoying, +> and the attribute seems generally dubious. +> --[[Joey]] + # HTML5 Validation and t/html.t [validator.nu](http://validator.nu/) is the authoritative HTML5 validator, @@ -25,6 +74,9 @@ In the future, hopefully ikiwiki can test for valid HTML5 using [Relax NG schema](http://syntax.whattf.org/) using a Debian package tool [rnv](http://packages.qa.debian.org/r/rnv.html). +> Validation in the test suite is nice, but I am willing to lose those +> tests for a while. --[[Joey]] + # HTML5 migration issues # [article](http://www.whatwg.org/specs/web-apps/current-work/multipage/semantics.html#the-article-element) element @@ -37,6 +89,8 @@ This element is poorly supported by browsers. As a workaround, `style.css` needs Internet Explorer will display it as a block, though you can't seem to be able to further control the style.
+> done (needed for header too) --[[Joey]] + ## Time element The [time element](http://www.whatwg.org/specs/web-apps/current-work/multipage/text-level-semantics.html#the-time-element) ideally needs the datatime= attribute set by a template variable with what [HTML5 defines as a valid datetime string](http://www.whatwg.org/specs/web-apps/current-work/multipage/infrastructure.html#valid-global-date-and-time-string). @@ -45,3 +99,19 @@ As a workaround: au:~% grep timeformat natalian.setup timeformat => '%Y-%m-%d', + +> Also, the [[plugins/relativedate]] plugin needs to be updated to +> support relatatizing the contents of time elements. --[[Joey]] + +> Done and done; in html5 mode it uses the time tag, and even +> adds pubdate when displaying ctimes. --[[Joey]] + +## tidy plugin + +Will reformat html5 to html4. + +---- + + +Ok, I consider this [[done]], at least as a first pass. Html5 mode +is experimental, but complete enough. --[[Joey]] diff --git a/doc/bugs/html5_time_element__39__s_pubdate_wrong_when_using_xhtml5___34__mode__34__.mdwn b/doc/bugs/html5_time_element__39__s_pubdate_wrong_when_using_xhtml5___34__mode__34__.mdwn new file mode 100644 index 000000000..def5bcc2a --- /dev/null +++ b/doc/bugs/html5_time_element__39__s_pubdate_wrong_when_using_xhtml5___34__mode__34__.mdwn @@ -0,0 +1,46 @@ +Hi, + +XML error: + + Created <time datetime="2009-03-24T18:02:14Z" pubdate class="relativedate" title="Tue, 24 Mar 2009 14:02:14 -0400">2009-03-24</time> + +The pubdate REQUIRES a date, so e.g. `pubdate="2009-03-24T18:02:14Z"` + +> No, `pubdate="pubdate"`. It's a boolean attribute. applied && [[done]] +> --[[Joey]] +>> awesome, thanks for fixing my fix ;) --[[simonraven]] + +Otherwise the XML parser chokes. + +<http://www.whatwg.org/specs/web-apps/current-work/multipage/text-level-semantics.html#attr-time-pubdate> + +(indented exactly 4 spaces) + +<pre> + diff --git a/IkiWiki.pm b/IkiWiki.pm + index 1f2ab07..6ab5b56 100644 + --- a/IkiWiki.pm + +++ b/IkiWiki.pm + @@ -1004,7 +1004,7 @@ sub displaytime ($;$$) { + my $time=formattime($_[0], $_[1]); + if ($config{html5}) { + return '<time datetime="'.date_3339($_[0]).'"'. + - ($_[2] ? ' pubdate' : ''). + + ($_[2] ? ' pubdate="'.date_3339($_[0]).'"' : ''). + '>'.$time.'</time>'; + } + else { + diff --git a/IkiWiki/Plugin/relativedate.pm b/IkiWiki/Plugin/relativedate.pm + index fe8ef09..8c4a1b4 100644 + --- a/IkiWiki/Plugin/relativedate.pm + +++ b/IkiWiki/Plugin/relativedate.pm + @@ -59,7 +59,7 @@ sub mydisplaytime ($;$$) { + + if ($config{html5}) { + return '<time datetime="'.IkiWiki::date_3339($time).'"'. + - ($pubdate ? ' pubdate' : '').$mid.'</time>'; + + ($pubdate ? ' pubdate="'.IkiWiki::date_3339($time).'"' : '').$mid.'</time>'; + } + else { + return '<span'.$mid.'</span>'; +</pre> diff --git a/doc/bugs/htmlscrubber_breaks_multimarkdown_footnotes.mdwn b/doc/bugs/htmlscrubber_breaks_multimarkdown_footnotes.mdwn new file mode 100644 index 000000000..343037b45 --- /dev/null +++ b/doc/bugs/htmlscrubber_breaks_multimarkdown_footnotes.mdwn @@ -0,0 +1,18 @@ +I enabled multimarkdown to make use of footnotes in my file. I have the multimarkdown plugin, +as well as the command-line program. If I write a document with footnotes: + + This line has a footnote[^1] + + [^1]: this is the footnote + +and compile it from the cli, the reference becomes a link to the footnote and the footnote +gets a backreferencing link appended. 
When compiled in ikiwiki with the goodstuff plugin +enabled, the links are created but their hrefs are empty (so they do not actually act as links). +Disabling the htmlscrubber plugin fixes this issue + +[[!tag multimarkdown htmlscrubber]] + +> href was of the form: #fnref:1 , scrubbed by overzealous protocol +> scrubbing. + +[[done]] --[[Joey]] diff --git a/doc/bugs/http_proxy_for_openid.mdwn b/doc/bugs/http_proxy_for_openid.mdwn index 3d0c99b83..dac4d2736 100644 --- a/doc/bugs/http_proxy_for_openid.mdwn +++ b/doc/bugs/http_proxy_for_openid.mdwn @@ -22,8 +22,7 @@ Note that using $ua->proxy(['https'], ...); won't work, you get a "Not Implement Also note that the proxy won't work with liblwpx-paranoidagent-perl, I had to remove liblwpx-paranoidagent-perl first. -Please get the patch from the *.mdwn source. - +<pre> louie:/usr/share/perl5/IkiWiki/Plugin# diff -u openid.pm.old openid.pm --- openid.pm.old 2008-10-26 12:18:58.094489360 +1100 +++ openid.pm 2008-10-26 12:40:05.763429880 +1100 @@ -42,6 +41,11 @@ louie:/usr/share/perl5/IkiWiki/Plugin# diff -u openid.pm.old openid.pm # Store the secret in the session. my $secret=$session->param("openid_secret"); if (! defined $secret) { - +</pre> Brian May + +> Rather than adding config file settings for every useful environment +> variable, there is a ENV config file setting that can be used to set +> any environment variables you like. So, no changed needed. [[done]] +> --[[Joey]] diff --git a/doc/bugs/ikiwiki-transition_does_not_set_perl_moduels_path_properly.mdwn b/doc/bugs/ikiwiki-transition_does_not_set_perl_moduels_path_properly.mdwn new file mode 100644 index 000000000..b3e87b529 --- /dev/null +++ b/doc/bugs/ikiwiki-transition_does_not_set_perl_moduels_path_properly.mdwn @@ -0,0 +1,17 @@ +When installing ikiwiki the perl module path is setup correctly + + use lib '/usr/local/ikiwiki-3.20100312/share/perl/5.10.0'; + +This is not true for ikiwiki-transition: + + $ PATH=/usr/local/ikiwiki-3.20100312/bin ikiwiki-transition prefix_directives ikiwiki.setup + Can't locate IkiWiki.pm in @INC (@INC contains: /etc/perl /usr/local/lib/perl/5.10.0 + /usr/local/share/perl/5.10.0 /usr/lib/perl5 /usr/share/perl5 /usr/lib/perl/5.10 /usr/share/perl/5.10 /usr/local/lib/site_perl .) + at /usr/local/ikiwiki-3.20100312/bin/ikiwiki-transition line 4. + BEGIN failed--compilation aborted at /usr/local/ikiwiki-3.20100312/bin/ikiwiki-transition line 4. + +The missing line should be added. + +Thanks! + +[[done]] --[[Joey]] diff --git a/doc/bugs/img_plugin_and_missing_heigth_value.mdwn b/doc/bugs/img_plugin_and_missing_heigth_value.mdwn new file mode 100644 index 000000000..4bc070c95 --- /dev/null +++ b/doc/bugs/img_plugin_and_missing_heigth_value.mdwn @@ -0,0 +1,7 @@ +When I set up my picture page with `\[[!img defaults size=300x]]` then the html validator complains that the value for height is missing and the IE browsers won't show the pictures up at all; no problems with ff tho. If I set up my picture page with `\[[!img defaults size=300x300]]` then all the images are funny stretched. What am I doing wrong? + +> This is a bug. --[[Joey]] + +> And .. [[fixed|done]] --[[Joey]] + +>> Not quite; For some reason it requires me to update wiki pages twice before the height value shows up. 
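To make the height discussion above concrete, here is a minimal stand-alone Perl sketch of the proportional calculation that a width-only `size=300x` implies -- it is not the actual `img.pm` code from this diff, and the source image dimensions are assumed purely for illustration:

    #!/usr/bin/perl
    # Illustrative only: derive a height to pair with width="300" so the
    # image keeps its aspect ratio instead of being stretched.
    use strict;
    use warnings;

    my ($orig_w, $orig_h) = (1024, 768);   # assumed source image size
    my $want_w = 300;                      # from size=300x
    my $want_h = int($orig_h * $want_w / $orig_w + 0.5);
    printf qq{<img src="photo.jpg" width="%d" height="%d" />\n},
        $want_w, $want_h;                  # width="300" height="225"

With both attributes emitted, validators stop complaining about the missing height, and browsers no longer distort the image the way a hard-coded `size=300x300` does.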
diff --git a/doc/bugs/img_vs_align.mdwn b/doc/bugs/img_vs_align.mdwn new file mode 100644 index 000000000..c78465a37 --- /dev/null +++ b/doc/bugs/img_vs_align.mdwn @@ -0,0 +1,38 @@ +The *[[ikiwiki/directive/img]]* directive allows for specifying an +*align* parameter -- which is of limited usability as the image is +embedded as `<p><img ...></p>`. That's at least what I see on +<http://www.bddebian.com:8888/~hurd-web/hurd/status/>. On the other +hand, CSS is supposed to be used instead, I guess. (But how... I forgot +almost of my CSS foo again ;-) it seems.) --[[tschwinge]] + +> [[!img logo/ikiwiki.png align=right]]The [img tag doesn't create P tags](http://git.ikiwiki.info/?p=ikiwiki;a=blob;f=IkiWiki/Plugin/img.pm;h=32023fa97af8ba8e63192cacaff10a4677d20654;hb=HEAD), but if you have surrounded the img directive with newlines, they will result in paragraph tags. +> +> I've edited the URL you provided to demonstrate this -- hope you don't mind! I've also added an inline, right-aligned image to this page.[[!tag done]] +> -- [[Jon]] + +> Contrary to all of the above, html does not care about P tags when +> floating an image to the left or right via align. Proof: +> <http://kitenet.net/~joey/pics/toomanypicturesofjoey/>, where the image +> is in its own paragraph but still floats. Also, I re-modified a local +> copy of the hurd page to enclose the image in a P, and it still floats. +> +> Tested with Chromium and Firefox. --[[Joey]] + +>> Uh, sorry for not confirming what I supposed to be with looking into +>> the relevant standard. It just seemed too obvious to me that the +>> closure of `<p>...</p>` would confine whatever embedded stuff may be +>> doing. (Meaning, I didn't expect that the *img*'s alignment would +>> propagate to the *p*'s and would thus be visible from the outside.) +>> +>> I confirm (Firefox, Ubuntu jaunty) that your picture page is being +>> shown correctly -- thus I suppose that there's a buglet in our CSS +>> scripts again... +>> +>> --[[tschwinge]] + +>>> It seems, the 'align=right' parameter gets filtered in my installation +>>> Are there other plugins, that could throw the parameter away? +>>> --[[jwalzer]] + +>>>> Can't think of anything. htmlscrubber doesn't; tidy doesn't. +>>>> --[[Joey]] diff --git a/doc/bugs/inline_breaks_PERMALINK_variable.mdwn b/doc/bugs/inline_breaks_PERMALINK_variable.mdwn new file mode 100644 index 000000000..fc891bb25 --- /dev/null +++ b/doc/bugs/inline_breaks_PERMALINK_variable.mdwn @@ -0,0 +1,25 @@ +in 3.20091017 the following inline + +> `\[[!inline pages="internal(foo/bar/baz/*)" show=3 archive="yes" feeds="no" template="sometemplates"]]` + +with sometemplate being + +> `<p><a href="<TMPL_VAR PERMALINK>"><TMPL_VAR TITLE></a> (<TMPL_VAR CTIME>)</p>` + +produced output that links nowhere (`<a href="">`) while the other variables do fine. This problem does not occur in 3.1415926. + +> This must be caused by an optimisation that avoids reading the page +> content when using a template that does not use CONTENT. +> +> I guess that it needs to instead check all the variables the template +> uses, and read content if PERMALINK, or probably any other unknown +> variable is used. Unfortunatly, that will lose the optimisation +> for the archivepage template as well -- it also uses PERMALINK. +> +> So, revert the optimisation? Or, make meta gather the permalink +> data on scan? That seems doable, but is not a general fix for +> other stuff that might be a) used in a template and b) gathered +> at preprocess time. 
+> +> For now, I am going with the special case fix of fixing meta. I may need +> to go for a more general fix later. --[[Joey]] [[!tag done]] diff --git a/doc/bugs/inline_plugin_sets_editurl_even_when_editpage_is_disabled.html b/doc/bugs/inline_plugin_sets_editurl_even_when_editpage_is_disabled.html new file mode 100644 index 000000000..62c91a932 --- /dev/null +++ b/doc/bugs/inline_plugin_sets_editurl_even_when_editpage_is_disabled.html @@ -0,0 +1,16 @@ +see subject, simple patch below +<pre> +--- a/IkiWiki/Plugin/inline.pm ++++ b/IkiWiki/Plugin/inline.pm +@@ -371,7 +371,8 @@ sub preprocess_inline (@) { + } + if (length $config{cgiurl} && defined $type) { + $template->param(have_actions => 1); +- $template->param(editurl => cgiurl(do => "edit", page => $page)); ++ $template->param(editurl => cgiurl(do => "edit", page => $page)) ++ if IkiWiki->can("cgi_editpage"); + } + } +</pre> + +[[done]] --[[Joey]] diff --git a/doc/bugs/inline_raw_broken_on_unknown_pagetype.mdwn b/doc/bugs/inline_raw_broken_on_unknown_pagetype.mdwn new file mode 100644 index 000000000..19aa94e7e --- /dev/null +++ b/doc/bugs/inline_raw_broken_on_unknown_pagetype.mdwn @@ -0,0 +1,29 @@ +When trying to insert the raw content of an attached shell script +called `whatever` using: + + \[[!inline pages="whatever" raw="yes"]] + +The generated HTML contains: + + \[[!inline Erreur: Can't call method "param" on an undefined value + at /usr/local/share/perl/5.10.0/IkiWiki/Plugin/inline.pm + line 346.]] + +Looking at the inline plugin's code, it is clear that `$template` is +undef in such a situation. Defining `$template` just before line 346, +in case it's not defined, removes the error message, but nothing +gets inlined as `get_inline_content` returns the empty string in +this situation. + +If we explicitely don't want to allow raw inlining of unknown page +types, ikiwiki should output a better error message. + +> I have made it just do a direct include if the page type is not known, in +> raw mode. That seems useful if you want to include some other file right +> into a page. You could probably even wrap it in a format directive. +> +> It does allow including binary files right into a page, but nothing is +> stopping you pasting binary data right into the edit form either, so +> while annoying I don't think that will be a security problem. --[[Joey]] + +[[done]] diff --git a/doc/bugs/inline_skip_causes_empty_inline.mdwn b/doc/bugs/inline_skip_causes_empty_inline.mdwn new file mode 100644 index 000000000..e1cbc5470 --- /dev/null +++ b/doc/bugs/inline_skip_causes_empty_inline.mdwn @@ -0,0 +1,10 @@ +When using the [[directive/inline]] directive with the skip parameter i get +emtpy list inline (no output at all). The same inline used to work before +but not in 3.20091031. + +> I need more information to help. Skip is working as expected here. +> Can I download/clone your wiki? --[[Joey]] + +>> The bug occurs only together with archive="yes" as I Just found out: + +>>> Thanks, [[fixed|done]] in git. --[[Joey]] diff --git a/doc/bugs/librpc-xml-perl_0.69_breaks_XML-RPC_plugins.mdwn b/doc/bugs/librpc-xml-perl_0.69_breaks_XML-RPC_plugins.mdwn new file mode 100644 index 000000000..d8def82aa --- /dev/null +++ b/doc/bugs/librpc-xml-perl_0.69_breaks_XML-RPC_plugins.mdwn @@ -0,0 +1,13 @@ +Upgrading to `librpc-xml-perl` 0.69-1 on debian breaks external XML-RPC plugins (such as [[rst]]). 
+Refresing the wiki fails with an error message like this: + + RPC::XML::Parser::new: This method should have + been overridden by the RPC::XML::Parser class at + /usr/share/perl5/RPC/XML/Parser.pm line 46, <GEN1> line 30. + +It appears an incompatible change in the library creates this problem. + +`librpc-xml-perl` version 0.67-1 works. The easiest solution is to downgrade to this version, +or disable XML-RPC plugins if they are not needed. + +[[fixed|done]] --[[Joey]] diff --git a/doc/bugs/login_page_non-obvious_with_openid.mdwn b/doc/bugs/login_page_non-obvious_with_openid.mdwn index 1d087985a..9aa702037 100644 --- a/doc/bugs/login_page_non-obvious_with_openid.mdwn +++ b/doc/bugs/login_page_non-obvious_with_openid.mdwn @@ -36,7 +36,7 @@ If you want to keep it as one form, then perhaps using some javascript to disabl > that allows modifying that form, but does not allow creating a separate > form. The best way to make it obvious how to use it currently is to just > disable password auth, then it's nice and simple. :-) Javascript is an -> interesting idea. It's also possible to write a custom [[signin.tmpl wikitemplates]] that +> interesting idea. It's also possible to write a custom [[templates]] that > is displayed instead of the regular signin form, and it should be > possible to use that to manually lay it out better than FormBuilder > manages with its automatic layout. --[[Joey]] @@ -44,4 +44,4 @@ If you want to keep it as one form, then perhaps using some javascript to disabl > I've improved the form, I think it's more obvious now that the openid > stuff is separate. Good enough to call this [[done]]. I think. --[[Joey]] ->> Looks good, thanks! :-) -- [[AdamShand]]
\ No newline at end of file +>> Looks good, thanks! :-) -- [[AdamShand]] diff --git a/doc/bugs/login_page_should_note_cookie_requirement.mdwn b/doc/bugs/login_page_should_note_cookie_requirement.mdwn index 96686053c..17ac12b34 100644 --- a/doc/bugs/login_page_should_note_cookie_requirement.mdwn +++ b/doc/bugs/login_page_should_note_cookie_requirement.mdwn @@ -31,3 +31,9 @@ Best of all would be to use URL-based or hidden-field-based session tokens if co >> don't look static. Are they really? --[MJR](http://mjr.towers.org.uk) >>> As soon as you post an edit page, you are back to a static website. + +>>> It is impossible to get to an edit page w/o a cookie, unless +>>> anonymous edits are allowed, in which case it will save. No data loss. +>>> Since noone is working on this, and the nonsense above has pissed me +>>> off to the point that I will certianly never work on it, I'm going to +>>> close it. --[[Joey]] [[done]] diff --git a/doc/bugs/map_fails_to_close_ul_element_for_empty_list.mdwn b/doc/bugs/map_fails_to_close_ul_element_for_empty_list.mdwn index 5e842ca7f..ad0f506f2 100644 --- a/doc/bugs/map_fails_to_close_ul_element_for_empty_list.mdwn +++ b/doc/bugs/map_fails_to_close_ul_element_for_empty_list.mdwn @@ -66,8 +66,6 @@ Patch: >>>>>> The current arrangement looks fine to me. Thanks. --[[harishcm]] -[[!template id=gitbranch author="[[harishcm]]" branch=smcv/ready/harishcm-map-fix]] - > [[merged|done]] --[[Joey]] Patch: diff --git a/doc/bugs/map_sorts_by_pagename_and_not_title_when_show__61__title_is_used.mdwn b/doc/bugs/map_sorts_by_pagename_and_not_title_when_show__61__title_is_used.mdwn new file mode 100644 index 000000000..d12414d55 --- /dev/null +++ b/doc/bugs/map_sorts_by_pagename_and_not_title_when_show__61__title_is_used.mdwn @@ -0,0 +1,20 @@ +The [[ikiwiki/directive/map]] directive sort by pagename. That looks kind of odd, when used together with show=title. I would expect it to sort by title then. + +> This would be quite hard to fix. Map sorts the pages it displays by page +> name, which has the happy effect of making "foo/bar" come after "foo"; +> which it *has* to do, so that it can be displayed as a child of the page +> it's located in. If sorting by title, that wouldn't hold. So, map +> would have to be effectively totally rewritten, to build up each group +> of child pages, and then re-sort those. --[[Joey]] + +>> Ok, you are right, that does would break the tree. This made me think that I do not +>> need to generate a tree for my particular use case just a list, so i thought i could use [[ikiwiki/directive/inline]] instead. +>> This created two new issues: +>> +>> 1. inline also does sort by pagename even when explicitly told to sort by title. +>> +>> 2. I cannot get inline to create a list when the htmltidy plugin is switched on. I have a template which is enclosed in an li tag, and i put the ul tag around the inline manually, but htmltidy breaks this. --martin + +>>>> You might want to check if the [[plugins/contrib/report]] plugin solves your problem. It can sort by title, among other things. --[[KathrynAndersen]] + +>> See also: [[todo/sort_parameter_for_map_plugin_and_directive]] --[[smcv]] diff --git a/doc/bugs/minor:_tiny_rendering_error.mdwn b/doc/bugs/minor:_tiny_rendering_error.mdwn new file mode 100644 index 000000000..b2e07eef9 --- /dev/null +++ b/doc/bugs/minor:_tiny_rendering_error.mdwn @@ -0,0 +1,5 @@ +`\[[!inline]]` is rendered with a space in front of the first closing bracket. 
--[[tschwinge]] + +> I don't think that complicating the directive parser +> is warranted by the minorness of this bug. The result that it outputs is +> still valid. --[[Joey]] diff --git a/doc/bugs/misctemplate_does_not_respect_the_current_page___40__if_any__41__.mdwn b/doc/bugs/misctemplate_does_not_respect_the_current_page___40__if_any__41__.mdwn new file mode 100644 index 000000000..3b0347f5f --- /dev/null +++ b/doc/bugs/misctemplate_does_not_respect_the_current_page___40__if_any__41__.mdwn @@ -0,0 +1,101 @@ +I really like the new approach to having only one main template "page.tmpl". For instance, it improves previews during edits. +But it causes some nasty bugs for plugins that use the pagetemplate hook. It is especially visible with the [[plugins/sidebar]] plugin. + +## Some examples + +### A first example + +* activate sidebars globally and cgi +* create "/sidebar.mdwn" with "[<span></span>[foo]]" inside +* create "/foo.mdwn" with "hello!" inside +* create "/bar.mdwn" +* run ikiwiki +* with the web browser, go to the page "bar" +* notice the sidebar, click on "foo" +* notice the page "foo" is now displayed (hello!) +* return to the page "bar" and click "edit" +* notice the sidebar is still here, click on "foo" +* -> Problem: 404, the browser goes to "/bar/foo" +* -> Was expected: the browser goes to "/foo" + +> You must have a locally modified `page.tmpl` that omits the "TMPL_IF DYNAMIC" +> that adds a `<base>` tag. That is needed to make all links displayed by +> cgis work reliably. Not just in this page editing case. +> The [[version_3.20100515]] announcement mentions that you need to +> update old `page.tmpl` files to include that on upgrade. --[[Joey]] + +>> I followed the announcement. I also disabled my custom page.tmpl to confirm the bug. I even produced a step-by-step example to reproduce the bug. +>> In fact, the base tag works for the page links (the content part) but does not work for the sidebar links (the sidebar part) since the sidebar links are generated in the context of the root page. +>> In the example above: +>> +>> * base="http://www.example.com/bar" relative_link_in_bar="something" -> absolute_link_in_bar = "http://www.example.com/bar/something" (that is fine) +>> * base="http://www.example.com/bar" relative_link_in_sidebar="foo" (because generated in the context of the root page) -> absolute_link_in_sidebar = "http://www.example.com/bar/foo" (that is not fine) +>> +>> The committed fix works for previewing, but not in other cases: links are still broken. Please just follow the example step-by-step to reproduce it (I just retried it with a "fixed" version: Debian 3.20100610). If you cannot reproduce, please say it explicitly instead of guessing about my inability to read changelogs. -- [[JeanPrivat]] + +>>> Sorry if my not seeing the bug offended you. [[Fixed|done]] --[[Joey]] + +>>>> Thanks! --[[JeanPrivat]] (I'm not offended) + +### A second example + +* create "/bar/sidebar.mdwn" with "world" +* run ikiwiki +* with the web browser, go to the page "bar" +* notice the sidebar displays "world" +* click "edit" +* -> Problem: the sidebar now shows the foo link (it is the root sidebar!) +* -> Was expected: the sidebar displays "world" + +> I think it's a misconception to think that the page editing page is the same +> as the page it's editing. If you were deleting that page, would you expect +> the "are you sure" confirmation page to display the page's sidebar?
+> --[[Joey]] + +>> It is a very good point and could be argued: +>> +>> * for a dynamic page, is the root context more legitimate than the current page context? +>> * when clicking the Edit link, does the user expect to remain in the "same page"? +>> +>> But, as long as something sensible is displayed and the links work, I'm OK with any choice. -- [[JeanPrivat]] + +### A last example + +* with the web browser edit the page "bar" +* type <code>[<span></span>[!sidebar content="goodby"]]</code> +* click preview +* -> Problem: the sidebar still displays the foo link +* -> Was expected: the sidebar displays "goodby" + +> In the specific case of previewing, it is indeed a bug that the +> right sidebar is not displayed. And replacing the regular sidebar +> with the one from the previewed page is probably the best we can do.. +> displaying 2 sidebars would be confusing, and the `page.tmpl` can +> put the sidebar anywhere so we can't just display the preview sidebar +> next to the rest of the page preview. --[[Joey]] + +>> The behavior is fine for me. However, some nitpicking (feel free to ignore): +>> +>> * If the sidebar is replaced (making the previewing in-place), for consistency, should not the previewed content also be shown in-place? i.e. above the form part +>> * there is no way to come back (without saving or canceling) to the root context (e.g. displaying the root sidebar) i.e. some sort of unpreviewing. +>> +>> -- [[JeanPrivat]] + +## Some superficial hacking + +With the following workaround hacks, I managed to solve the 3 examples shown above: + +1- edit IkiWiki/Plugin/editpage.pm and call showform with additional page and destpage parameters: +<pre>showform($form, \@buttons, $session, $q, forcebaseurl => $baseurl, page => $page, destpage => $page);</pre> + +2- edit /usr/share/perl5/IkiWiki.pm and modify the misctemplate function to use the given page and destpage: +<pre>my %params=@_; +shift->(page => $params{page}, destpage => $params{destpage}, template => $template);</pre> + +I do not guarantee (I do not even expect) that it is the proper way to solve +this bug but it may help developers to find and solve the real problem. + +> Oh, it's pretty reasonable. I don't think it breaks anything. :) +> I modified it a bit, and explicitly made it *not* "fix" the second example. +> --[[Joey]] +>> I removed the done tag (I suspect it is the way to reopen bugs) -- [[JeanPrivat]] diff --git a/doc/bugs/nested_raw_included_inlines.mdwn b/doc/bugs/nested_raw_included_inlines.mdwn index 33433e235..92ea4c4ef 100644 --- a/doc/bugs/nested_raw_included_inlines.mdwn +++ b/doc/bugs/nested_raw_included_inlines.mdwn @@ -32,3 +32,20 @@ Am I missing something? Is this a bug or Ikiwiki not supposed to support this us > currently merges pagespecs, though - maybe the patches I suggested for > [[separating_and_uniquifying_pagespecs|todo/should_optimise_pagespecs]] > would help? --[[smcv]] + +>> That, or something seems to have helped in the meantime... +>> Actually, I think it was the [[transitive_dependencies]] support +>> that did it, though smcv's pagespec stuff was also a crucial improvement. +>> +>> Anyhoo: + + joey@gnu:~/tmp>touch testcase/page2.mdwn + joey@gnu:~/tmp>ikiwiki -v testcase html + refreshing wiki..
+ scanning page2.mdwn + building page2.mdwn + building page1.mdwn, which depends on page2 + building page0.mdwn, which depends on page1 + done + +>> I happily think this is [[done]] --[[Joey]] diff --git a/doc/bugs/pagecount_is_broken.mdwn b/doc/bugs/pagecount_is_broken.mdwn index 101230d94..57df6b75d 100644 --- a/doc/bugs/pagecount_is_broken.mdwn +++ b/doc/bugs/pagecount_is_broken.mdwn @@ -1,3 +1,4 @@ -The [[plugins/pagecount]] plugin seems to be broken, as it claims there are [[!pagecount ]] pages in this wiki. (if it's not 0, the bug is fixed) +The [[plugins/pagecount]] plugin seems to be broken, as it claims there are +\[[!pagecount ]] pages in this wiki. (if it's not 0, the bug is fixed) [[fixed|done]] --[[Joey]] diff --git a/doc/bugs/pagemtime_in_refresh_mode.mdwn b/doc/bugs/pagemtime_in_refresh_mode.mdwn new file mode 100644 index 000000000..f926ec86c --- /dev/null +++ b/doc/bugs/pagemtime_in_refresh_mode.mdwn @@ -0,0 +1,28 @@ +I'd like a way to always ask the RCS (Git) to update a file's mtime in +refresh mode. This is currently only done on the first build, and later +for `--gettime --rebuild`. But always rebuilding is too heavy-weight for +this use-case. My options are to either manually set the mtime before +refreshing, or to have ikiwiki do it at command. I used to do the +former, but would now like the latter, as ikiwiki now generally does this +timestamp handling. + +From a quick look, the code in `IkiWiki/Render.pm:find_new_files` is +relevant: `if (! $pagemtime{$page}) { [...]`. + +How would you like to tackle this? + +--[[tschwinge]] + +> This could be done via a `needsbuild` hook. The hook is passed +> the list of changed files, and it should be safe to call `rcs_getmtime` +> and update the `pagemtime` for each. +> +> That lets the feature be done by a plugin, which seems good, since +> `rcs_getmtime` varies between very slow and not very fast, depending on +> VCS. +> +> AFAICS, the only use case for doing this is if you commit changes and +> then delay pushing them to a DVCS repo. Since then the file mtime will +> be when the change was pushed, not when it was committed. But I've +> generally felt that recording when a change was published to the repo +> of a wiki as its mtime is good enough. --[[Joey]] diff --git a/doc/bugs/pages_missing_top-level_directory.mdwn b/doc/bugs/pages_missing_top-level_directory.mdwn new file mode 100644 index 000000000..77c31cd27 --- /dev/null +++ b/doc/bugs/pages_missing_top-level_directory.mdwn @@ -0,0 +1,78 @@ +Hi, + +I've rebuilt two sites now, and anything that requires a working directory structure isn't working properly. I have no idea how it's doing this. I don't see anything in my templates, and I haven't messed around with the back-end code much. + +An example would show this best I think. + +<pre> +/ <- root of site +/About/ <- sub-directory + /Policy/ <- sub-sub- +</pre> + +When you're on /About/, any generated links get mapped to /Policy/ and NOT /About/Policy/ - of course this results in a 404 error. + +I used to be able to use relative links or absolute ones to get the links I want, and now I can't do either. The generated link results in a 404 due to the stripping of a directory. + +I don't know if it's related to the fact that I have one ikiwiki install under another (/blog/ under / is also ikiwiki), but both are FUBAR. + +> what do you mean by generated links: do you mean the output of +> [[ikiwiki/wikilink]]s? Or are you generating links some other way? 
+> When you say "on /About/, any generated links get mapped to +> /Policy/ and NOT /About/Policy" can you provide an example of what +> source generates the link? -- [[Jon]] + +>> No, a \[[map]] call, such as: +>> +>> (actual code)<br /> +>> = = = = =<br /> +>> \[[!map pages="About/*" show="title"]]<br /> +>> = = = = =<br /> +>> +>> The end result is:<br /> +>> (actual code) +>> +<pre> +<div class="map"> +<ul> +<li><a class="mapitem" href="./Policy/">Policy</a> +<ul> +<li><a class="mapitem" href="./Policy/Microblog/">Microblogging subscription policy</a> +</li> +</ul> +</li> +</ul> +</div> +</pre> + +> I'm also confused about what is generating the links. The map directive? +> You? --[[Joey]] + +>> see above :) + +>> I suspect this is due to git scanning everything under the pwd of the .git/ directory, but not totally so. + +>>> Ikiwiki never, ever, looks in directories with names starting with a +>>> dot. --[[Joey]] + +>> Other ikiwiki sites I have don't do this, and work OK, on the same server, but different docroots. + +>>> Well, I've moved my blog to under my site's docroot - in terms of git +>>> and ikiwiki - and it's still cutting out a whole directory level. I +>>> have no idea what's going on. I need to check the code. The site is at +>>> http://simonraven.kisikew.org/ - if you follow the "About" link, you'll +>>> understand exactly what's going on, if you look at the URL in your +>>> status bar (or under your cursor if you're using a text browser). + +>>>> Your page contains the following in its html: +>>>> `<base href="../" />` +>>>> +>>>> Given a link like "./Policy/", which is *correct*, and when on the +>>>> About page will normally link to the About/Policy page, this causes +>>>> the link to really link to ".././Policy/" which is of course broken. +>>>> +>>>> Ikiwiki's standard page templates do not contain this base tag, so +>>>> I guess your customised templates are broken. --[[Joey]] [[done]] + +>>>>> I totally forgot about that tag... good catch. I was thinking it was my template that was broken, since yesterday, but I couldn't see what. Thank you very much for your eyes. + diff --git a/doc/bugs/pagespec:_tagged__40____41____44___globbing.mdwn b/doc/bugs/pagespec:_tagged__40____41____44___globbing.mdwn new file mode 100644 index 000000000..f9cb37487 --- /dev/null +++ b/doc/bugs/pagespec:_tagged__40____41____44___globbing.mdwn @@ -0,0 +1,36 @@ +With the current HEAD (b10d353490197b576ef7bf2e8bf8016039efbd2d), +globbing in `tagged()` pagespecs doesn't work for me. For example, +`tagged(*)` doesn't match any pages. (It does in this wiki installation +here, though.) + +I did not yet do any testing to figure out when this broke. + +--[[tschwinge]] + +[[!map pages="*/a* and tagged(*ose)"]] + +> Are you sure that `tagged()` ever matches pages there? Take globbing +> out of the equasion. +> +> This could be as simple as you having not rebuilt the wiki +> on upgrade to the version that tracks tagged links. --[[Joey]] + +>> Yes, it is a globbing issue: + +>> \[[!map pages="tagged(open_i*ue_gdb)" show=title]] + +>> ... doesn't show anything. + +>> \[[!map pages="tagged(open_issue_gdb)" show=title]] + +>> ... does show a map of eight pages. Also, it's working fine on the +>> autotags pages. + +>> --[[tschwinge]] + +>>> Only way I can reproduce something like this is if tagbase is not set. +>>> I have fixed a bug there, see if it works for you? +>>> --[[Joey]] + +>>>> This is now indeed [[fixed|done]] (thanks!) -- even though I already +>>>> did have tagbase set. 
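+
+For reference, a minimal sketch of the configuration this fix assumes: the
+stock [[plugins/tag]] plugin with `tagbase` set, plus a globbing `tagged()`
+pagespec. The option names are the plugin's own; the values here are only
+illustrative.
+
+    # excerpt from a .setup file (Perl syntax); values are illustrative
+    add_plugins => [qw{tag}],
+    tagbase     => 'tags',    # tag links become tags/<name>
+
+    # with tagbase set, a globbing pagespec such as tagged(open_*)
+    # should match pages tagged open_issue_gdb, open_issue_foo, ...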
diff --git a/doc/bugs/pagespec_error_on_refresh_but_not_rebuild.mdwn b/doc/bugs/pagespec_error_on_refresh_but_not_rebuild.mdwn new file mode 100644 index 000000000..e89545955 --- /dev/null +++ b/doc/bugs/pagespec_error_on_refresh_but_not_rebuild.mdwn @@ -0,0 +1,32 @@ +I'm getting this error message when I refresh my wiki: + + $ hg commit -u me -m "Minor corrections" + refreshing wiki.. + scanning htmletc/moco-conf-rooms.mdwn + building htmletc/moco-conf-rooms.mdwn + Use of uninitialized value in concatenation (.) or string at /usr/local/lib/perl5/site_perl/5.8.9/Text/Typography.pm line 542. + building sidebar.mdwn, which depends on htmletc/moco-conf-rooms + building contact.mdwn, which depends on sidebar + building 500.mdwn, which depends on sidebar + Use of uninitialized value in concatenation (.) or string at /usr/local/lib/perl5/site_perl/5.8.9/Text/Typography.pm line 542. + building ceramics.mdwn, which depends on sidebar + building glossary.mdwn, which depends on sidebar + syntax error in pagespec "internal(glossary/comment_*)" + warning: post-commit hook exited with status 2 + +But there is no error if I use `ikiwiki --rebuild` to regenerate the whole thing. + +> You neglect to say what version of ikiwiki this is, +> or give any information to reproduce the bug. +> +> My guess: A version older than 3.20100403, which included +> [this bugfix](http://git.ikiwiki.info/?p=ikiwiki;a=commitdiff;h=799b93d258bad917262ac160df74136f05d4a451), +> which could lead to incorrect "syntax error in pagespec" +> that only happened some of the time. +> +> (The Text::Typography warning seems probably unrelated.) +> --[[Joey]] + +>> I'm sorry, I don't know what I was thinking there. It's ikiwiki 3.20100212, and manually applying the patch you linked to made the bug go away. (Upgrading ikiwiki is a pain on nearlyfreespeech, especially if you don't want to keep the build directory around -- please consider making ikiwiki runnable directly from a git clone.) + +[[!meta link="done"]] diff --git a/doc/bugs/pagetitle_function_does_not_respect_meta_titles.mdwn b/doc/bugs/pagetitle_function_does_not_respect_meta_titles.mdwn index be14e5126..c6e3cd4fd 100644 --- a/doc/bugs/pagetitle_function_does_not_respect_meta_titles.mdwn +++ b/doc/bugs/pagetitle_function_does_not_respect_meta_titles.mdwn @@ -144,7 +144,7 @@ So, looking at your meta branch: --[[Joey]] has no title, then A will display the link as "B". Now page B is modified and a title is added. Nothing updates "A". The added overhead of rebuilding every page that links to B when B is - changed (as the `postscan` hook of the po plugin does) is IMHO a killer. + changed (as the `indexhtml` hook of the po plugin does) is IMHO a killer. That could be hundreds or thousands of pages, making interactive editing way slow. This is probably the main reason I had not attempted this whole thing myself. IMHO this calls for some kind of intellegent dependency diff --git a/doc/bugs/plugins__47__relativedate_depends_on_locale_at_setup_file.mdwn b/doc/bugs/plugins__47__relativedate_depends_on_locale_at_setup_file.mdwn new file mode 100644 index 000000000..a9a39ac47 --- /dev/null +++ b/doc/bugs/plugins__47__relativedate_depends_on_locale_at_setup_file.mdwn @@ -0,0 +1,16 @@ +[[plugins/relativedate]] does not works when russian locale defined at setup file (locale => 'ru_RU.UTF-8'). This is happen because javascript for this plugin takes either elements title or content itself. 
If russian locale is turned on then title generated on russian language and JS can't convert it into Date object. innerHTML is language independent (YYYY-MM-DD HH:mm) always. + +If I switch locale to en_US.UTF-8 then this plugin correctly parses text date and print relative date. But when I mouseover on date I see unusual formating of the date (it uses AM/PM format while russians use 24-h notation). + +P.S. All pages but RecentChanges show well-formated date. RecentChanges show date formated using locale. Anyway, plugin does not work without en_US locale. + +> [[Fixed|done]]. Now it uses C locale for the date put in the title, +> that is used by relativedate. The mouseover will display the date in your +> native locale. +> +> Only exception is that when javascript is disabled... then +> relativedate can't work, so instead you will see your localized date +> displayed; but on mouseover you will get shown the C locale date. +> --[[Joey]] + +>> Thanks. diff --git a/doc/bugs/po:_double_commits_of_po_files.mdwn b/doc/bugs/po:_double_commits_of_po_files.mdwn new file mode 100644 index 000000000..a871785be --- /dev/null +++ b/doc/bugs/po:_double_commits_of_po_files.mdwn @@ -0,0 +1,19 @@ +When adding a new english page, the po files are created, committed, +and then committed again. The second commit makes this change: + + -"Content-Type: text/plain; charset=utf-8\n" + -"Content-Transfer-Encoding: ENCODING" + +"Content-Type: text/plain; charset=UTF-8\n" + +"Content-Transfer-Encoding: ENCODING\n" + +Same thing happens when a change to an existing page triggers a po file +update. --[[Joey]] + +> * The s/utf-8/UTF-8 part has been fixed. +> * The ENCODING\n part is due to an inconsistency in po4a, which +> I've just send a patch for. --[[intrigeri]] + +>> I resubmitted the patch to po4a upstream, sending it this time to +>> their mailing-list: +>> [post archive](http://lists.alioth.debian.org/pipermail/po4a-devel/2010-July/001897.html). +>> --[[intrigeri]] diff --git a/doc/bugs/po:_new_pages_not_translatable.mdwn b/doc/bugs/po:_new_pages_not_translatable.mdwn new file mode 100644 index 000000000..c19f66594 --- /dev/null +++ b/doc/bugs/po:_new_pages_not_translatable.mdwn @@ -0,0 +1,12 @@ +Today I added a new English page to l10n.ikiwiki.info. When I saved, +the page did not have the translation links at the top. I waited until +the po plugin had, in the background, created the po files, and refreshed; +still did not see the translation links. Only when I touched the page +source and refreshed did it finally add the translation links. +I can reproduce this bug in a test site. --[[Joey]] + +> I could reproduce this bug at some point during the merge of a buggy +> version of my ordered slave languages patch, but I cannot anymore. +> Could you please try again? --[[intrigeri]] + +>> Cannot reproduce with 3.20100722, [[done]] I guess. --[[Joey]] diff --git a/doc/bugs/po:_po_files_instead_of_html_files.mdwn b/doc/bugs/po:_po_files_instead_of_html_files.mdwn new file mode 100644 index 000000000..933e348c4 --- /dev/null +++ b/doc/bugs/po:_po_files_instead_of_html_files.mdwn @@ -0,0 +1,6 @@ +On the home page of my wiki, when i click on the link "ikiwiki", i get the english file instead of the french file. +At the bottom of this page, there is the "Links" line: +Links: index index.fr templates templates.fr +When i click on "templates.fr", i get the po.file instead of html. + + Sorry for the noise! I set "po_master_language" to fr and all was ok. [[done]]. 
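+
+For anyone hitting the same symptom, a sketch of the setting involved; the
+exact value syntax for the [[plugins/po]] plugin has varied between ikiwiki
+releases, so treat this as illustrative and check the plugin documentation
+for your version:
+
+    # excerpt from a .setup file; syntax is illustrative
+    po_master_language => { code => 'fr', name => 'Français' },
+    # translations are configured separately via po_slave_languages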
diff --git a/doc/bugs/po:_ugly_messages_with_empty_files.mdwn b/doc/bugs/po:_ugly_messages_with_empty_files.mdwn new file mode 100644 index 000000000..d3992b6bc --- /dev/null +++ b/doc/bugs/po:_ugly_messages_with_empty_files.mdwn @@ -0,0 +1,6 @@ +If there are empty .mdwn files, the po plugin displays some ugly messages. + +> This is due to a bug in po4a (not checking definedness of a +> variable). One-liner patch sent. --[[intrigeri]] + +>> This seems to be fixed in po4a 0.40 => [[done]]. --[[intrigeri]] diff --git a/doc/bugs/po_plugin_adds_new_dependency.mdwn b/doc/bugs/po_plugin_adds_new_dependency.mdwn new file mode 100644 index 000000000..3ddcc30f2 --- /dev/null +++ b/doc/bugs/po_plugin_adds_new_dependency.mdwn @@ -0,0 +1,38 @@ +Was it intended that the po plugin add a new dependency? + +> Yes; see debian/control Build-Depends. However, I have made it disable +> building that is po4a is not available. [[done]] --[[Joey]] + + PERL5LIB=.. ./po2wiki underlay.setup + Failed to load plugin IkiWiki::Plugin::po: Can't locate Locale/Po4a/Common.pm in @INC (@INC contains: .. /Library/Perl/Updates/5.8.8 /System/Library/Perl/5.8.8/darwin-thread-multi-2level /System/Library/Perl/5.8.8 /Library/Perl/5.8.8/darwin-thread-multi-2level /Library/Perl/5.8.8 /Library/Perl /Network/Library/Perl/5.8.8/darwin-thread-multi-2level /Network/Library/Perl/5.8.8 /Network/Library/Perl /System/Library/Perl/Extras/5.8.8/darwin-thread-multi-2level /System/Library/Perl/Extras/5.8.8 /Library/Perl/5.8.6 /Library/Perl/5.8.1 /sw/lib/perl5/5.8.8/darwin-thread-multi-2level /sw/lib/perl5/5.8.8 /sw/lib/perl5/darwin-thread-multi-2level /sw/lib/perl5 /sw/lib/perl5/darwin /usr/local/lib/perl5/site_perl/5.8.8/darwin-thread-multi-2level /usr/local/lib/perl5/site_perl/5.8.8 /usr/local/lib/perl5/site_perl .) at ../IkiWiki/Plugin/po.pm line 13. + BEGIN failed--compilation aborted at ../IkiWiki/Plugin/po.pm line 13. + Compilation failed in require at (eval 27) line 2. + BEGIN failed--compilation aborted at (eval 27) line 2. + + make[1]: *** [po2wiki_stamp] Error 2 + make: *** [extra_build] Error 2 + +And it looks like this dependency is not easy to work around. The issue is that the newly translated base wiki means that the po plugin is being used by the build system. It is no longer optional. I've turned it off in my workspace like this: (heavy handed, but it lets me keep going until a proper fix is available) + + diff --git a/Makefile.PL b/Makefile.PL + index 602d8fb..68728b7 100755 + --- a/Makefile.PL + +++ b/Makefile.PL + @@ -42,7 +42,7 @@ extra_build: ikiwiki.out ikiwiki.setup docwiki + ./mdwn2man ikiwiki-makerepo 1 doc/ikiwiki-makerepo.mdwn > ikiwiki-makerepo.man + ./mdwn2man ikiwiki-transition 1 doc/ikiwiki-transition.mdwn > ikiwiki-transition.man + ./mdwn2man ikiwiki-update-wikilist 1 doc/ikiwiki-update-wikilist.mdwn > ikiwiki-update-wikilist.man + - $(MAKE) -C po + + # $(MAKE) -C po + + docwiki: ikiwiki.out + $(PERL) -Iblib/lib $(extramodules) $(tflag) ikiwiki.out -libdir . -setup docwiki.setup -refresh + @@ -114,7 +114,7 @@ extra_install: underlay_install + install ikiwiki.out $(DESTDIR)$(PREFIX)/bin/ikiwiki + install ikiwiki-makerepo ikiwiki-transition ikiwiki-update-wikilist $(DESTDIR)$(PREFIX)/bin/ + + - $(MAKE) -C po install DESTDIR=$(DESTDIR) PREFIX=$(PREFIX) + + # $(MAKE) -C po install DESTDIR=$(DESTDIR) PREFIX=$(PREFIX) + + # These might fail if a regular user is installing into a home + # directory. 
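+
+A rough sketch of what "disable building that if po4a is not available"
+can amount to: probe for the po4a Perl modules and skip the po build steps
+when they are missing. `Locale::Po4a::Common` is the module named in the
+error above; everything else here is illustrative, not the actual Makefile
+change.
+
+    #!/usr/bin/perl
+    # Skip the translated-underlay build quietly when po4a is absent.
+    use strict;
+    use warnings;
+
+    my $have_po4a = eval {
+        require Locale::Po4a::Common;
+        require Locale::Po4a::Chooser;
+        1;
+    };
+
+    if ($have_po4a) {
+        print "po4a found; building translated underlays\n";
+        # e.g. run po2wiki on the underlay setup file here
+    }
+    else {
+        print "po4a not installed; skipping translated underlays\n";
+    }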
diff --git a/doc/bugs/po_plugin_cannot_add_po_files_into_git.mdwn b/doc/bugs/po_plugin_cannot_add_po_files_into_git.mdwn new file mode 100644 index 000000000..8e3399611 --- /dev/null +++ b/doc/bugs/po_plugin_cannot_add_po_files_into_git.mdwn @@ -0,0 +1,34 @@ +po files are not added to git (error: /path/to/po/file not in repository tree) in my setup. + +I have set absolute path for srcdir = '/path/to/repo/doc/'. The root of my git repository is '/path/to/repo/'. When I enable the po plugin, it creates all po files and produces an error when it try to add the file saying that the /path/to/repo/doc/index.fr.po is not in the repository tree. + +I have no problem when I use an relative path like srcdir = '.'. + +I have an other issue with the po plugin when I set the srcdir to './doc/' (provided that my config file is in /path/to/repo). In this case the po plugin try to add 'doc/doc/index.fr.po' which does not exists (seems like the srcdir path is prepended twice). + +> You should never use a relative srcdir path with ikiwiki. +> +> I wonder what version of git you have there, since it works ok with the +> version I have here. But, the po plugin is definitly doing the wrong +> thing; it's telling git to add the po file with the full scrdir path +> rather than relative to its root. Fixed that. [[done]] --[[Joey]] + +>> Yeah, I figured for the relative path +>> Git version 1.6.3.3 (on both my dev and server machines) +>> +>> Here is an example of what I get when I update the po file on my laptop and I push to the master repository: + + From /srv/git/sb + 5eb4619..ecac4d7 master -> origin/master + scanning doc.fr.po + building doc.fr.po + building doc.mdwn, which depends on doc.fr + building recentchanges.mdwn, which depends on recentchanges/change_ecac4d7311b15a3a3ed03102b9250487315740bc + fatal: '/srv/www/sb.l.n/new/doc/doc.fr.po' is outside repository + 'git add /srv/www/sb.l.n/new/doc/doc.fr.po' failed: at /usr/share/perl5/IkiWiki/Plugin/git.pm line 161. + done + To ssh://git.lohrun.net/var/cache/git/songbook.git + 5eb4619..ecac4d7 master -> master + +>> The root repository used to run ikiwiki is `/srv/www/sb.l.n/new/` +>> -- [[AlexandreDupas]] diff --git a/doc/bugs/po_vs_templates.mdwn b/doc/bugs/po_vs_templates.mdwn new file mode 100644 index 000000000..d826546e6 --- /dev/null +++ b/doc/bugs/po_vs_templates.mdwn @@ -0,0 +1,48 @@ +The po plugin's protection against processing loops (i.e. the +alreadyfiltered stuff) is playing against us: the template plugin +triggers a filter hooks run with the very same ($page, $destpage) +arguments pair that is used to identify an already filtered page. + +Processing an included template can then mark the whole translation +page as already filtered, which prevented `po_to_markup` to be called on +the PO content. + +Symptoms: the unprocessed gettext file goes unfiltered to the +generated HTML. + +This has been fixed in my po branch. + +> My commit dcd57dd5c9f3265bb7a78a5696b90976698c43aa updates the +> bugfix in a much more elegant manner. Its main disadvantage is to +> add an (optional) argument to IkiWiki::filter. Please review. + +-- [[intrigeri]] + +>> Hmm. Don't like adding a fourth positional parameter to that (or +>> any really) function. +>> +>> I think it's quite possible that some of the directives that are +>> calling filter do so unnecessarily. For example, conditional, +>> cutpaste, more, and toggle each re-filter text that comes from the +>> page and so has already been filtered. They could probably drop +>> the filtering. 
template likewise does not need to filter the +>> parameters passed into it. Does it need to filter the template output? +>> Well, it allows the (deprecated) embed plugin to work on template +>> content, but that's about it. +>> +>> Note also that the only other plugin to provide a filter, txt, +>> could also run into similar problems as po has, in theory (it looks at +>> the page parameter and assumes the content is for the whole page). +>> +>> [[!template id=gitbranch branch=origin/filter-full author="[[joey]]"]] +>> So, I've made a filter-full branch, where I attempt to fix this +>> by avoiding unnecessary filtering. Can you check it and merge it into +>> your po branch and remove your other workarounds so I can merge? +>> --[[Joey]] + +>>> I merged your filter-full branch into my po branch and reverted my +>>> other workarounds. According to my tests this works ok. I'm glad +>>> you found this solution, as I didn't like changing the filter +>>> prototype. I believe you can now merge this code. --[[intrigeri]] + +[[!tag patch done]] diff --git a/doc/bugs/post-commit_hangs.mdwn b/doc/bugs/post-commit_hangs.mdwn new file mode 100644 index 000000000..32820d886 --- /dev/null +++ b/doc/bugs/post-commit_hangs.mdwn @@ -0,0 +1,47 @@ +# post-commit hangs + +I installed ikiwiki v3.14159 in /usr/local from tarball (/usr contains an older version). Having done so, and used ikiwiki-transition to update setup file, the post commit hook is now blocking in flock (as seen by ps). I should also mention that I added the goodstuff, attachment and remove plugins (which was the purpose of upgrading to v3). Any clues as how to debug/fix gratefully received. The wiki is publically viewable at wiki.sgcm.org.uk if that helps. + +> It's blocking when you do what? Save a page from the web? Make a commit +> to the underlaying VCS? Which VCS? These are all different code paths.. +> --[[Joey]] + +>> It's blocking when I run "ikiwiki --setup ikiwiki.setup" (which calls hg update, which calls ikiwiki --post-commit). +>> Hmm, maybe it's the recursive call to ikiwiki which is the problem. +>> The underlying VCS is mercurial. --Ali + +>>> You're not supposed to run ikiwiki -setup manually in your post commit hook. +>>> Doing so will certianly lead to a locking problem; it also forces ikiwiki to rebuild +>>> the entire wiki anytime a single page changes, which is very inefficient! +>>> +>>> Instead, you should use the `mercurial_wrapper` setting +>>> in the setup file, which will make ikiwiki generate a small +>>> executable expressly designed to be run at post commit time. +>>> Or, you can use the `--post-commit` option, as documented +>>> in [[rcs/mecurial]] --[[Joey]] + +>>>> I don't run ikiwiki --setup in the commit hook; I run ikiwiki --post-commit (as mentioned above). +>>>> I'm trying to run ikiwiki --setup from the command line after modifying the setup file. +>>>> ikiwiki --setup is calling hg update, which is calling ikiwiki --post-commit. Am I not supposed to do that? --Ali + +>>>>> No, I don't think that hg update should call ikiwiki anything. The +>>>>> [[hgrc_example|rcs/mercurial]] doesn't seem to configure it to do that? --[[Joey]] + +>>>>>> Ok, I'm not sure I understand what's going on, but my problem is solved. +>>>>>> +>>>>>> My hgrc used to say: +>>>>>> +>>>>>> [hooks] +>>>>>> +>>>>>> incoming.update = hg up +>>>>>> +>>>>>> update.ikiwiki = ikiwiki --setup /home/ikiwiki/ikiwiki.setup --post-commit +>>>>>> +>>>>>> I've now changed it to match the example page and it works. Thanks --Ali. 
+ +>>>>>>> [[done]] + +> Also, how have you arranged to keep it from seeing the installation in /usr? Perl could well be loading +> modules from the old installation, and if it's one with a different locking strategy that would explain your problem. --[[Joey]] + +>> Good point. Not knowing perl, I just assumed /usr/local would take precedence. I've now used "dpkg -r ikiwiki" to remove the problem. --Ali diff --git a/doc/bugs/post-update_hook_can__39__t_be_compiled_with_tcc.mdwn b/doc/bugs/post-update_hook_can__39__t_be_compiled_with_tcc.mdwn new file mode 100644 index 000000000..a8fb19888 --- /dev/null +++ b/doc/bugs/post-update_hook_can__39__t_be_compiled_with_tcc.mdwn @@ -0,0 +1,19 @@ +Thinking that any c compiler would do the job, I tried to use tcc with ikiwiki, as explicitely allowed by the Debian package dependencies. + +I installed `tcc` and `libc6-dev` (for `libcrt1`). The wrapper compilation was OK, but the wrapper fails to run correctly and dies with + + usage: ikiwiki [options] source dest + ikiwiki --setup configfile + +Everything works fine with gcc. + +versions: Debian lenny + backports + +> Seems that tcc does not respect changing where `environ` points as a way +> to change the environment seen after `exec` +> +> Given that the man page for `clearenv` suggests using `environ=NULL` +> if `clearenv` is not available, I would be lerry or using tcc to compile +> stuff, since that could easily lead to a security compromise of code that +> expects that to work. However, I have fixed ikiwiki to use `clearenv`. +> --[[Joey]] [[done]] diff --git a/doc/bugs/preview_pagestate.mdwn b/doc/bugs/preview_pagestate.mdwn new file mode 100644 index 000000000..7f7ec0976 --- /dev/null +++ b/doc/bugs/preview_pagestate.mdwn @@ -0,0 +1,13 @@ +If a change to a page is previewed, but not saved, `%pagestate` and +`%wikistate` can be changed, and saved. Actually, it's not limited to +those. Seems that spurious dependencies can be added, though existing +dependencies will at least not be removed. + +It calls saveindex to record state about files created on disk for the +preview. Those files will expire later. However, saveindex also +saves other state changes. + +Seems like it needs to isolate all state changes when previewing... ugh. +--[[Joey]] + +[[done]] diff --git a/doc/bugs/rebuild_after_changing_the_underlaydir_config_option.mdwn b/doc/bugs/rebuild_after_changing_the_underlaydir_config_option.mdwn new file mode 100644 index 000000000..8613ef03c --- /dev/null +++ b/doc/bugs/rebuild_after_changing_the_underlaydir_config_option.mdwn @@ -0,0 +1,12 @@ +It seems that rebuild a wiki (`ikiwiki --rebuild`) after changing the `underlaydir` config option doesn't remove the pages coming from the previous underlaydir. + +I've noticed this with the debian package version 3.20100102.3~bpo50+1. + +Perhaps it is possible to improve this or mention it in the manual page? + +--prosper + +> --rebuild causes ikiwiki to throw away all its info about what it built +> before, so it will never clean up pages that have been removed, by any +> means. Suggest you do a --refresh, possibly followed by a --rebuild +> if that is really necessary. 
--[[Joey]] diff --git a/doc/bugs/remove_orphaned_sparkline-php_from_Suggests.mdwn b/doc/bugs/remove_orphaned_sparkline-php_from_Suggests.mdwn index b4e2a1501..ab08c0b26 100644 --- a/doc/bugs/remove_orphaned_sparkline-php_from_Suggests.mdwn +++ b/doc/bugs/remove_orphaned_sparkline-php_from_Suggests.mdwn @@ -18,3 +18,5 @@ Thanks > rewriting the ikiwiki code to use it, *and* packaging that alternative > and maintaining it in Debian. So your suggestion doesn't make a lot of > sense; Debian should just find a maintainer for sparkline-php. --[[Joey]] + +[[done]] diff --git a/doc/bugs/removing_pages_with_utf8_characters.mdwn b/doc/bugs/removing_pages_with_utf8_characters.mdwn new file mode 100644 index 000000000..0d96aa75f --- /dev/null +++ b/doc/bugs/removing_pages_with_utf8_characters.mdwn @@ -0,0 +1,51 @@ +I have a page with the name "umläute". When I try to remove it, ikiwiki says: + +Error: ?umläute does not exist + +> I'm curious about the '?' in the "?umläute" message. Suggests that the +> filename starts with another strange character. Can I get a copy of a +> git repository or tarball containing this file? --[[Joey]] + +I wrote the following patch, which seems to work on my machine. I'm running on FreeBSD 6.3-RELEASE with ikiwiki-3.20100102.3 and perl-5.8.9_3. + + --- remove.pm.orig 2009-12-14 23:26:20.000000000 +0100 + +++ remove.pm 2010-01-18 17:49:39.000000000 +0100 + @@ -193,6 +193,7 @@ + # and that the user is allowed to edit(/remove) it. + my @files; + foreach my $page (@pages) { + + $page = Encode::decode_utf8($page); + check_canremove($page, $q, $session); + + # This untaint is safe because of the + + +> The problem with this patch is that, in a recent fix to the same +> plugin, I made `@pages` come from `$form->field("page")`, and +> that, in turn is already run through `decode_form_utf8` just above the +> code you patched. So I need to understand why that is apparently not +> working for you. (It works fine for me, even when deleting a file named +> "umläute" --[[Joey]] + +---- + +> Update, having looked at the file in the src of the wiki that +> is causing trouble for remove, it is: `uml\303\203\302\244ute.mdwn` +> And that is not utf-8 encoded, which, represented the same +> would be: `uml\303\244ute.mdwn` +> +> I think it's doubly-utf-8 encoded, which perhaps explains why the above +> patch works around the problem (since the page name gets doubly-decoded +> with it). The patch doesn't fix related problems when using remove, etc. +> +> Apparently, on apoca's system, perl encodes filenames differently +> depending on locale settings. On mine, it does not. Ie, this perl +> program always creates a file named `uml\303\244ute`, no matter +> whether I run it with LANG="" or LANG="en_US.UTF-8": +> +> perl -e 'use IkiWiki; writefile("umläute", "./", "baz")' +> +> Remains to be seen if this is due to the older version of perl used +> there, or perhaps FreeBSD itself. --[[Joey]] +> +> Update: Perl 5.10 fixed the problem. --[[Joey]] diff --git a/doc/bugs/rst_plugin_has_python_hardcode_in_shebang_line.mdwn b/doc/bugs/rst_plugin_has_python_hardcode_in_shebang_line.mdwn new file mode 100644 index 000000000..a594adc09 --- /dev/null +++ b/doc/bugs/rst_plugin_has_python_hardcode_in_shebang_line.mdwn @@ -0,0 +1,15 @@ +Current the rst plugin uses this shebang line: + + #!/usr/bin/python + +The problem is that rst plugin uses some feature (for example, iterator comprehension) which is unavailable on old version of Python. 
+ +So rst plugin will not work on a machine which has an old version of python in system path even though +the user have installed a new version of python in other place. For example, I am using ikiwiki with the rst plugin on Mac OS X 10.4 which ships python 2.3 but I do have python2.6 installed on /opt/local/bin/python (via macports). + +Thus I suggest to change the shebang line to: + + #!/usr/bin/env python + +> [[done]], although the irony of all the perl hashbangs in ikiwiki +> being hardcoded doesn't escape me. --[[Joey]] diff --git a/doc/bugs/rst_tweak.mdwn b/doc/bugs/rst_tweak.mdwn index 9d433e24e..8667a459b 100644 --- a/doc/bugs/rst_tweak.mdwn +++ b/doc/bugs/rst_tweak.mdwn @@ -41,3 +41,12 @@ what is supposed to happen? --Peter > That's why the [[plugin_page|plugins/rst]] has its note about > issues with wikilinks and directives. You'd have to put those inside > raw directives yourself to avoid rst escaping their result. --[[Joey]] + +You can also create a raw "role" which is at least easier than raw directives. + + .. role:: ikiwiki(raw) + :format: html + + :ikiwiki:`\[[WikiLink]]` + +A role assigns meaning to interpreted text (for example :acronym:`ABC`) or :PEP:`8`. --ulrik [kaizer.se] diff --git a/doc/bugs/some_but_not_all_meta_fields_are_stored_escaped.mdwn b/doc/bugs/some_but_not_all_meta_fields_are_stored_escaped.mdwn new file mode 100644 index 000000000..587771ba4 --- /dev/null +++ b/doc/bugs/some_but_not_all_meta_fields_are_stored_escaped.mdwn @@ -0,0 +1,44 @@ +[[!template id=gitbranch branch=smcv/unescaped-meta author="[[Simon_McVittie|smcv]]"]] +[[!tag patch]] +(Warning: this branch has not been tested thoroughly.) + +While discussing the [[plugins/meta]] plugin on IRC, Joey pointed out that +it stores most meta fields unescaped, but 'title', 'guid' and 'description' +are special-cased and stored escaped (with numeric XML/HTML entities). This +is to avoid emitting markup in the `<title>` of a HTML page, or in an RSS/Atom +feed, neither of which are subject to the [[plugins/htmlscrubber]]. + +However, having the meta fields "partially escaped" like this is somewhat +error-prone. Joey suggested that perhaps everything should be stored +unescaped, and the escaping should be done on output; this branch +implements that. + +Points of extra subtlety: + +* The title given to the [[plugins/search]] plugin was previously HTML; + now it's plain text, potentially containing markup characters. I suspect + that that's what Xapian wants anyway (which is why I didn't change it), + but I could be wrong... + + > AFAICS, this if anything, fixes a bug, xapian definitely expects + > unescaped text here. --[[Joey]] + +* Page descriptions in the HTML `<head>` were previously double-escaped: + the description was stored escaped with numeric entities, then that was + output with a second layer of escaping! In this branch, I just emit + the page description escaped once, as was presumably the intention. + +* It's safe to apply this change to a wiki and neglect to rebuild it + (assuming I implemented it correctly!), but until the wiki is rebuilt, + titles, descriptions and GUIDs for unchanged pages will appear + double-escaped on any page that inlines them in `quick=yes` mode, and + is rebuilt for some other reason. The failure mode is too much escaping + rather than too little, so it shouldn't be a security problem. 
+ +* Reverting this change, if applied, is more dangerous; until the wiki is + rebuilt, any titles, descriptions and GUIDs on unchanged pages that + contained markup could appear unescaped on any page that inlines them + in `quick=yes` mode, and is rebuilt for some other reason. The failure + mode here would be too little escaping, i.e. cross-site scripting. + +[[!tag done]] diff --git a/doc/bugs/stray___60____47__p__62___tags.mdwn b/doc/bugs/stray___60____47__p__62___tags.mdwn index 6e508ffda..99d6fe09f 100644 --- a/doc/bugs/stray___60____47__p__62___tags.mdwn +++ b/doc/bugs/stray___60____47__p__62___tags.mdwn @@ -13,3 +13,5 @@ I believe that this snippet in `IkiWiki.pm` might be the reason for the imbalanc } The fact that HTML in a `\[[!meta title]]` is added but then escaped might indicate that some other bug is involved. + +> [[done]] --[[Joey]] diff --git a/doc/bugs/support_for_openid2_logins.mdwn b/doc/bugs/support_for_openid2_logins.mdwn index 139a53760..a71ed7ba9 100644 --- a/doc/bugs/support_for_openid2_logins.mdwn +++ b/doc/bugs/support_for_openid2_logins.mdwn @@ -20,3 +20,5 @@ However both Perl OpenID 2.x implementations have not been released and are inco > I've tested with yahoo, and it works with the updated module. Sweet and > [[done]] --[[Joey]] + +## A quick fix for the impatient running stable is simply `sudo apt-get install libnet-openid-consumer-perl -t unstable` diff --git a/doc/bugs/svn_commit_failures_interpreted_as_merge_conflicts.mdwn b/doc/bugs/svn_commit_failures_interpreted_as_merge_conflicts.mdwn new file mode 100644 index 000000000..0c9bce4b9 --- /dev/null +++ b/doc/bugs/svn_commit_failures_interpreted_as_merge_conflicts.mdwn @@ -0,0 +1,21 @@ +I'm attempting a merge with the SVN plugin via the web interface +with ikiwiki-3.20100403 and subversion 1.6.11. + +The web interface says + + Your changes conflict with other changes made to the page. + + Conflict markers have been inserted into the page content. Reconcile the conflict and commit again to save your changes. + +However there are no merge conflict markers in the page. My apache error log says: + + [Fri Apr 30 16:43:57 2010] [error] [client 10.64.64.42] svn: Commit failed (details follow):, referer: https://unixwiki.ncl.ac.uk/ikiwiki.cgi + [Fri Apr 30 16:43:57 2010] [error] [client 10.64.64.42] svn: Authorization failed, referer: https://unixwiki.ncl.ac.uk/ikiwiki.cgi + +-- [[Jon]] + +> Only way for this to be improved would be for the svn plugin to +> explicitly check the file for conflict markers. I guess it could +> change the error message then, but the actual behavior of putting the +> changed file back in the editor so the user can recommit is about right +> as far as error recovery goes. --[[Joey]] diff --git a/doc/bugs/tag_plugin:_autotag_vs._staged_changes.mdwn b/doc/bugs/tag_plugin:_autotag_vs._staged_changes.mdwn new file mode 100644 index 000000000..e5526bedf --- /dev/null +++ b/doc/bugs/tag_plugin:_autotag_vs._staged_changes.mdwn @@ -0,0 +1,17 @@ +The autotag functionality of the tag plugin committed (when doing its +first commit) all changes that have been staged (in Git). I suggest it +should be restricted to the specific file only. --[[tschwinge]] + +> This is not specific to the tag plugin. Same can happen +> if you rename a file, or post a comment, or remove a file +> via web interface. All of these use `rcs_commit_staged`. +> +> This is why ikiwiki is supposed to have a checkout of +> the repository that it uses for its own purposes, and nobody else +> should mess with. 
There are various notes about that being needed here +> and there; you're free to not give ikiwiki its own repo, but you have to +> be aware that it can fight with you if you're making changes to the same +> repo. [[done]] --[[Joey]] + +>> Ack, that is reasonable. (And it's only been a very minor problem +>> during manual testing.) --[[tschwinge]] diff --git a/doc/bugs/tagged__40____41___matching_wikilinks.mdwn b/doc/bugs/tagged__40____41___matching_wikilinks.mdwn index e7e4af7c3..a211654f1 100644 --- a/doc/bugs/tagged__40____41___matching_wikilinks.mdwn +++ b/doc/bugs/tagged__40____41___matching_wikilinks.mdwn @@ -28,6 +28,8 @@ rationale on this, or what am I doing wrong, and how to achieve what I want? >> is valid. [[todo/matching_different_kinds_of_links]] is probably >> how it will eventually be solved. --[[Joey]] +>>> [[Done]]: `tagged` no longer matches other wikilinks. --[[smcv]] + > And this is an illustration why a clean work-around (without changing the software) is not possible: while thinking about [[todo/matching_different_kinds_of_links]], I thought one could work around the problem by simply explicitly including the kind of the relation into the link target (like the tagbase in tags), and by having a separate page without the "tagbase" to link to when one wants simply to refer to the tag without tagging. But this won't work: one has to at least once refer to the real tag page if one wants to talk about it, and this reference will count as tagging (unwanted). --Ivan Z. > But well, perhaps there is a workaround without introducing different kinds of links. One could modify the [[tag plugin|plugins/tag]] so that it adds 2 links to a page: for tagging -- `tagbase/TAG`, and for navigation -- `tagdescription/TAG` (displayed at the bottom). Then the `tagdescription/TAG` page would hold whatever list one wishes (with `tagged(TAG)` in the pagespec), and whenever one wants to merely refer to the tag, one should link to `tagdescription/TAG`--this link won't count as tagging. So, `tagbase/TAG` would become completely auxiliary (internal) link targets for ikiwiki, the users would edit or link to only `tagdescription/TAG`. --Ivan Z. diff --git a/doc/bugs/templateForRecentChangesMissingCloseSpan.mdwn b/doc/bugs/templateForRecentChangesMissingCloseSpan.mdwn new file mode 100644 index 000000000..5c322991a --- /dev/null +++ b/doc/bugs/templateForRecentChangesMissingCloseSpan.mdwn @@ -0,0 +1,26 @@ +In the template for ikiwiki's recent changes page + + /usr/share/ikiwiki/templates/change.tmpl + +there is a missing </span> tag after the + + <span class="changedate"><TMPL_VAR COMMITDATE> + +This results in the recentchanges/ page being invalid and rendering quite horrifyingly in Internet Exploder. + +[I'm running](http://wiki.shlrm.org) (linked so you can see the one I'm running if you need to) the latest version of ikiwiki, and I note that it's broken on [ikiwiki.info](http://validator.w3.org/check?uri=http%3A%2F%2Fikiwiki.info%2Frecentchanges%2F&charset=%28detect+automatically%29&doctype=Inline&group=0&user-agent=W3C_Validator%2F1.767) too :) + +[This one on debian](https://www.icanttype.org/recentchanges/) is somehow [valid](http://validator.w3.org/check?uri=https%3A%2F%2Fwww.icanttype.org%2F%2Frecentchanges%2F&charset=%28detect+automatically%29&doctype=Inline&group=0&user-agent=W3C_Validator%2F1.767), although it's using the same template. Perhaps there's an additional scrubbing going on his end. 
+ +Thanks, +David + +PS: I have fixed the template by hand on my server, so it will validate, however ikiwiki.info will not. + +> [[!template id="gitbranch" branch=smcv/trivia author="[[smcv]]"]] [[!tag patch]] +> Enabling either [[plugins/htmltidy]] or [[plugins/htmlbalance]] will automatically fix unbalanced +> markup like this; using [[plugins/comments]] without having one or other of those is a bad idea +> from the point of view of avoiding comment forgery, which is probably why icanttype.org works +> correctly. Anyway, I've fixed this in a branch: Joey, care to review smcv/trivia? --[[smcv]] + +[[done]], thanks guys --[[Joey]] diff --git a/doc/bugs/the_login_page_is_unclear_when_multiple_methods_exist.mdwn b/doc/bugs/the_login_page_is_unclear_when_multiple_methods_exist.mdwn new file mode 100644 index 000000000..70266c49c --- /dev/null +++ b/doc/bugs/the_login_page_is_unclear_when_multiple_methods_exist.mdwn @@ -0,0 +1,16 @@ +When multiple login methods are enabled, the ikiwiki login page lists one form per method, e.g. + + * one for openid + * one for local user/password store + +Followed by the "login" button underneath. It's not obvious to anyone unfamiliar with the software that these are distinct forms, or that there are multiple ways of logging in, etc. -- [[Jon]] + +> As discussed in [[login_page_non-obvious_with_openid]], +> architectural reasons disallow multiple forms, with multiple +> submit buttons. But the default style sheet includes +> a styling for the openid portion of the form that makes +> it visually distinct from the rest of the form. I'm sure the styling +> could be improved, but the current form does not seem too non-obvious +> to me, or to naive users in the field. --[[Joey]] + +>> [[done]], better fixed by new fancy openid login form. --[[Joey]] diff --git a/doc/bugs/toggle_expects_body_element_without_attributes.mdwn b/doc/bugs/toggle_expects_body_element_without_attributes.mdwn new file mode 100644 index 000000000..0b39346f4 --- /dev/null +++ b/doc/bugs/toggle_expects_body_element_without_attributes.mdwn @@ -0,0 +1,3 @@ +The toggle plugins checks for a `<body>` in the page; if not found, javascript tags are inserted at the top of the document. Since my page uses `<body onload="javascript:fixLinks()">`; a plain `<body>` is not found and I get script links before the docstring declaration. Please see the source of the following toggle-using page: http://kaizer.se/wiki/kupfer/ -- ulrik [kaizer.se] + +[[fixed|done]] --[[Joey]] diff --git a/doc/bugs/transitive_dependencies.mdwn b/doc/bugs/transitive_dependencies.mdwn new file mode 100644 index 000000000..c44fe7962 --- /dev/null +++ b/doc/bugs/transitive_dependencies.mdwn @@ -0,0 +1,94 @@ +If a sidebar contains a map, or inline (etc), one would expect a +add/remove of any of the mapped/inlined pages to cause a full wiki +rebuild. But this does not happen. + +If page A inlines page B, which inlines page C, a change to C will cause B +to be updated, but A will not "notice" that this means A needs to be +updated. + +One way to look at this bug is that it's a bug in where dependencies are +recorded when preprocessing the rendered or sidebar page. The current code +does: + + add_depends($params{page}, $somepage); + +Where `$params{page}` is page B. If this is changed to `$params{destpage}`, +then the dependency is added to page A, and updates to C cause it to +change. This does result in the page A's getting lots more dependency info +recorded than before (essentially a copy of all the B's dependency info). 
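+
+Spelled out, the two registration styles being contrasted look like this
+(a sketch built around the `add_depends` call quoted above; `$somepage`
+stands for whatever page the directive depends on):
+
+    # inside a directive's preprocess function:
+    # $params{page}     is the page the directive physically lives on (B)
+    # $params{destpage} is the page the output is being rendered into (A)
+
+    # what nearly all plugins do today -- the dependency lands on B only:
+    add_depends($params{page}, $somepage);
+
+    # the alternative discussed above -- also record it against the page
+    # being rendered, so A notices when $somepage changes:
+    add_depends($params{destpage}, $somepage);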
+ +It's also a fragile, since all plugins that handle dependencies have to be +changed, and do this going forward. And it seems non-obvious that this should +be done. Or really, whether to use `page` or `destpage` there. Currently, +making the "wrong" choice and using `destpage` instead of `page` (which nearly +everything uses) will just result in semi-redundant dependency info being +recorded. If we make destpage mandatory to fix this, goofing up will lead to +this bug coming back. Ugh. + +---- + +## rebuild = change approach + +[[!template id=gitbranch branch=origin/transitive-dependencies author="[[joey]]"]] + +Another approach to fix it is to say that anything that causes a +rebuild of B is treated as a change of B. Then when C is changed, B is +rebuilt due to dependencies, and in turn this means A is rebuilt because B +"changed". + +This is essentially what is done with wikilinks now, and why, if a sidebar +links to page C, add/remove of C causes all pages to be rebuilt, as seen +here: + + removing old page meep + building sidebar.mdwn, which links to meep + building TourBusStop.mdwn, which depends on sidebar + building contact.mdwn, which depends on sidebar + ... + +Downsides here: + +* Means a minimum of 2x as much time spent resolving dependencies, + at least in my simple implementation, which re-runs the dependency + resolution loop until no new pages are rebuilt. + (I added an optimisation that gets it down to 1.5X as much work on + average, still 2x as much worst case. I suppose building a directed + graph and traversing it would be theoretically more efficient.) +* Causes extra work for some transitive dependencies that we don't + actually care about. This is amelorated, but not solved by + the current work on [[todo/dependency_types]]. + For example, changing index causes + plugins/brokenlinks to update in the first pass; if there's a second + pass, plugins/map is no longer updated (contentless dependencies FTW), + but plugins is, because it depends on plugins/brokenlinks. + (Of course, this is just a special case of the issue that a real + modification to plugins/brokenlinks causes an unnecessary update of + plugins, and could be solved by adding more dependency types.) + +[[done]] --[[Joey]] + +> Some questions/comments... I've thought about this a lot for [[todo/tracking_bugs_with_dependencies]]. +> +> * When you say that anything that causes a rebuild of B is treated as a change of B, are you: i) Treating +> any rebuild as a change, or ii) Treating any rebuild that gives a new result as a change? Option ii) would +> lead to fewer rebuilds. Implementation is easy: when you're about to rebuild a page, load the old rendered html in. Do the rebuild. Compare +> the new and old html. If there is a difference, then mark that page as having changed. If there is no difference +> then you don't need to mark that pages as changed, even though it has been rebuilt. (This would ignore pages in meta-data that don't +> cause changes in html, but I don't think that is a huge issue.) + +>> That is a good idea. I will have to look at it to see if the overhead of +>> reading back in the html of every page before building actually is a +>> win though. So far, I've focused on avoiding unnecessary rebuilds, and +>> there is still some room for more dependency types doing so. +>> (Particularly for metadata dependencies..) --[[Joey]] + +> * The second comment I have relates to cycles in transitive dependencies. 
At the moment I don't think this is +> possible, but with some additions it may well become so. This could be problematic as it could lead to a) +> updates that never complete, or b) it being theoretically unclear what the final result should be (i.e. you +> can construct logical paradoxes in the system). I think the point above about marking things as changed only when +> the output actually changes fixes any cases that are well defined. For logical paradoxes and infinite loops (e.g. +> two pages that include each other), you might want to put a limit on the number of times you'll rebuild a page in any +> given run of ikiwiki. Say, only allow a page to rebuild twice on any run, regardless of whether a page it depends on changes. +> This is not a perfect solution, but would be a good approximation. -- [[Will]] + +>> Ikiwiki only builds any given output file once per run, already. --[[Joey]] diff --git a/doc/bugs/underlaydir_file_expose.mdwn b/doc/bugs/underlaydir_file_expose.mdwn index c827c6dd8..4ee30e39d 100644 --- a/doc/bugs/underlaydir_file_expose.mdwn +++ b/doc/bugs/underlaydir_file_expose.mdwn @@ -1,4 +1,13 @@ If a file in the srcdir is removed, exposing a file in the underlaydir, -ikiwiki will notice the removal and delete the page from the destdir. The +ikiwiki will not notice the removal, and the page from the underlay will not be built. (However, it will be if the wiki gets rebuilt.) + +> This problem is caused by ikiwiki storing only filenames relative to +> the srcdir or underlay, and mtime comparison not handling this case. + +> A related problem occurs if changing a site's theme with the +> [[plugins/theme]] plugin. The style.css of the old and new theme +> often has the same mtime, so ikiwiki does not update it w/o a rebuild. +> This is worked around in theme.pm with a special-purpose needsbuild hook. +> --[[Joey]] diff --git a/doc/bugs/unicode_encoded_urls_and_recentchanges.mdwn b/doc/bugs/unicode_encoded_urls_and_recentchanges.mdwn index 88dbfc39b..262aa24fc 100644 --- a/doc/bugs/unicode_encoded_urls_and_recentchanges.mdwn +++ b/doc/bugs/unicode_encoded_urls_and_recentchanges.mdwn @@ -2,7 +2,7 @@ it appears that unicode characters in the title that are unicode letters are spa > Filenames can have any alphanumerics in them without the __ escaping. > Your locale determines whether various unicode characters are considered -> alphanumeric. In other words, it just looks at the [[:alpha:]] character +> alphanumeric. In other words, it just looks at the \[[:alpha:]] character > class, whatever your locale defines it to be. --[[Joey]] this is not a problem per se, but (at least with git backend) the recent changes missinterpret the file name character set (it seems to read the filenames as latin1) and both display wrong titles and create broken links. diff --git a/doc/bugs/utf-8_bug_in_websetup.pm.mdwn b/doc/bugs/utf-8_bug_in_websetup.pm.mdwn new file mode 100644 index 000000000..debedb01c --- /dev/null +++ b/doc/bugs/utf-8_bug_in_websetup.pm.mdwn @@ -0,0 +1,22 @@ +[[!tag patch bugs]] + +I type chinese characters into the fields. After press "save setup" button the characters turn into gibberish. + +I submit a patch that solve the problem for me. 
--Lingo + +> Fully fixing it is slightly more complex, but now [[done]] --[[Joey]] + +---- + + --- websetup.pm 2009-12-02 05:07:46.000000000 +0800 + +++ /usr/share/perl5/IkiWiki/Plugin/websetup.pm 2010-01-08 22:05:16.000000000 +0800 + @@ -308,7 +308,8 @@ + $fields{$_}=$shown{$_} foreach keys %shown; + } + } + - + + + + IkiWiki::decode_form_utf8($form); + if ($form->submitted eq "Cancel") { + IkiWiki::redirect($cgi, $config{url}); + return; diff --git a/doc/bugs/wrapper_can__39__t_find_the_perl_modules.mdwn b/doc/bugs/wrapper_can__39__t_find_the_perl_modules.mdwn new file mode 100644 index 000000000..9804d86c5 --- /dev/null +++ b/doc/bugs/wrapper_can__39__t_find_the_perl_modules.mdwn @@ -0,0 +1,16 @@ +If i intsall perl modules in my custom directory, cgi wrapper can't find them. I found clearing enviroment variables in code of wrapper. But information about custom directories put to perl with PERL5LIB variable. + +Workaround: add newenviron variable PERL5LIB + +My additional question - what wrapper do? I'am russian hosting provider. I am interesting with ikiwiki. + +> The wrapper allows ikiwiki to run as the user who owns the wiki, which +> is generally not the same as the user that runs the web server. +> (It also handles some other things, like some locking.) +> +> As a suid program, the wrapper cannot safely let environment variables +> pass through. +> +> If you want to install ikiwiki's perl modules in a nonstandard location, +> you can set `INSTALL_BASE` when running `Makefile.PL`. ikiwiki will then +> be built to look in that location. --[[Joey]] [[!tag done]] diff --git a/doc/competition.mdwn b/doc/competition.mdwn new file mode 100644 index 000000000..2c782ea92 --- /dev/null +++ b/doc/competition.mdwn @@ -0,0 +1,19 @@ +When I started ikiwiki in 2006, there were no other existing systems that +filled quite the niche of generating a static html wiki out of markdown +files stored in a [[VCS|rcs]]. My +[first blog about ikiwiki](http://kitenet.net/~joey/blog/entry/seeking_wiki/) +looked at some projects that were semi-close, and found them wanting. + +My hope was that besides being useful to all its [[users|ikiwikiusers]], +ikiwiki would help spread its underlying concepts. Let a thousand flowers +bloom! These are some that have sprung up since. --[[Joey]] + +* [Gitit](http://gitit.johnmacfarlane.net/) is a wiki backed by a git (or + darcs) filestore. No static rendering here; pages are generated on the fly. + It's written in Haskell and uses the amazing PanDoc to generate html + from markdown or many other formats. + +* [Markdoc](http://blog.zacharyvoase.com/post/246800035) statically builds + a wiki from markdown source (which can be in a VCS, if you check it in). + It includes a built-in webserver to ease serving the generated static + html. diff --git a/doc/css.mdwn b/doc/css.mdwn index 5b6b9e1af..35db57b0c 100644 --- a/doc/css.mdwn +++ b/doc/css.mdwn @@ -3,11 +3,22 @@ ## Using CSS with ikiwiki Ikiwiki comes with two CSS stylesheets: [[style.css]] and [[local.css]]. -The idea is to customize the second one overriding the first one and +The idea is to customize the second one, overriding the first one and defining brand new rendering rules. While ikiwiki's default use of stylesheets is intentionally quite plain and minimalistic, CSS allows creating any kind of look you can dream up. +The [[theme_plugin|plugins/theme]] provides some prepackaged themes in an +easy to use way. + The [[css_market]] page is an attempt to collect user contributed local.css files. 
+ +## Per-page CSS + +The [[plugins/meta]] plugin can be used to add additional style sheets to a +page. + +The [[plugins/localstyle]] plugin can be used to override the toplevel +[[local.css]] for a whole section of the wiki. diff --git a/doc/css/discussion.mdwn b/doc/css/discussion.mdwn new file mode 100644 index 000000000..dc9663e56 --- /dev/null +++ b/doc/css/discussion.mdwn @@ -0,0 +1,18 @@ +I must be doing something wrong. Running setup: + +$ ikiwiki --setup MyIkiwiki.setup + +overwrites my local.css with the default (empty) version. + +I am using ikiwiki_3.1415926.tar.gz installed into /usr/local. + +--- +Sorry. Never mind. RTFM. --refresh duh. + +> Hmm, well. Using --refresh is a good thing because it allow ikiwiki to +> update a site quicker. But, it must only be hiding the real problem. +> Sounds like you are trying to edit local.css directly inside +> the destdir. But ikiwiki has a local.css located in its basewiki, +> so when you rebuild your local mods are lost. Fix is to put you +> locally modified local.css inside the srcdir, along with the other pages +> of your wiki. --[[Joey]] diff --git a/doc/css_market.mdwn b/doc/css_market.mdwn index 61254c4b8..0e5a68740 100644 --- a/doc/css_market.mdwn +++ b/doc/css_market.mdwn @@ -5,7 +5,8 @@ these style sheets can be installed by copying them into your wiki's source dir with a filename of `local.css`. Feel free to add your own stylesheets here. (Upload as wiki pages; wiki -gnomes will convert them to css files..) +gnomes will convert them to css files..) Selected ones from here are +included in the [[theme_plugin|plugins/theme]]. * **[[css_market/zack.css]]**, contributed by [[StefanoZacchiroli]], customized mostly for *blogging purposes*, can be seen in action on @@ -15,7 +16,6 @@ gnomes will convert them to css files..) * **[[css_market/kirkambar.css]]**, contributed by [[Roktas]]. This far from perfect stylesheet follows a [Gitweb](http://www.kernel.org/git/?p=git/git.git;a=tree;f=gitweb) like theme, so it may provide a consistent look'n feel along with the [[rcs/git]] backend. ;-) - You can see it in action on [kirkambar](http://kirkambar.net/) (Turkish content). [[!meta stylesheet="kirkambar"]] * **[[css_market/embeddedmoose.css]]**, contributed by [[JoshTriplett]]. @@ -23,17 +23,6 @@ gnomes will convert them to css files..) Debian lighttpd index.html page. [[!meta stylesheet="embeddedmoose"]] -* **Refresh**, contributed by [[FredericLespez]]. Adapted from a free template - designed by [styleshout](http://www.styleshout.com). - You can see it [here](http://fred.ccheznous.org). You can download the local.css file and - the modified templates [here](http://fred.ccheznous.org/refresh_20060602.tgz). - - * This link (above) seems to deliver an empty tarball. - - * You'll find a updated version of these templates [here](http://www.der-winnie.de/~winnie/configs/ikiwiki-templates.tar.gz). - These templates are known to work with ikiwiki 2.31, and since I'll install always the newest one on my server I'll will update them on a regular basis. - * (This link appears to be broken?) - * **[[02_Template.css]]**, contributed and adapted by [maxx](http://martin.wuertele.net/), [original](http://www.openwebdesign.org/viewdesign.phtml?id=3057) designed by [jarico](http://www.openwebdesign.org/userinfo.phtml?user=jcarico) (License: public domain). You'll need a modified page.tmpl @@ -61,6 +50,12 @@ gnomes will convert them to css files..) the action list (Edit, RecentChanges, etc.) as tabs. 
[[!meta stylesheet="actiontabs"]] +* **[wiki.css](http://cyborginstitute.net/includes/wiki.css)** by [[tychoish]]. + I typically throw this in as `local.css` in new wikis as a slightly more clear and readable + layout for wikis that need to be functional and elegant, but not necessarily uniquely designed. + Currently in use by the [the outeralliance wiki](http://oa.criticalfutures.com/). + + If your web browser allows selecting between multiple stylesheets, this page can be viewed using many of the stylesheets above. For example, if using Epiphany with the Select Stylesheet extension enabled, use View -> diff --git a/doc/css_market/discussion.mdwn b/doc/css_market/discussion.mdwn index 86d16de9a..3dc47b55a 100644 --- a/doc/css_market/discussion.mdwn +++ b/doc/css_market/discussion.mdwn @@ -18,3 +18,20 @@ The most recent version is always available at: <http://git.upsilon.cc/cgi-bin/g ---- Added small changes to embeddedmoose.css to work with ikiwiki 3.x. Figured here is as good as any until Josh can review and update if he so chooses: <http://bosboot.org/tb-embeddedmoose.css>. It removes annoying borders around the header and footer. -- [[TimBosse]] + +----- + +I removed this from the list since both places to download it are broken. +--[[Joey]] + +* **Refresh**, contributed by [[FredericLespez]]. Adapted from a free template + designed by [styleshout](http://www.styleshout.com). + You can see it [here](http://fred.ccheznous.org). You can download the local.css file and + the modified templates [here](http://fred.ccheznous.org/refresh_20060602.tgz). + + * This link (above) seems to deliver an empty tarball. + + * You'll find a updated version of these templates [here](http://www.der-winnie.de/~winnie/configs/ikiwiki-templates.tar.gz). + These templates are known to work with ikiwiki 2.31, and since I'll install always the newest one on my server I'll will update them on a regular basis. + * (This link appears to be broken?) + diff --git a/doc/download.mdwn b/doc/download.mdwn index 015c87f1b..92c8a4f75 100644 --- a/doc/download.mdwn +++ b/doc/download.mdwn @@ -1,14 +1,17 @@ -Here's how to get ikiwiki. See [[setup]] for how to use it, and be sure to -add your wiki to [[IkiwikiUsers]] if you use ikiwiki. +Here's how to get ikiwiki in source or prepackaged form. See [[setup]] for +how to use it, and be sure to add your wiki to [[IkiwikiUsers]] if you use +ikiwiki. -## tarball +## source + +Ikiwiki is developed in a [[git_repository|git]]. The best place to download a tarball of the latest release is from <http://packages.debian.org/unstable/source/ikiwiki>. -Installation steps and requirements are listed on the [[install]] page. +Manual installation steps and requirements are listed on the [[install]] page. -## packages +## Debian / Ubuntu packages To install with apt, if using Debian or Ubuntu: @@ -16,29 +19,34 @@ To install with apt, if using Debian or Ubuntu: Or download the deb from <http://packages.debian.org/unstable/web/ikiwiki>. -There is a backport of a recent version of ikiwiki for Debian 4.0 at -<http://packages.debian.org/etch-backports/ikiwiki>. - -Fedora versions 8 and newer have RPMs of ikiwiki available. +There is a backport of a recent version of ikiwiki for Debian 5.0 at +<http://packages.debian.org/lenny-backports/ikiwiki>. There is also an unofficial backport of ikiwiki for Ubuntu Jaunty, provided by [[Paweł_Tęcza|users/ptecza]], at [http://gpa.net.icm.edu.pl/ubuntu/](http://gpa.net.icm.edu.pl/ubuntu/index-en.html). 
+## RPM packages + +Fedora versions 8 and newer have RPMs of ikiwiki available. + +Ikiwiki's source includes a RPM spec file, which you can use to build your +own RPM. + +## BSD ports + +Ikiwiki can be installed [from macports](http://www.macports.org/ports.php?by=name&substr=ikiwiki) +by running `sudo port install ikiwiki`. + NetBSD and many other platforms: pkgsrc has an [ikiwiki package](ftp://ftp.netbsd.org/pub/pkgsrc/current/pkgsrc/www/ikiwiki/README.html). FreeBSD has ikiwiki in its [ports collection](http://www.freshports.org/www/ikiwiki/). +## Other packages + Gentoo has an [ebuild](http://bugs.gentoo.org/show_bug.cgi?id=144453) in its bug database. The [openSUSE Build Service](http://software.opensuse.org/search?baseproject=ALL&p=1&q=ikiwiki) has packages for openSUSE -IkiWiki can be installed [from macports](http://www.macports.org/ports.php?by=name&substr=ikiwiki) -by running `sudo port install ikiwiki`. - A [PKGBUILD for Arch Linux](http://aur.archlinux.org/packages.php?ID=12284) is in the AUR. - -## revision control - -Ikiwiki is developed in a [[git_repository|git]]. diff --git a/doc/examples/blog.mdwn b/doc/examples/blog.mdwn index 2155d7fea..b25601227 100644 --- a/doc/examples/blog.mdwn +++ b/doc/examples/blog.mdwn @@ -5,21 +5,22 @@ Or, run this command to set up a blog with ikiwiki. % ikiwiki -setup /etc/ikiwiki/auto-blog.setup -Some additional configuration you might want to do: +Some additional configuration you might want to do, if not using +`auto-blog.setup`: * Make sure to configure ikiwiki to generate RSS or Atom feeds. -* Make sure you have the tag plugin enabled, and tag posts using it. An - example of how to tag a post is: - \[[!tag tags/life]] - -* Enable the [[sidebar|plugins/sidebar]] plugin to get a sidebar listing all - the categories you've tagged posts with. +* Make sure you have the tag plugin enabled, and the `tagbase` set to + "tags". Tag pages will then automatically be created. + An example of how to tag a post is: + \[[!tag life]] * Enable the [[pagestats|plugins/pagestats]] plugin to get a tag cloud to display on the [[index]]. -* Enable the [[comments|plugins/comments]] plugin and configure it to - enable comments to posts to the blog: +* Enable the [[comments|plugins/comments]] plugin to + enable comments to posts to the blog. - comments_pagespec => 'blog/posts/* and !*/Discussion', +* Enable the [[calendar|plugins/calendar]] plugin and run the + [[ikiwiki-calendar]] command from cron daily to get an interlinked + set of calendar archives. diff --git a/doc/examples/blog/archives.mdwn b/doc/examples/blog/archives.mdwn new file mode 100644 index 000000000..d07b73b74 --- /dev/null +++ b/doc/examples/blog/archives.mdwn @@ -0,0 +1,8 @@ +[[!if test="archives/*" then=""" +Browse through blog archives by year: +[[!map pages="./archives/* and !./archives/*/* and !*/Discussion"]] +""" +else=""" +You need to use the `ikiwiki-calendar` program to generate calendar-based +archive pages. 
+"""]] diff --git a/doc/examples/blog/comments.mdwn b/doc/examples/blog/comments.mdwn new file mode 100644 index 000000000..e22b50a34 --- /dev/null +++ b/doc/examples/blog/comments.mdwn @@ -0,0 +1,10 @@ +[[!sidebar content=""" +[[!inline pages="comment_pending(./posts/*)" feedfile=pendingmoderation +description="comments pending moderation" show=-1]] +Comments in the [[!commentmoderation desc="moderation queue"]]: +[[!pagecount pages="comment_pending(./posts/*)"]] +"""]] + +Recent comments on posts in the [[blog|index]]: +[[!inline pages="./posts/*/Discussion or comment(./posts/*)" +template="comment"]] diff --git a/doc/examples/blog/index.mdwn b/doc/examples/blog/index.mdwn index 84c732dd1..7914cd203 100644 --- a/doc/examples/blog/index.mdwn +++ b/doc/examples/blog/index.mdwn @@ -1,13 +1,11 @@ -[[!pagestats pages="./tags/*"]] +[[!if test="enabled(sidebar)" then=""" +[[!sidebar]] +""" else=""" +[[!inline pages=sidebar raw=yes]] +"""]] -Welcome to my blog. - -Have a look at the most recent posts below, or browse the tag cloud on the -right. An archive of all [[posts]] is also available. - -[[!inline pages="./posts/* and !*/Discussion" show="10" +[[!inline pages="page(./posts/*) and !*/Discussion" show="10" actions=yes rootpage="posts"]] ----- This blog is powered by [ikiwiki](http://ikiwiki.info). diff --git a/doc/examples/blog/posts.mdwn b/doc/examples/blog/posts.mdwn index 4b2939120..08e014838 100644 --- a/doc/examples/blog/posts.mdwn +++ b/doc/examples/blog/posts.mdwn @@ -1,3 +1,3 @@ -Here is a full list of posts to my [[blog|index]]. +Here is a full list of posts to the [[blog|index]]. -[[!inline pages="./posts/* and !*/Discussion" archive=yes feedshow=10 quick=yes]] +[[!inline pages="page(./posts/*) and !*/Discussion" archive=yes feedshow=10 quick=yes]] diff --git a/doc/examples/blog/posts/first_post.mdwn b/doc/examples/blog/posts/first_post.mdwn index d49432341..343497d18 100644 --- a/doc/examples/blog/posts/first_post.mdwn +++ b/doc/examples/blog/posts/first_post.mdwn @@ -1,4 +1,2 @@ This is the first post to this example blog. To add new posts, just add files to the posts/ subdirectory, or use the web form. - -[[!tag tags/tech]] diff --git a/doc/examples/blog/sidebar.mdwn b/doc/examples/blog/sidebar.mdwn index a9fac388e..e0895f63f 100644 --- a/doc/examples/blog/sidebar.mdwn +++ b/doc/examples/blog/sidebar.mdwn @@ -1,7 +1,10 @@ -Example sidebar +[[!if test="enabled(calendar)" then=""" +[[!calendar pages="page(./posts/*) and !*/Discussion"]] +"""]] -* [[Blog|index]] -* [[Archive|posts]] +[[Recent Comments|comments]] -Categories: -[[!map pages="./tags/* and !*/Discussion"]] +[[Archives]] + +[[Tags]]: +[[!pagestats style="list" pages="./tags/*" among="./posts/*"]] diff --git a/doc/examples/blog/tags.mdwn b/doc/examples/blog/tags.mdwn index 53cc8d368..b5eca5b71 100644 --- a/doc/examples/blog/tags.mdwn +++ b/doc/examples/blog/tags.mdwn @@ -1,3 +1,3 @@ -[[!pagestats pages="./tags/*"]] +[[!pagestats pages="./tags/*" among="./posts/*"]] On the right you can see the tag cloud for this blog. diff --git a/doc/examples/blog/tags/life.mdwn b/doc/examples/blog/tags/life.mdwn deleted file mode 100644 index ddc2e646c..000000000 --- a/doc/examples/blog/tags/life.mdwn +++ /dev/null @@ -1,4 +0,0 @@ -This feed contains pages in the "life" category. 
- -[[!inline pages="link(tags/life) and ./posts/* and !*/Discussion" -show="10" actions=yes]] diff --git a/doc/examples/blog/tags/tech.mdwn b/doc/examples/blog/tags/tech.mdwn deleted file mode 100644 index e811cac34..000000000 --- a/doc/examples/blog/tags/tech.mdwn +++ /dev/null @@ -1,3 +0,0 @@ -This feed contains pages in the "tech" category. - -[[!inline pages="link(tags/tech) and !*/Discussion" show=10 actions=yes]] diff --git a/doc/examples/softwaresite.mdwn b/doc/examples/softwaresite.mdwn index e43a9d116..99f791177 100644 --- a/doc/examples/softwaresite.mdwn +++ b/doc/examples/softwaresite.mdwn @@ -14,3 +14,6 @@ Some additional configuration you might want to do: * Read the [[tips/integrated_issue_tracking_with_ikiwiki]] article for tips about how to use ikiwiki as a BTS. + +* Read [[tips/spam_and_softwaresites]] for information on how to keep spam + and spam-fighting commits out of your main version control history. diff --git a/doc/examples/softwaresite/news.mdwn b/doc/examples/softwaresite/news.mdwn index 9b53c7d99..20efba6e0 100644 --- a/doc/examples/softwaresite/news.mdwn +++ b/doc/examples/softwaresite/news.mdwn @@ -1,4 +1,4 @@ -This is where annoucements of new releases, features, and other news is +This is where announcements of new releases, features, and other news is posted. FooBar users are recommended to subscribe to this page's RSS feed. diff --git a/doc/features.mdwn b/doc/features.mdwn index ff341d2cc..1f8168703 100644 --- a/doc/features.mdwn +++ b/doc/features.mdwn @@ -13,15 +13,19 @@ Instead of editing pages in a stupid web form, you can use vim and commit changes via [[Subversion|rcs/svn]], [[rcs/git]], or any of a number of other [[Revision_Control_Systems|rcs]]. -ikiwiki can be run from a [[post-commit]] hook to update your wiki +Ikiwiki can be run from a [[post-commit]] hook to update your wiki immediately whenever you commit a change using the RCS. +It's even possible to securely let +[[anonymous_users_git_push_changes|tips/untrusted_git_push]] +to the wiki. + Note that ikiwiki does not require a RCS to function. If you want to run a simple wiki without page history, it can do that too. ## A wiki compiler -ikiwiki is a wiki compiler; it builds a static website for your wiki, and +Ikiwiki is a wiki compiler; it builds a static website for your wiki, and updates it as pages are edited. It is fast and smart about updating a wiki, it only builds pages that have changed (and tracks things like creation of new pages and links that can indirectly cause a page to need a rebuild) @@ -41,7 +45,7 @@ easily be added by [[plugins]]. For example it also supports traditional [[plugins/HTML]], or pages written in [[reStructuredText|plugins/rst]] or [[Textile|plugins/textile]]. -ikiwiki also supports files of any other type, including plain text, +Ikiwiki also supports files of any other type, including plain text, images, etc. These are not converted to wiki pages, they are just copied unchanged by ikiwiki as it builds your wiki. So you can check in an image, program, or other special file and link to it from your wiki pages. @@ -66,9 +70,11 @@ you would care to syndicate. ## Valid html and [[css]] -ikiwiki aims to produce -[valid XHTML 1.0](http://validator.w3.org/check?url=referer). ikiwiki -generates html using [[templates|wikitemplates]], and uses [[css]], so you +Ikiwiki aims to produce +[valid XHTML 1.0](http://validator.w3.org/check?url=referer). +(Experimental [[tips/HTML5]] support is also available.) 
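Trying it out should just be a matter of flipping a switch in the setup file, something along the lines of the following (see [[tips/HTML5]] for the details):

    html5 => 1,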
+ +Ikiwiki generates html using [[templates]], and uses [[css]], so you can change the look and layout of all pages in any way you would like. ## [[Plugins]] @@ -142,7 +148,8 @@ authentication, or other methods implemented via plugins. Thanks to subpages, every page can easily and automatically have a /Discussion subpage. By default, these links are included in the -[[templates]] for each page. +[[templates]] for each page. If you prefer blog-syle +[[plugins/comments]], that is available too. ### Edit controls @@ -158,9 +165,14 @@ Well, sorta. Rather than implementing YA history browser, it can link to ### Full text search -ikiwiki can use the xapian search engine to add powerful +Ikiwiki can use the xapian search engine to add powerful full text [[plugins/search]] capabilities to your wiki. +### Translation via po files + +The [[plugins/po]] plugin allows translating individual wiki pages using +standard `po` files. + ### [[w3mmode]] Can be set up so that w3m can be used to browse a wiki and edit pages diff --git a/doc/forum.mdwn b/doc/forum.mdwn index 729540774..19ca9ed0b 100644 --- a/doc/forum.mdwn +++ b/doc/forum.mdwn @@ -1,6 +1,8 @@ -This is a place for questions and discussions that don't have a Discussion page fitting enough. Users of ikiwiki can ask questions here. +This is a place for questions and discussions that don't have a Discussion +page fitting enough. Users of ikiwiki can ask questions here. -_This is a bold experiment by me, since I have exactly such a question. This overrides the default content/discussion dichotomy, feel free to refactor and discuss! --ulrik_ +Note that for more formal bug reports or todo items, you can also edit the +[[bugs]] and [[todo]] pages. ## Current topics ## diff --git a/doc/forum/Can_OpenID_users_be_adminusers__63__.mdwn b/doc/forum/Can_OpenID_users_be_adminusers__63__.mdwn index 7599e71e5..17c60c423 100644 --- a/doc/forum/Can_OpenID_users_be_adminusers__63__.mdwn +++ b/doc/forum/Can_OpenID_users_be_adminusers__63__.mdwn @@ -62,7 +62,7 @@ index 0bf100a..77b467a 100644 >>>> So you can see if the two usernames/openids match. If the end is "0", >>>> they don't match. If nothing is logged, you have not enabled the websetup plugin. ->>>> If the end if "1" you should see the "Wiki Setup" button, if not the +>>>> If the end if "1" you should see the "Setup" button, if not the >>>> problem is not in determining if you're an admin, but elsewhere.. >>>> --[[Joey]] diff --git a/doc/forum/Can__39__t_get_comments_plugin_working.mdwn b/doc/forum/Can__39__t_get_comments_plugin_working.mdwn new file mode 100644 index 000000000..06b7fdd84 --- /dev/null +++ b/doc/forum/Can__39__t_get_comments_plugin_working.mdwn @@ -0,0 +1,7 @@ +I feel like I must be missing something. + +My blog is based on Ikiwiki, and uses /yyyy/mm/dd/title/ for blog posts. Because I use the plugin that generates index pages for subdirectories, I have to use /????/??/??/* to identify posts and avoid missing the index pages for years, months and days. + +I've enabled the comments plugin, but no matter what I do, I can't seem to make the comment form appear on my posts. I've removed the entire site and have rebuilt. I've set the pagespec to /????/??/??/* and ./????/??/??/*, but neither seems to work. I don't see any output, or anything else to indicate that pages aren't working. + +Are there any other plugins that need to be enabled for this to work? 
I think I've locked things down such that anonymous users can't edit by enabling signinedit and setting a lock, but this may be blocking the ability to comment (though I don't recall seeing anything in the docs about needing additional plugins.) diff --git a/doc/forum/Cannot_write_to_commitlock.mdwn b/doc/forum/Cannot_write_to_commitlock.mdwn new file mode 100644 index 000000000..05490a799 --- /dev/null +++ b/doc/forum/Cannot_write_to_commitlock.mdwn @@ -0,0 +1,28 @@ +I am following the laptop wiki with git tip page. I have set up my local and remote wiki as suggested. However, when I try to push my local changes back to the server I get the following error: + +Writing objects: 100% (4/4), 359 bytes, done. +Total 4 (delta 2), reused 0 (delta 0) +cannot write to /home/ian/ianbarton/.ikiwiki/commitlock: No such file or directory +To ian@wilkesley.org:~/ikiwiki/ianbarton.git + 5cf9054..16a871d master -> master + +The relevnt bit of my setup file is: + +git_wrapper => '/home/ian/ianbarton.git/hooks/post-commit', + +Now ~/ianbarton/.ikiwiki exists and is owned and writable by me. I have tried touching commitlock and also removing lock and commitlock before pushing. Any suggestions for further trouble shooting? + +Ian. + +> I'm guessing that this is some kind of permissions problem, +> and that the error message is just being misleading. +> +> When you push the changes to the server, what user is +> git logging into the server as? If that user is different +> than `ian` (possibly due to using git-daemon?), the post-commit +> wrapper needs to be setuid to `ian`. This ensures that ikiwiki +> runs as you and can see and write to the files. --[[Joey]] + +The user is logging as ian, the same user as the laptop. I can push and pull git repos on the same server owned by the same user via ssh with no problem. I have deleted and re-started from scratch several times. However, for my use case I think it's simpler to keep the repo on my local computer and just rsync the web pages to the server. + +Ian. diff --git a/doc/forum/Error:_bad_page_name.mdwn b/doc/forum/Error:_bad_page_name.mdwn new file mode 100644 index 000000000..70277a1e4 --- /dev/null +++ b/doc/forum/Error:_bad_page_name.mdwn @@ -0,0 +1,46 @@ +I'm trying to use ikiwiki for the first time. In the start, I had problems +with installing the package, because I don't have a root account on my +server. + +When I solved this, I finally set up my wiki, but whenever I try to edit a +page, I get an error: “Error: bad page name”. + +What am I doing wrong? The wiki is at +<http://atrey.karlin.mff.cuni.cz/~onderka/wiki/>, the setupfile I used at +<http://atrey.karlin.mff.cuni.cz/~onderka/wiki/ikiwiki.setup>. + +> This means that one of the checks that ikiwiki uses to prevent +> editing files with strange or insecure names has fired incorrectly. +> Your setup file seems fine. +> We can figure out what is going wrong through a series of tests: +> +> * Test if your perl has a problem with matching alphanumerics: +> `perl -le 'print int "index"=~/^([-[:alnum:]+\/.:_]+)$/'` +> * Check if something is breaking pruning of disallowed files: +> `perl -le 'use IkiWiki; %config=IkiWiki::defaultconfig(); print ! IkiWiki::file_pruned("index")'` +> --[[Joey]] + +>>Both seem to run fine: + + onderka@atrey:~$ perl -le 'print int "index"=~/^([-[:alnum:]+\/.:_]+)$/' + 1 + onderka@atrey:~$ perl -le 'use IkiWiki; %config=IkiWiki::defaultconfig(); print ! 
IkiWiki::file_pruned("index")' + 1 + +>>> Try installing this [instrumented +>>> version](http://kitenet.net/~joey/tmp/editpage.pm) of +>>> `IkiWiki/Plugin/editpage.pm`, which will add some debugging info +>>> to the error message. --[[Joey]] + +>>>>When I tried to `make` ikiwiki with this file, I got the error + + ../IkiWiki/Plugin/editpage.pm:101: invalid variable interpolation at "$" + +>>>>> Sorry about that, I've corrected the above file. --[[Joey]] + +>>>>>> Hmm, funny. Now that I reinstalled it with your changed file, it started working. I didn't remember how exactly did I install it the last time, so this time, it seems I did it correctly. Thank you very much for your help. + +>>>>>>> Well, this makes me suspect you installed an older version of +>>>>>>> ikiwiki and my file, which is from the latest version, included a +>>>>>>> fix for whatever bug you were seeing. If I were you, I'd ensure +>>>>>>> that I have a current version of ikiwiki installed. --[[Joey]] diff --git a/doc/forum/How_does_ikiwiki_remember_times__63__.mdwn b/doc/forum/How_does_ikiwiki_remember_times__63__.mdwn index 5522cbf45..6b7739fd0 100644 --- a/doc/forum/How_does_ikiwiki_remember_times__63__.mdwn +++ b/doc/forum/How_does_ikiwiki_remember_times__63__.mdwn @@ -20,15 +20,17 @@ Do I have it right? > Some VCS, like git, set the file mtimes to the current time > when making a new checkout, so they will be lost if you do that. > The creation times can be retrived using the `--getctime` option. -> I suppose it might be nice if there were a `--getmtime` that pulled -> true modification times out of the VCS, but I haven't found it a big -> deal in practice for the last modification times to be updated to the -> current time when rebuilding a wiki like this. --[[Joey]] +> --[[Joey]] > > > Thanks for the clarification. I ran some tests of my own to make sure I understand it right, and I'm satisfied > > that the order of posts in my blog can be retrieved from the VCS using the `--getctime` option, at least if I > > choose to order my posts by creation time rather than modification time. But I now know that I can't rely on > > page modification times in ikiwiki as these can be lost permanently. +> +> > > Update: It's now renamed to `--gettime`, and pulls both the creation +> > > and modification times. Also, per [[todo/auto_getctime_on_fresh_build]], +> > > this is now done automatically the first time ikiwiki builds a +> > > srcdir. So, no need to worry about this any more! --[[Joey]] > > > > I would suggest that there should at least be a `--getmtime` option like you describe, and perhaps that > > `--getctime` and `--getmtime` be _on by default_. In my opinion the creation times and modification times of @@ -87,3 +89,10 @@ Do I have it right? $EDITOR "$pagename" >>>>> -- [[Jon]] + +> A quick workaround for me to get modification times right is the following +> little zsh script, which unfortunately only works for git: + +>> Elided; no longer needed since --gettime does that, and much faster! --[[Joey]] + +> --[[David_Riebenbauer]] diff --git a/doc/forum/Ikiwiki_CGI_not_working_on_my_server__44___and_it__39__s_a_binary_file__63__.mdwn b/doc/forum/Ikiwiki_CGI_not_working_on_my_server__44___and_it__39__s_a_binary_file__63__.mdwn new file mode 100644 index 000000000..35db20dc8 --- /dev/null +++ b/doc/forum/Ikiwiki_CGI_not_working_on_my_server__44___and_it__39__s_a_binary_file__63__.mdwn @@ -0,0 +1,33 @@ +Hey, trying to get ikiwiki working on my account on a shared webserver. 
Actually installing ikiwiki on the server is phase 2. For now I'm running the latest ikiwiki (from source) locally, compiling the output with the ikiwiki command, then rsyncing the output dir up to the server. This works for the static HTML files, but the CGI file doesn't work, the server redirects to an error page. The error log on the server says "Premature end of script headers: /path/to/ikiwiki.cgi" + +My first thought was that this is a Perl CGI and I would need to change the shebang to point to the unusual location of Perl on this server, it's at /usr/pkg/bin/perl. But when I looked at ikiwiki.cgi I found it was a binary file. + +Why is it a binary? And what can I do about this error? + +> It's a binary because it's SUID, so that it has permission to write to the ikiwiki repository. See [[security]], under 'suid wrappers', for more on that. +> +> As to why you get 'premature end of script headers', that suggests there is a problem running +> the script (and there is output occurring before the HTTP headers are printed). Do you have access +> to the webserver logs for your host? They might contain some clues. Are you sure that the webserver +> is setup for CGI properly? -- [[Jon]] + +> Quite likely your laptop and your server do not run the same +> OS, so the wrapper binary cannot just be copied from one +> to the other and run. Also, the wrapper is just that, a +> thin wrapper which then runs ikiwiki. As ikiwiki is not +> yet installed on your server, that's another reason what +> you're trying can't work. +> +> If installing ikiwiki on the server is not possible or +> too much work right now, you could try building your wiki +> on your laptop with cgi disabled in the setup file. +> The result would be a static website that you could deploy to +> the server this way. Of course, it wouldn't be editable +> on the server, and other features that need the CGI would +> also be disabled. --[[Joey]] + +> > Ah, ok thanks. Yes the server runs a different OS and ikiwiki +> > is not installed on it. I've got it working as a static site, +> > so if I want the CGI I'll have to install ikiwiki on the server. +> > Ok. It might not work as I don't have root access, but I might +> > give it a try. Thanks diff --git a/doc/forum/Migrating_old_repository_to_new_ikiwiki_system__63__.mdwn b/doc/forum/Migrating_old_repository_to_new_ikiwiki_system__63__.mdwn index fe67e6aba..d7a33b526 100644 --- a/doc/forum/Migrating_old_repository_to_new_ikiwiki_system__63__.mdwn +++ b/doc/forum/Migrating_old_repository_to_new_ikiwiki_system__63__.mdwn @@ -20,10 +20,6 @@ How do I set up an ikiwiki system using a pre-existing repository (instead of cr > recreate the ikiwiki srcdir > 3. `git clone` from the bare git repository a second time, > to create a checkout you can manually edit (optional) -> 4. run `ikiwiki --getctime --setup your.setup` -> The getctime will ensure page creation times are accurate -> by putting the info out of the git history, -> and only needs to be done once. 
> > If you preserved your repository, but not the setup file, > the easiest way to make one is probably to run diff --git a/doc/forum/News_site_where_articles_are_submitted_and_then_reviewed_before_posting.mdwn b/doc/forum/News_site_where_articles_are_submitted_and_then_reviewed_before_posting.mdwn new file mode 100644 index 000000000..8dd755274 --- /dev/null +++ b/doc/forum/News_site_where_articles_are_submitted_and_then_reviewed_before_posting.mdwn @@ -0,0 +1,23 @@ +[[!meta date="2008-04-28 14:57:25 -0400"]] + +I am considering moving a news site to Ikiwiki. I am hoping that Ikiwiki has a feature where anonymous posters can submit a form that moderators can review and then accept for it to be posted on a news webpage (like front page of the website). + +> Well, you can have one blog that contains unreviewed articles, and +> moderators can then add a tag that makes the article show up in the main +> news feed. There's nothing stopping someone submitting an article +> pre-tagged though. If you absolutely need to lock that down, you could +> have one blog with unreviewed articles in one subdirectory, and reviewers +> then move the file over to another subdirectory when they're ready to +> publish it. (This second subdirectory would be locked to prevent others +> from writing to it.) --[[Joey]] + +Also it would be good if the news page would keep maybe just the latest 10 entries with links to an archive that make it easy to browse to old entries by date. (Could have over a thousand news articles.) + +> The inline plugin allows setting up things like this. + +Plus users be able to post feedback to news items. If anonymous, they must be approved first. I'd prefer to not use normal "wiki" editor for feedback. + +Any thoughts or examples on this? Any links to examples of news sites or blogs with outside feedback using ikiwiki? + +Thanks --[[JeremyReed]] + diff --git a/doc/forum/PERL5LIB__44___wrappers_and_homedir_install.mdwn b/doc/forum/PERL5LIB__44___wrappers_and_homedir_install.mdwn new file mode 100644 index 000000000..fba941efc --- /dev/null +++ b/doc/forum/PERL5LIB__44___wrappers_and_homedir_install.mdwn @@ -0,0 +1,38 @@ +What is the way to tell wrappers that PERL5LIB should include ~/bin directories? + +Having this in the wiki.setup doesn't help anymore: + + # environment variables + ENV => { + PATH => '/home/user/bin/bin:/usr/local/bin:/usr/bin:/bin:/usr/games:/home/user/ikiwiki/usr/bin/:/home/user/ikiwiki/usr/sbin/:/home/user/bin/bin/:~/bin/bin/', + PERL5LIB => '/home/user/bin/share/perl/5.10.0:/home/user/bin/lib/perl/5.10.0' + }, + +Or at least I get CGI errors and running ikiwiki.cgi manually fails too: + + Use of uninitialized value $tainted in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 233. + Argument "" isn't numeric in umask at /usr/share/perl5/IkiWiki.pm line 139. + Undefined subroutine &IkiWiki::cgierror called at /home/user/bin/bin/ikiwiki line 199. + +Server has an older ikiwiki installed but I'd like to use a newer version from git, and I don't have root access. + +> You can't set `PERL5LIB` in `ENV` in a setup file, because ikiwiki is already +> running before it reads that, and so it has little effect. Your error +> messages do look like a new bin/ikiwiki is using an old version of +> `IkiWiki.pm`. +> +> The thing to do is set `INSTALL_BASE` when you're installing ikiwiki from +> source. 
Like so: + + cd ikiwiki + perl Makefile.PL INSTALL_BASE=$HOME PREFIX= + make install + +> Then `$HOME/bin/ikiwiki` will have hardcoded into it to look +> for ikiwiki's perl modules in `$HOME/lib/perl5/` +> (This is documented in the README file by the way.) --[[Joey]] + +>> Ok, *perl Makefile.PL INSTALL_BASE=$HOME/bin PREFIX=* finally did it for me. I tried too many things with +>> these paths so I wasn't sure which actually worked. After that I did +>> *$ ikiwiki --setup www.setup --wrappers --rebuild*. Somehow in this update mess I seem to have lost the user +>> accounts, maybe the --rebuild was too much. diff --git a/doc/forum/Regex_for_Valid_Characters_in_Filenames.mdwn b/doc/forum/Regex_for_Valid_Characters_in_Filenames.mdwn new file mode 100644 index 000000000..618576f81 --- /dev/null +++ b/doc/forum/Regex_for_Valid_Characters_in_Filenames.mdwn @@ -0,0 +1,19 @@ +I'm sure that this is documented somewhere but I've ransacked the wiki and I can't find it. :-( What are the allowed characters in an ikiwiki page name? I'm writing a simple script to make updating my blog easier and need to filter invalid characters (so far I've found that # and , aren't allowed ;-)). Thanks for any pointers. -- [[AdamShand]] + +> The default `wiki_file_regexp` matches filenames containing only +> `[-[:alnum:]_.:/+]` +> +> The titlepage() function will convert freeform text to a valid +> page name. See [[todo/should_use_a_standard_encoding_for_utf_chars_in_filenames]] +> for an example. --[[Joey]] + +>> Perfect, thanks! +>> +>> In the end I decided that I didn't need any special characters in filenames and replaced everything but alphanumeric characters with underscores. In addition to replacing bad characters I also collapse multiple underscores into a single one, and strip off trailing and leading underscores to make tidy filenames. If it's useful to anybody else here's a sed example: +>> +>> # echo "++ Bad: ~@#$%^&*()_=}{[];,? Iki: +_-:./ Num: 65.5 ++" | sed -e 's/[^A-Za-z0-9_]/_/g' -e 's/__*/_/g' -e 's/^_//g' -e 's/_$//g' +>> Bad_Iki_Num_65_5 +>> +>>--[[AdamShand]] + +[[!meta date="2008-01-18 23:40:02 -0500"]] diff --git a/doc/forum/Render_more_than_one_dest_page_from_same_source_page.mdwn b/doc/forum/Render_more_than_one_dest_page_from_same_source_page.mdwn new file mode 100644 index 000000000..e7362c903 --- /dev/null +++ b/doc/forum/Render_more_than_one_dest_page_from_same_source_page.mdwn @@ -0,0 +1,51 @@ +Is it possible to render more than one destination page from the same source page? +That is, same source, slightly different presentation at the other end, needing a different output file. + +> It's possible to render more than one output _file_ from a given source +> page. See, for example, the inline plugin's generation of rss files. +> This is done by calling `will_render()` and using `writefile()` to +> generate the additional files. Probably in a format hook if you want +> to generate html files. + +>> Thanks for the tip, I'll take a look at that. -- [[KathrynAndersen]] + +> It's not possible for one source file to represent multiple wiki pages. +> There is a 1:1 mapping between source filenames and page names. The +> difference between wiki pages and output files is that you can use +> wikilinks to link to wiki pages, etc. --[[Joey]] + +I have two problems that would be solved by being able to do this. + +[[!toc startlevel=2]] + +##"full" and "print" versions of a page. + +One has a page "foo", which is rendered into foo.html. 
+One also wants a foo-print.html page, which uses "page-print.tmpl" rather than "page.tmpl" as its template. + +I want to do this for every page on the site, automatically, so it isn't feasible to do it by hand. + +> Did you know that ikiwiki's `style.css` arranges for pages to display +> differently when printed out? Things like the Action bar are hidden in +> printouts (search for `@media print`). So I don't see a reason to need +> whole files for printing when you can use these style sheet tricks. +> --[[Joey]] + +>>Fair enough. --[[KathrynAndersen]] + +##"en" and "en-us" versions of a page. + +My site is in non-US English. However, I want US-English people to find my site when they search for it when they use US spelling on certain search terms (such as "optimise" versus "optimize"). This requires a (crude) US-English version of the site where the spellings are changed automatically, and the LANG is "en-us" rather than "en". (No, don't tell me to use keywords; Google ignores keywords and has for a number of years). + +So I want the page "foo" to render to "foo.en.html" and "foo.en-us.html" where the content is the same, just some automated word-substitution applied before foo.en-us.html is written. And do this for every page on the site. + +I can't do this with the "po" plugin, as it considers "en-us" not to be a valid language. And the "po" plugin is probably overkill for what I want anyway. + +But I'm not sure how to achieve the result I need. + +-- [[KathrynAndersen]] + +> Sounds like this could be considered a single page that generates two +> html files, so could be handled per above. --[[Joey]] + +>>Thanks! --[[KathrynAndersen]] diff --git a/doc/forum/Should_not_create_an_existing_page.mdwn b/doc/forum/Should_not_create_an_existing_page.mdwn new file mode 100644 index 000000000..b9500757f --- /dev/null +++ b/doc/forum/Should_not_create_an_existing_page.mdwn @@ -0,0 +1,15 @@ +[[!meta date="2007-01-08 14:55:31 +0000"]] + +This might be a bug, but will discuss it here first. +Clicking on an old "?" or going to a create link but new Markdown content exists, should not go into "create" mode, but should do a regular "edit". + +> I belive that currently it does a redirect to the new static web page. +> At least that's the intent of the code. --[[Joey]] + +>> Try at your site: `?page=discussion&from=index&do=create` +>> It brings up an empty textarea to start a new webpage -- even though it already exists here. --reed + +>>> Ah, right. Notice that the resulting form allows saving the page as +>>> discussion, or users/discussion, but not index/discussion, since this +>>> page already exists. If all the pages existed, it would do the redirect +>>> thing. --[[Joey]] diff --git a/doc/forum/Sidebar_with_links__63__.mdwn b/doc/forum/Sidebar_with_links__63__.mdwn new file mode 100644 index 000000000..790ee85a2 --- /dev/null +++ b/doc/forum/Sidebar_with_links__63__.mdwn @@ -0,0 +1,58 @@ +I'm trying to create a template to use as a sidebar with links. The template will be static +(no variables are used). I first created a page with this directive: \[[!template id=sidebar]], +and then created the template with the web interface. 
+ +This is the code I put in the template: + + <div class="infobox"> + <ul> + <li>\[[Existing internal link|exists]]</li> + <li>\[[Non-existing internal link|doesnotexist]]</li> + <li>[External link](http://google.com/)</li> + </ul> + <http://google.com/> + </div> + +This is the relevant part of the resulting html file `template/sidebar.html`: + + <div class="infobox"> + <ul> + <li><a href="../exists.html">Existing internal link</a></li> + <li><span class="createlink"><a href="http://localhost/cgi-bin/itesohome.cgi?page=doesnotexist&from=templates%2Fsidebar&do=create" rel="nofollow">?</a>Non-existing internal link</span></li> + <li>[External link](http://google.com/)</li> + </ul> + </div> + +Note that the `<http://google.com/>` link has disappeared, and that `[External link](http://google.com/)` +has been copied literally instead of being converted to a link, as I expected. + +> Templates aren't Markdown page. [[ikiwiki/WikiLink]] only are expanded. --[[Jogo]] + +>> Thanks for the help Jogo. Looking at the [[templates]] page, it says that +"...you can include WikiLinks and all other forms of wiki markup in the template." I read this +to mean that a template may indeed include Markdown. Am I wrong in my interpratation? --[[buo]] + +>> I discovered that if I eliminate all html from my sidebar.mdwn template, the links are +rendered properly. It seems that the mix of Markdown and html is confusing some part of +Ikiwiki. --[[buo]] + +Worse, this is the relevant part of the html file of the page that includes the template: + + <div class="infobox"> + <ul> + <li><span class="selflink">Existing internal link</span></li> + <li><span class="createlink"><a href="http://localhost/cgi-bin/itesohome.cgi?page=doesnotexist&from=research&do=create" rel="nofollow">?</a>Non-existing internal link</span></li> + <li>[External link](http://google.com/)</li> + </ul> + </div> + +Note that the `Existing internal link` is no longer a link. It is only text. + +What am I doing wrong? Any help or pointers will be appreciated. --[[buo]] + +----- + +I think I have figured this out. I thought the template was filled and then +processed to convert Markdown to html. Instead, the text in each variable is +processed and then the template is filled. I somehow misunderstood the +[[templates]] page. -- [[buo]] diff --git a/doc/forum/Spaces_in_wikilinks.mdwn b/doc/forum/Spaces_in_wikilinks.mdwn new file mode 100644 index 000000000..9326ac448 --- /dev/null +++ b/doc/forum/Spaces_in_wikilinks.mdwn @@ -0,0 +1,104 @@ +[[!meta date="2007-07-02 13:21:29 +0000"]] + +# Spaces in WikiLinks? + +Hello Joey, + +I've just switched from ikiwiki 2.0 to ikiwiki 2.2 and I'm really surprised +that I can't use the spaces in WikiLinks. Could you please tell me why the spaces +aren't allowed in WikiLinks now? + +My best regards, + +--[[PaweB|ptecza]] + +> See [[bugs/Spaces_in_link_text_for_ikiwiki_links]] + +---- + +# Build in OpenSolaris? + +Moved to [[bugs/build_in_opensolaris]] --[[Joey]] + +---- + +# Various ways to use Subversion with ikiwiki + +I'm playing around with various ways that I can use subversion with ikiwiki. + +* Is it possible to have ikiwiki point to a subversion repository which is on a different server? The basic checkin/checkout functionality seems to work but there doesn't seem to be any way to make the post-commit hook work for a non-local server? + +> This is difficult to do since ikiwiki's post-commit wrapper expects to +> run on a machine that contains both the svn repository and the .ikiwiki +> state directory. 
However, with recent versions of ikiwiki, you can get +> away without running the post-commit wrapper on commit, and all you lose +> is the ability to send commit notification emails. + +> (And now that [[recentchanges]] includes rss, you can just subscribe to +> that, no need to worry about commit notification emails anymore.) + +* Is it possible / sensible to have ikiwiki share a subversion repository with other data (either completely unrelated files or another ikiwiki instance)? This works in part but again the post-commit hook seems problematic. + +--[[AdamShand]] + +> Sure, see ikiwiki's subversion repository for example of non-wiki files +> in the same repo. If you have two wikis in one repository, you will need +> to write a post-commit script that calls the post-commit wrappers for each +> wiki. + +---- + +# Regex for Valid Characters in Filenames + +I'm sure that this is documented somewhere but I've ransacked the wiki and I can't find it. :-( What are the allowed characters in an ikiwiki page name? I'm writing a simple script to make updating my blog easier and need to filter invalid characters (so far I've found that # and , aren't allowed ;-)). Thanks for any pointers. -- [[AdamShand]] + +> The default `wiki_file_regexp` matches filenames containing only +> `[-[:alnum:]_.:/+]` +> +> The titlepage() function will convert freeform text to a valid +> page name. See [[todo/should_use_a_standard_encoding_for_utf_chars_in_filenames]] +> for an example. --[[Joey]] + +>> Perfect, thanks! +>> +>> In the end I decided that I didn't need any special characters in filenames and replaced everything but alphanumeric characters with underscores. In addition to replacing bad characters I also collapse multiple underscores into a single one, and strip off trailing and leading underscores to make tidy filenames. If it's useful to anybody else here's a sed example: +>> +>> # echo "++ Bad: ~@#$%^&*()_=}{[];,? Iki: +_-:./ Num: 65.5 ++" | sed -e 's/[^A-Za-z0-9_]/_/g' -e 's/__*/_/g' -e 's/^_//g' -e 's/_$//g' +>> Bad_Iki_Num_65_5 +>> +>>--[[AdamShand]] + +# Upgrade steps from RecentChanges CGI to static page? + +Where are the upgrade steps for RecentChanges change from CGI to static feed? +I run multiple ikiwiki-powered sites on multiple servers, but today I just upgraded one to 2.32.3. +Please have a look at +<http://bsdwiki.reedmedia.net/wiki/recentchanges.html> +Any suggestions? + +> There are no upgrade steps required. It does look like you need to enable +> the meta plugin to get a good recentchanges page though.. --[[Joey]] + +# News site where articles are submitted and then reviewed before posting? + +I am considering moving a news site to Ikiwiki. I am hoping that Ikiwiki has a feature where anonymous posters can submit a form that moderators can review and then accept for it to be posted on a news webpage (like front page of the website). + +> Well, you can have one blog that contains unreviewed articles, and +> moderators can then add a tag that makes the article show up in the main +> news feed. There's nothing stopping someone submitting an article +> pre-tagged though. If you absolutely need to lock that down, you could +> have one blog with unreviewed articles in one subdirectory, and reviewers +> then move the file over to another subdirectory when they're ready to +> publish it. (This second subdirectory would be locked to prevent others +> from writing to it.) 
--[[Joey]] + +Also it would be good if the news page would keep maybe just the latest 10 entries with links to an archive that make it easy to browse to old entries by date. (Could have over a thousand news articles.) + +> The inline plugin allows setting up things like this. + +Plus users be able to post feedback to news items. If anonymous, they must be approved first. I'd prefer to not use normal "wiki" editor for feedback. + +Any thoughts or examples on this? Any links to examples of news sites or blogs with outside feedback using ikiwiki? + +Thanks --[[JeremyReed]] + diff --git a/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server..mdwn b/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server..mdwn new file mode 100644 index 000000000..4061a7348 --- /dev/null +++ b/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server..mdwn @@ -0,0 +1,11 @@ +Hi, +Installed ikiwiki on my ubuntu (04.10) box; after creating a blog according to your setup instructions I cannot edit files on the web interface, and I get this errer «The requested URL /~jean/blog/ikiwiki.cgi was not found on this server.» +I have no idea what to do (sorry for my ignorance) + +tia, + +> Make sure you have a `~/public_html/ikiwiki.cgi`. Your setup +> file should generate that via the `cgi_wrapper` option. +> +> Maybe you need to follow the [[tips/dot_cgi]] tip to make apache see it. +> --[[Joey]] diff --git a/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_1_d36ce6fab90e0a086ac84369af38d205._comment b/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_1_d36ce6fab90e0a086ac84369af38d205._comment new file mode 100644 index 000000000..f95972c4f --- /dev/null +++ b/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_1_d36ce6fab90e0a086ac84369af38d205._comment @@ -0,0 +1,16 @@ +[[!comment format=mdwn + username="jeanm" + subject="comment 1" + date="2010-06-19T13:35:37Z" + content=""" +OK, I followed the dot cgi tip and this error diappears, thanks a lot! So ubuntu doesn't provide a \"working out of the box\" ikiwiki. + +But I get a new error message now when trying to edit a page: + +Error: \"do\" parameter missing + +My plugins now: +add_plugins => [qw{goodstuff websetup comments blogspam 404 muse}], + + +"""]] diff --git a/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_2_5836bba08172d2ddf6a43da87ebb0332._comment b/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_2_5836bba08172d2ddf6a43da87ebb0332._comment new file mode 100644 index 000000000..0a544eeb1 --- /dev/null +++ b/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_2_5836bba08172d2ddf6a43da87ebb0332._comment @@ -0,0 +1,7 @@ +[[!comment format=mdwn + username="http://joey.kitenet.net/" + subject="do parameter missing" + date="2010-06-23T17:03:12Z" + content=""" +That's an unusual problem. Normally the url or form that calls ikiwiki.cgi includes a \"do\" parameter, like \"do=edit\". I'd have to see the site to debug why it is missing for you. 
+"""]] diff --git a/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_3_4eec15c8c383275db5401c8e3c2d9242._comment b/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_3_4eec15c8c383275db5401c8e3c2d9242._comment new file mode 100644 index 000000000..faf3ad31b --- /dev/null +++ b/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_3_4eec15c8c383275db5401c8e3c2d9242._comment @@ -0,0 +1,9 @@ +[[!comment format=mdwn + username="jeanm" + ip="81.56.145.104" + subject="do parameter missing" + date="2010-06-30T07:30:08Z" + content=""" +the site address is piaffer.org, with a link to blog just over the picture. +tia, +"""]] diff --git a/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_4_43ac867621efb68affa6ae2b92740cad._comment b/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_4_43ac867621efb68affa6ae2b92740cad._comment new file mode 100644 index 000000000..d8b516f5f --- /dev/null +++ b/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_4_43ac867621efb68affa6ae2b92740cad._comment @@ -0,0 +1,10 @@ +[[!comment format=mdwn + username="http://joey.kitenet.net/" + nickname="joey" + subject="comment 4" + date="2010-07-04T18:16:26Z" + content=""" +What is the muse plugin that you have enabled? I am not familiar with it. + +Apparently your ikiwiki is not seeing cgi parameters that should be passed to it. This appears to be some kind of web server misconfiguration, or possibly a broken ikiwiki wrapper or broken CGI.pm. +"""]] diff --git a/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_5_e098723bb12adfb91ab561cae21b492b._comment b/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_5_e098723bb12adfb91ab561cae21b492b._comment new file mode 100644 index 000000000..b832d64f4 --- /dev/null +++ b/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_5_e098723bb12adfb91ab561cae21b492b._comment @@ -0,0 +1,8 @@ +[[!comment format=mdwn + username="http://joey.kitenet.net/" + nickname="joey" + subject="do parameter missing" + date="2010-07-08T06:04:44Z" + content=""" +I just debugged this problem with someone else who was using ngix-fcgi. There was a problem with it not passing CGI environment variables properly. If you're using that, it might explain your problem. +"""]] diff --git a/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_6_101183817ca4394890bd56a7694bedd9._comment b/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_6_101183817ca4394890bd56a7694bedd9._comment new file mode 100644 index 000000000..25a4e8bae --- /dev/null +++ b/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_6_101183817ca4394890bd56a7694bedd9._comment @@ -0,0 +1,10 @@ +[[!comment format=mdwn + username="jeanm" + ip="81.56.145.104" + subject="comment 6" + date="2010-07-12T17:43:31Z" + content=""" +I'm using apache and mostly firefox. 
+I've tried some changes in my config but still the same problem, then I fell ill and unable to try anything more. Now I seem to be better and I will go back to the problem soon. +Thx +"""]] diff --git a/doc/forum/Upgrade_steps_from_RecentChanges_CGI_to_static_page.mdwn b/doc/forum/Upgrade_steps_from_RecentChanges_CGI_to_static_page.mdwn new file mode 100644 index 000000000..298ff49f1 --- /dev/null +++ b/doc/forum/Upgrade_steps_from_RecentChanges_CGI_to_static_page.mdwn @@ -0,0 +1,10 @@ +Where are the upgrade steps for RecentChanges change from CGI to static feed? +I run multiple ikiwiki-powered sites on multiple servers, but today I just upgraded one to 2.32.3. +Please have a look at +<http://bsdwiki.reedmedia.net/wiki/recentchanges.html> +Any suggestions? + +> There are no upgrade steps required. It does look like you need to enable +> the meta plugin to get a good recentchanges page though.. --[[Joey]] + +[[!meta date="2008-02-23 21:10:42 -0500"]] diff --git a/doc/forum/Various_ways_to_use_Subversion_with_ikiwiki.mdwn b/doc/forum/Various_ways_to_use_Subversion_with_ikiwiki.mdwn new file mode 100644 index 000000000..8eed30cd8 --- /dev/null +++ b/doc/forum/Various_ways_to_use_Subversion_with_ikiwiki.mdwn @@ -0,0 +1,23 @@ +[[!meta date="2007-08-17 03:54:10 +0000"]] + +I'm playing around with various ways that I can use subversion with ikiwiki. + +* Is it possible to have ikiwiki point to a subversion repository which is on a different server? The basic checkin/checkout functionality seems to work but there doesn't seem to be any way to make the post-commit hook work for a non-local server? + +> This is difficult to do since ikiwiki's post-commit wrapper expects to +> run on a machine that contains both the svn repository and the .ikiwiki +> state directory. However, with recent versions of ikiwiki, you can get +> away without running the post-commit wrapper on commit, and all you lose +> is the ability to send commit notification emails. + +> (And now that [[recentchanges]] includes rss, you can just subscribe to +> that, no need to worry about commit notification emails anymore.) + +* Is it possible / sensible to have ikiwiki share a subversion repository with other data (either completely unrelated files or another ikiwiki instance)? This works in part but again the post-commit hook seems problematic. + +--[[AdamShand]] + +> Sure, see ikiwiki's subversion repository for example of non-wiki files +> in the same repo. If you have two wikis in one repository, you will need +> to write a post-commit script that calls the post-commit wrappers for each +> wiki. --[[Joey]] diff --git a/doc/forum/an_alternative_approach_to_structured_data.mdwn b/doc/forum/an_alternative_approach_to_structured_data.mdwn new file mode 100644 index 000000000..6e6af8adb --- /dev/null +++ b/doc/forum/an_alternative_approach_to_structured_data.mdwn @@ -0,0 +1,63 @@ +## First Pass + +Looking at the discussion about [[todo/structured_page_data]], it looks a bit like folks are bogged down in figuring out what *markup* to use for structured page data, something I doubt that people will really agree on. And thus, little progress is made. + +I propose that, rather than worry about what the data looks like, that we take a similar approach +to the way Revision Control Systems are used in ikiwiki: a front-end + back-end approach. +The front-end would be a common interface, where queries are made about the structured data, +and there would be any number of back-ends, which could use whatever markup or format that they desired. 
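As a rough sketch of that division (the names below are made up for illustration, not an actual ikiwiki API): each back-end registers a lookup routine, and the front-end simply asks every registered back-end in turn whenever a field's value is wanted.

    # illustrative sketch only -- hypothetical names, not a real ikiwiki API
    our @backends;

    # a back-end registers a coderef: ($page, $fieldname) -> value or undef
    sub register_backend {
        my ($lookup) = @_;
        push @backends, $lookup;
    }

    # the front-end consults each back-end in turn for a field's value
    sub field_value {
        my ($page, $fieldname) = @_;
        foreach my $lookup (@backends) {
            my $value = $lookup->($page, $fieldname);
            return $value if defined $value;
        }
        return;
    }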
+ +To that purpose, I've written the [[plugins/contrib/field]] plugin for a possible front-end. +I called it "field" because each page could be considered a "record" where one could request the values of "fields" of that record. +The idea is that back-end plugins would register functions which can be called when the value of a field is desired. + +This is gone into in more depth on the plugin page itself, but I would appreciate feedback and improvements on the approach. +I think it could be really powerful and useful, especially if it becomes part of ikiwiki proper. + +--[[KathrynAndersen]] + +> It looks like an interesting idea. I don't have time right now to look at it in depth, but it looks interesting. -- [[Will]] + +> I agree such a separation makes some sense. But note that the discussion on [[todo/structured_page_data]] +> talks about associating data types with fields for a good reason: It's hard to later develop a good UI for +> querying or modifying a page's data if all the data has an implicit type "string". --[[Joey]] + +>> I'm not sure that having an implicit type of "string" is really such a bad thing. After all, Perl itself manages with just string and number, and easily converts from one to the other. Strong typing is generally used to (a) restrict what can be done with the data and/or (b) restrict how the data is input. The latter could be done with some sort of validated form, but that, too, could be decoupled from looking up and returning the value of a field. --[[KathrynAndersen]] + +## Second Pass + +I have written additional plugins which integrate with the [[plugins/contrib/field]] plugin to both set and get structured page data. + +* [[plugins/contrib/getfield]] - query field values inside a page using {{$*fieldname*}} markup +* [[plugins/contrib/ftemplate]] - like [[plugins/template]] but uses "field" data as well as passed-in data +* [[plugins/contrib/ymlfront]] - looks for YAML-format data at the front of a page; this is just one possible back-end for the structured data + +--[[KathrynAndersen]] + +> I'm not an IkiWiki committer ([[Joey]] is the only one I think) +> but I really like the look of this scheme. In particular, +> having `getfield` interop with `field` without being *part of* +> `field` makes me happy, since I'm not very keen on `getfield`'s +> syntax (i.e. "ugh, yet another mini-markup-language without a +> proper escaping mechanism"), but this way people can experiment +> with different syntaxes while keeping `field` for the +> behind-the-scenes bits. +> +>> I've started using `field` on a private site and it's working +>> well for me; I'll try to do some code review on its +>> [[plugins/contrib/field/discussion]] page. --s +> +> My [[plugins/contrib/album]] plugin could benefit from +> integration with `field` for photos' captions and so on, +> probably... I'll try to work on that at some point. +> +> [[plugins/contrib/report]] may be doing too much, though: +> it seems to be an variation on `\[[inline archive="yes"]]`, +> with an enhanced version of sorting, a mini version of +> [[todo/wikitrails]], and some other misc. I suspect it could +> usefully be divided up into discrete features? One good way +> to do that might be to shuffle bits of its functionality into +> the IkiWiki distribution and/or separate plugins, until there's +> nothing left in `report` itself and it can just go away. 
+> +> --[[smcv]] diff --git a/doc/forum/appear_if_you_are_login_or_not_in_a_page.mdwn b/doc/forum/appear_if_you_are_login_or_not_in_a_page.mdwn index e50eb4e1c..be9854a08 100644 --- a/doc/forum/appear_if_you_are_login_or_not_in_a_page.mdwn +++ b/doc/forum/appear_if_you_are_login_or_not_in_a_page.mdwn @@ -3,3 +3,34 @@ Hi, Can you give me a hint for showing if one user is logged or not. If user is logged, then I want to display the user name, as wikipedia or dokuwiki for example. Regards, Xan. + +> ikiwiki doesn't serve pages, so this can't be done inside ikiwiki. +> For certain kinds of authentication it might be possible anyway. +> For instance, if you're using [[plugins/httpauth]] exclusively and +> your server has PHP, you could put `<?php print("$REMOTE_USER"); +> ?>` in all the relevant ikiwiki [[templates]] and arrange for the +> generated HTML pages to get run through the PHP interpreter. The trick +> would work differently with other [[plugins/type/auth]] plugins, +> if at all. --[[Schmonz]] + +>> Thanks a lot, Xan. + +>>> Another possible trick would be to use some Javascript to make a +>>> "who am I?" AJAX request to the CGI (the CGI would receive the +>>> session cookie, if any, and be able to answer). Obviously, this +>>> wouldn't work for users who've disabled Javascript, but since it's +>>> non-essential, that's not so bad. You'd need to +>>> [[write_a_plugin|plugins/write]] to add a suitable CGI action, +>>> perhaps ?do=whoami, and insert the Javascript. --[[smcv]] + +>>>> It's an idea, but you're trading off a serious speed hit for a very +>>>> minor thing. --[[Joey]] + +>>>> Cool idea. A similar trick (I first saw it +>>>> [here](http://www.peej.co.uk/articles/http-auth-with-html-forms.html)) +>>>> could be used to provide a [[plugins/passwordauth]]-like login form +>>>> for [[plugins/httpauth]]. --[[Schmonz]] + +>>>>> I always assumed the entire reason someone might want to use the +>>>>> httpauth plugin is to avoid nasty site-specific login forms.. +>>>>> --[[Joey]] diff --git a/doc/forum/cleaning_up_discussion_pages_and_the_like.mdwn b/doc/forum/cleaning_up_discussion_pages_and_the_like.mdwn new file mode 100644 index 000000000..35ceae59b --- /dev/null +++ b/doc/forum/cleaning_up_discussion_pages_and_the_like.mdwn @@ -0,0 +1,11 @@ +For example in [[forum/ikiwiki__39__s_notion_of_time]], should one remove the +text about the implementation bug that has been fixed, or should it stay there, +for reference? --[[tschwinge]] + +> I have no problem with cleaning up obsolete stuff in the forum, tips, etc. +> --[[Joey]] + +That's also what I think: such discussions or comments on [[forum]] discussion +pages, or generally on all pages' [[Discussion]] subpages, can be removed if +either they're simply not valid / interesting / ... anymore, or if they've been +used to improve the *real* documentation. --[[tschwinge]] diff --git a/doc/forum/cutpaste.pm_not_only_file-local.mdwn b/doc/forum/cutpaste.pm_not_only_file-local.mdwn new file mode 100644 index 000000000..3563e3e01 --- /dev/null +++ b/doc/forum/cutpaste.pm_not_only_file-local.mdwn @@ -0,0 +1,14 @@ +I'd like to use the cutpaste plugin, but not only on a file-local basis: fileA +has \[[!cut id=foo text="foo"]], and fileB does \[[!absorb pagenames=fileA]], +and can then use \[[!paste id=foo]]. + +Therefore, I've written an [*absorb* directive / +plugin](http://www.thomas.schwinge.homeip.net/tmp/absorb.pm), which is meant to +absorb pages in order to get hold of their *cut* and *copy* directives' +contents. 
This does work as expected. But it also absorbs page fileA's *meta* +values, like a *meta title*, etc. How to avoid / solve this? + +Alternatively, do you have a better suggestion about how to achieve what I +described in the first paragraph? + +--[[tschwinge]] diff --git a/doc/forum/cutpaste.pm_not_only_file-local/comment_1_497c62f21fd1b87625b806407c72dbad._comment b/doc/forum/cutpaste.pm_not_only_file-local/comment_1_497c62f21fd1b87625b806407c72dbad._comment new file mode 100644 index 000000000..8cc724a72 --- /dev/null +++ b/doc/forum/cutpaste.pm_not_only_file-local/comment_1_497c62f21fd1b87625b806407c72dbad._comment @@ -0,0 +1,8 @@ +[[!comment format=mdwn + username="http://kerravonsen.dreamwidth.org/" + ip="60.241.8.244" + subject="field and getfield and ymlfront" + date="2010-08-12T02:33:54Z" + content=""" +Have you considered trying the [[plugins/contrib/field]] plugin, and its associated plugins? [[plugins/contrib/ymlfront]] can give you the source (\"cut\") and [[plugins/contrib/getfield]] and/or [[plugins/contrib/report]] can get you the value (\"paste\") including the values from other pages. +"""]] diff --git a/doc/forum/debian_backports_update_someone_please.mdwn b/doc/forum/debian_backports_update_someone_please.mdwn new file mode 100644 index 000000000..7102d12a1 --- /dev/null +++ b/doc/forum/debian_backports_update_someone_please.mdwn @@ -0,0 +1,18 @@ +I'm just in the process of deploying ikiwiki and I'd love to use it in the html5 mode instead of in XHTML. Any chance that the ikiwiki's .deb in debian backports will be updated any time soon? + +> Formerer does a good job keeping the backport up-to-date with whatever is in Debian testing. +> Which is the policy of what Backports should contain. So, I just need to stop releasing ikiwiki +> for 2 weeks. :) --[[Joey]] + +>> And are there any chances you doing it... or rather not doing it? + +>>> Sure, I'm busily not doing it right now. Should reach testing in 3 +>>> days. I generally schedule things so a new ikiwiki reaches testing +>>> every 2 weeks to month. Getting important new features and bugfixes out +>>> can take priority though. --[[Joey]] + +>>>> Great! Thanks. + +>>>> Still not available in the backports; did you break the silence on the wire and got back to work [[Joey]]? + +>>>> I was blinded by my stupidity... thanks! diff --git a/doc/forum/double_forward_slash___39____47____47____39___in_the_address_bar.mdwn b/doc/forum/double_forward_slash___39____47____47____39___in_the_address_bar.mdwn new file mode 100644 index 000000000..b5fb2aa18 --- /dev/null +++ b/doc/forum/double_forward_slash___39____47____47____39___in_the_address_bar.mdwn @@ -0,0 +1,3 @@ +Why do all the pages are preceded with a double forward slash in the address bar... ie. http://example.org//ikiwiki/pagespec/ .. maybe someone knows? + +> Sorted, base url in the .setup file had the unnecessary '/' suffix. diff --git a/doc/forum/editing_a_comment.mdwn b/doc/forum/editing_a_comment.mdwn new file mode 100644 index 000000000..8bddab3fb --- /dev/null +++ b/doc/forum/editing_a_comment.mdwn @@ -0,0 +1 @@ +Is it possible to edit a comment? I did not find any button for it. diff --git a/doc/forum/editing_the_style_sheet.mdwn b/doc/forum/editing_the_style_sheet.mdwn new file mode 100644 index 000000000..b4aa8c89b --- /dev/null +++ b/doc/forum/editing_the_style_sheet.mdwn @@ -0,0 +1,18 @@ +[[!meta date="2006-12-29 04:19:51 +0000"]] + +It would be nice to be able to edit the stylesheet by means of the cgi. Or is this possible? I wasn't able to achieve it. 
+Ok, that's my last 2 cents for a while. --[Mazirian](http://mazirian.com) + +> I don't support editing it, but if/when ikiwiki gets [[todo/fileupload]] support, +> it'll be possible to upload a style sheet. (If .css is in the allowed +> extensions list.. no idea how safe that would be, a style sheet is +> probably a great place to put XSS attacks and evil javascript that would +> be filtered out of any regular page in ikiwiki). --[[Joey]] + +>> I hadn't thought of that at all. It's a common feature and one I've +>> relied on safely, because the wikis I am maintaining at the moment +>> are all private and restricted to trusted users. Given that the whole +>> point of ikiwiki is to be able to access and edit via the shell as +>> well as the web, I suppose the feature doesn't add a lot. By the +>> way, the w3m mode is brilliant. I haven't tried it yet, but the idea +>> is great. diff --git a/doc/forum/ever-growing_list_of_pages.mdwn b/doc/forum/ever-growing_list_of_pages.mdwn new file mode 100644 index 000000000..9920e34bb --- /dev/null +++ b/doc/forum/ever-growing_list_of_pages.mdwn @@ -0,0 +1,29 @@ +What is everyone's idea about the ever-growing list of pages in bugs/ etc.? +Once linked to `done`, they're removed from the rendered [[bugs]] page -- but +they're still present in the repository. + +Shouldn't there be some clean-up at some point for those that have been +resolved? Or should all of them be kept online forever? + +--[[tschwinge]] + +> To answer a question with a question, what harm does having the done bugs +> around cause? At some point in the future perhaps the number of done pages +> will be large enough to be a time or space concern. Do you think we've +> reached that point now? One advantage of having them around is that people +> running older versions of the Ikiwiki software may find the page explaining +> that the bug is fixed if they perform a search. -- [[Jon]] + +> I like to keep old bugs around. --[[Joey]] + +So, I guess it depends on whether you want to represent the development of the +software (meaning: which bugs are open, which are fixed) *(a)* in a snapshot of +the repository (a checkout; that is, what you see rendered on +<http://ikiwiki.info/>), or *(b)* if that information is to be contained in the +backing repository's revision history only. Both approaches are valid. For +people used to using Git for accessing a project's history, *(b)* is what +they're used to, but for those poor souls ;-) that only use a web browser to +access this database, *(a)* is the more useful approach indeed. For me, using +Git, it is a bit of a hindrance, as, when doing a full-text search for a +keyword on a checkout, I'd frequently hit pages that reported a bug, but are +tagged `done` by now. --[[tschwinge]] diff --git a/doc/forum/formating:_how_to_align_text_to_the_right.mdwn b/doc/forum/formating:_how_to_align_text_to_the_right.mdwn new file mode 100644 index 000000000..2b56bd70b --- /dev/null +++ b/doc/forum/formating:_how_to_align_text_to_the_right.mdwn @@ -0,0 +1,15 @@ +As in the title: how do I align text to the right? + +> Add to your local.css a class that aligns text to the right: + + .alignright { text-align: right; } + +> And then you just use `<span class="alignright">` around +> other html. +> +> You can refine that, and allow right-aligning markdowned text +> by using the [[ikiwiki/directive/template]] +> directive, with a template that contains the html. The +> [[templates/note]] template does something similar. --[[Joey]] + +>> Thanks!
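
For reference, a minimal sketch of the template approach (the `alignright` template name and its contents are just an illustration, not something shipped with ikiwiki): create a page `templates/alignright.mdwn` containing

    <div class="alignright">
    <TMPL_VAR text>
    </div>

and then right-align markdown text in a page with

    \[[!template id=alignright text="""some _markdown_ text"""]]

This reuses the `.alignright` class from local.css above.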
diff --git a/doc/forum/google_openid_broken__63__.mdwn b/doc/forum/google_openid_broken__63__.mdwn new file mode 100644 index 000000000..96ba2d791 --- /dev/null +++ b/doc/forum/google_openid_broken__63__.mdwn @@ -0,0 +1,79 @@ +Now that google supports using their profiles as OpenIDs, that can be used +directly to sign into ikiwiki. Just use, for example, +<http://www.google.com/profiles/joeyhess> . Tested and it works. --[[Joey]] + +> This seems to work fine if you use the profile directly as an OpenID. It doesn't seem to work with delegation. From what I can see, this is a deliberate decision by Google for security reasons. See the response [here](http://groups.google.com/group/google-federated-login-api/browse_thread/thread/825067789537568c/23451a68c8b8b057?show_docid=23451a68c8b8b057). -- [[Will]] + +## historical discussion + +When I log in to this wiki (or ours) via Google's OpenID, I get this error: + +Error: OpenID failure: no_identity_server: The provided URL doesn't declare its OpenID identity server. + +Any idea how to fix this?? + +> Google is [doing things with openid that are not in the spec](http://googledataapis.blogspot.com/2008/10/federated-login-for-google-account.html) +> and it's not clear to me that they intend regular openid to work at all. +> What is your google openid URL so I can take a look at the data they are +> providing? --[[Joey]] + + +http://openid-provider.appspot.com/larrylud + +> I've debugged this some and filed +> <https://rt.cpan.org/Ticket/Display.html?id=48728> on the Openid perl +> module. It's a pretty easy fix, so I hope upstream will fix it quickly. +> --[[Joey]] + +>> A little more information here: I'm using that same openid provider at the moment. Note that +>> that provider isn't google - it is someone using the google API to authenticate. I normally have it +>> set up as a redirect from my home page (which means I can change providers easily). + + <link rel="openid.server" href="http://openid-provider.appspot.com/will.uther"> + <link rel="openid.delegate" href="http://openid-provider.appspot.com/will.uther"> + +>> In that mode it works (I used it to log in to make this edit). However, when I try the openid +>> URL directly, it doesn't work. I think there is something weird with re-direction. I hope this +>> isn't a more general security hole. +>> -- [[Will]] + +---- + +So, while the above bug will probably get fixed sooner or later, +the best approach for those of you needing a google openid now is +to use gmail. + + +Just a note that someone has apparently figured out how to use a google +openid, and not a third-party provider either, to edit this site. +The openid is +<https://www.google.com/accounts/o8/id?id=AItOawltlTwUCL_Fr1siQn94GV65-XwQH5XSku4> +(what a mouthful!), and I don't know who that is or how to use it since it +points to a fairly useless xml document, rather than a web page. --[[Joey]] + +> That string is what's received via the discovery protocol. The user logging in with a Google account is not supposed to write that when logging in, but rather <https://www.google.com/accounts/o8/id>. The OpenID client library will accept that and redirect the user to a sign in page, which will return that string as the OpenID. It's not really usable as an identifier for edits and whatnots, but an alternative would be to use the attribute exchange extension to get the email address and display that. See <http://code.google.com/apis/accounts/docs/OpenID.html#Parameters>.
+ +> Yahoo's OpenID implementation works alike, but I haven't looked at it as much. It uses <https://me.yahoo.com/> to receive the endpoint. + +> I've added buttons that submit the two above URLs for logging in with a Google and Yahoo OpenID, respectively, to my locally changed OpenID login plugin. + +> Using the Google profile page as the OpenID is really orthogonal to the above. --[[kaol]] + +>> First, I don't accept that the openid google returns from their +>> generic signin url *has* to be so freaking ugly. For contrast, +>> look at the openid you log in as if you use the yahoo url. +>> <https://me.yahoo.com/joeyhess#35f22>. Nice and clean, now +>> munged by ikiwiki to "joeyhess [me.yahoo.com]". +>> +>> Displaying email addresses is not really an option, because ikiwiki +>> can't leak user email addresses like that. Displaying nicknames or +>> usernames is, see [[todo/Separate_OpenIDs_and_usernames]]. +>> +>> It would probably be good if the openid plugin could be configured with +>> a list of generic openid urls, so it can add quick login buttons using +>> those urls. +>> +>> The ugly google url will still be exposed here and there where +>> a unique user id is needed. That can be avoided by not using the generic +>> <https://www.google.com/accounts/o8/id>, but instead your own profile +>> like <http://www.google.com/profiles/joeyhess>. --[[Joey]] diff --git a/doc/forum/how_do_I_revert_edits_in_the_web_mode__63__.mdwn b/doc/forum/how_do_I_revert_edits_in_the_web_mode__63__.mdwn new file mode 100644 index 000000000..7686a7a08 --- /dev/null +++ b/doc/forum/how_do_I_revert_edits_in_the_web_mode__63__.mdwn @@ -0,0 +1,35 @@ +Puzzled a bit :-/ + +> There is no explicit interface for reverting edits. Most of us use `git revert`. --[[Joey]] + +>> That's a blow; I was planning on appointing no techies to keep law and order on our pages :-/ Is there a plugin or at least a plan to add such a 'in demand' feature? + +>>> A lot of things complicate adding that feature to the web interface. +>>> +>>> First, ikiwiki happily uses whatever the VCS's best of breed web +>>> history interface is. (ie, viewvcs, gitweb). To allow reverting +>>> past the bottom of the RecentChanges page, it would need to have its +>>> own history browser. Not sure I want to go there. +>>> +>>> And the mechanics of handling reverting can quickly get complex. +>>> Web reverting should only allow users to revert things they can edit, +>>> but reverting a whole commit in git might touch multiple files. +>>> Some files may not be editable over the web at all. (The +>>> [[tips/untrusted_git_push]] also has to deal with those issues.) +>>> Finally, a revert can fail with a conflict. The revert could touch +>>> multiple files, and multiple ones could conflict. The conflict may +>>> involve non-page files that can't be diffed. So an interface for +>>> resolving such a conflict could be hard. +>>> +>>> Probably web-based reverting would need to be limited to reverting +>>> single file changes, not whole commits, and not having very good +>>> conflict handling. And maybe only being accessible for changes +>>> still visible on RecentChanges. With those limitations, it's certianly +>>> doable (as a plugin even), but given how excellent `git revert` is in +>>> comparison, I have not had a real desire to do so. --[[Joey]] + +>>>> Web edits are single-file anyway, so I wouldn't expect web reverts +>>>> to handle the multi-file case. OTOH, I've sometimes wished ikiwiki +>>>> had its own history browser (somewhere down my todo list). 
--[[schmonz]] + +>>>> Yup, having a possibility to revert a single file would suffice. diff --git a/doc/forum/how_do_I_translate_a_TWiki_site.mdwn b/doc/forum/how_do_I_translate_a_TWiki_site.mdwn new file mode 100644 index 000000000..5bfdbb86c --- /dev/null +++ b/doc/forum/how_do_I_translate_a_TWiki_site.mdwn @@ -0,0 +1,44 @@ +[[!meta date="2006-12-19 09:56:21 +0000"]] + +# Excellent - how do I translate a TWiki site? + +I just discovered ikiwiki quite by chance, I was looking for a console/terminal +menu system and found pdmenu. So pdmenu brought me to here and I've found ikiwiki! +It looks as if it's just what I've been wanting for a long time. I wanted something +to create mostly text web pages which, as far as possible, have source which is human +readable or at least in a standard format. ikiwiki does this twice over by using +markdown for the source and producing static HTML from it. + +I'm currently using TWiki and have a fair number of pages in that format, does +anyone have any bright ideas for translating? I can knock up awk scripts fairly +easily, perl is possible (but I'm not strong in perl). + +> Let us know if you come up with something to transition from the other +> format. Another option would be writing a ikiwiki plugin to support the +> TWiki format. --[[Joey]] + +> Jamey Sharp and I have a set of scripts in progress to convert other wikis to ikiwiki, including history, so that we can migrate a few of our wikis. We already have support for migrating MoinMoin wikis to ikiwiki, including conversion of the entire history to Git. We used this to convert the [XCB wiki](http://xcb.freedesktop.org/wiki/) to ikiwiki; until we finalize the conversion and put the new wiki in place of the old one, you can browse the converted result at <http://xcb.freedesktop.org/ikiwiki>. We already plan to add support for TWiki (including history, since you can just run parsecvs on the TWiki RCS files to get Git), so that we can convert the [Portland State Aerospace Society wiki](http://psas.pdx.edu) (currently in Moin, but with much of its history in TWiki, and with many of its pages still in TWiki format using Jamey's TWiki format for MoinMoin). +> +> Our scripts convert by way of HTML, using portions of the source wiki's code to render as HTML (with some additional code to do things like translate MoinMoin's `\[[TableOfContents]]` to ikiwiki's `\[[!toc ]]`), and then using a modified [[!cpan HTML::WikiConverter]] to turn this into markdown and ikiwiki. This produces quite satisfactory results, apart from things that don't have any markdown equivalent and thus remain HTML, such as tables and definition lists. Conversion of the history occurs by first using another script we wrote to translate MoinMoin history to Git, then using our git-map script to map a transformation over the Git history. +> +> We will post the scripts as soon as we have them complete enough to convert our wikis. +> +> -- [[JoshTriplett]] + +>> Thanks for an excellent Xmas present, I will appreciate the additional +>> users this will help switch to ikiwiki! --[[Joey]] + + +>> Sounds great indeed. Learning from [here](http://www.bddebian.com/~wiki/AboutTheTWikiToIkiwikiConversion/) that HTML::WikiConverter needed for your conversion was not up-to-date on Debian I have now done an unofficial package, including your proposed Markdown patches, apt-get'able at <pre>deb http://debian.jones.dk/ sid wikitools</pre> +>> -- [[JonasSmedegaard]] + + +>>I see the "We will post the scripts ...." was committed about a year ago. 
A current site search for "Moin" does not turn them up. Any chance of an appearance in the near (end of year) future? +>> +>> -- [[MichaelRasmussen]] + +>>> It appears the scripts were never posted? I recently imported my Mediawiki site into Iki. If it helps, my notes are here: <http://iki.u32.net/Mediawiki_Conversion> --[[sabr]] + +>>>>> The scripts have been posted now, see [[joshtriplett]]'s user page, +>>>>> and I've pulled together all ways I can find to [[convert]] other +>>>>> systems into ikiwiki. --[[Joey]] diff --git a/doc/forum/how_to_add_post_titles_in_ikiwiki_blog__63__.mdwn b/doc/forum/how_to_add_post_titles_in_ikiwiki_blog__63__.mdwn new file mode 100644 index 000000000..68eb06c4c --- /dev/null +++ b/doc/forum/how_to_add_post_titles_in_ikiwiki_blog__63__.mdwn @@ -0,0 +1,28 @@ +Look at these two blogs: + +1) http://ciffer.net/~svend/blog/ + +2) http://upsilon.cc/~zack/blog/ + +Well, i set up successfully my blog (i am using inline function in a wiki page) but i have manually to insert blog pos titles and the result is that of blog #2. +Instead i would like to have blog post titles automatically inserted like blog #1 (and they are links too! I want them that way). +I looked in git repo of the two blogs but i couldn't find the answer. +Any help would be really appreciated. + +Thanks! + +Raf + +> Either name the blog post files with the full title you want them to +> have, or use [[ikiwiki/directive/meta]] title to set the title of a blog post. +> +> \[[!meta title="this is my blog post"]] +> +> Either way, the title will automatically be displayed, clickable, at the top. +> (zack has hacked his templates not to do that). --[[Joey]] + +>> Thanks for your answer.<br/> +>> I looked in the [templates](http://git.upsilon.cc/cgi-bin/gitweb.cgi?p=zack-homepage.git;a=tree;f=templates;h=824100e62a06cee41b582ba84fcb9cdd982fe4be;hb=HEAD) folder of zack but couldn't see any hack of that kind.<br/> +>> Anyway, I didn't hack my template...<br/> +>> I will follow your suggestion of using \[[ikiwiki/directive/meta]] title to set titles.<br/> +>> Thanks a lot. --Raf diff --git a/doc/forum/how_to_enable_multimarkdown__63__.mdwn b/doc/forum/how_to_enable_multimarkdown__63__.mdwn new file mode 100644 index 000000000..208aadcb0 --- /dev/null +++ b/doc/forum/how_to_enable_multimarkdown__63__.mdwn @@ -0,0 +1,9 @@ +I enabled multimarkdown in my setup file but I get this message 'remote: multimarkdown is enabled, but Text::MultiMarkdown is not installed'. +I also installed multimarkdown-git for my distro (archlinux), which should take care of installing all required perl modules, I believe. +What am I missing? + +Thanks. + +> You are apparently still missing the [[!cpan Text::MultiMarkdown]] +> perl module. Not being familiar with arch linux, I don't know what +> multimarkdown-git is, so I can't say more than that.. --[[Joey]] diff --git a/doc/forum/how_to_enable_multimarkdown__63__/comment_1_037f858c4d0bcbb708c3efd264379500._comment b/doc/forum/how_to_enable_multimarkdown__63__/comment_1_037f858c4d0bcbb708c3efd264379500._comment new file mode 100644 index 000000000..6045a4a3f --- /dev/null +++ b/doc/forum/how_to_enable_multimarkdown__63__/comment_1_037f858c4d0bcbb708c3efd264379500._comment @@ -0,0 +1,14 @@ +[[!comment format=mdwn + username="https://www.google.com/accounts/o8/id?id=AItOawlzADDUvepOXauF4Aq1VZ4rJaW_Dwrl6xE" + nickname="Dário" + subject="comment 1" + date="2010-07-15T15:37:31Z" + content=""" +multimarkdown-git is a package build that fetches the git version of multimarkdown. 
+It should install Text::Markdown I believe. +I tried to install it by hand on the cpan command line but it didn't work either: +perl -MCPAN -e shell +install Text::MultiMarkdown + +says couldn't run make file or something. +"""]] diff --git a/doc/forum/how_to_enable_multimarkdown__63__/comment_2_b7d512a535490dabf8d6ce55439741c7._comment b/doc/forum/how_to_enable_multimarkdown__63__/comment_2_b7d512a535490dabf8d6ce55439741c7._comment new file mode 100644 index 000000000..804b71c67 --- /dev/null +++ b/doc/forum/how_to_enable_multimarkdown__63__/comment_2_b7d512a535490dabf8d6ce55439741c7._comment @@ -0,0 +1,8 @@ +[[!comment format=mdwn + username="http://joey.kitenet.net/" + nickname="joey" + subject="comment 2" + date="2010-07-16T19:44:55Z" + content=""" +All I can tell you is that, if multimarkdown is correctly installed (ie, if `perl -e 'use Text::MultiMarkdown'` runs successfully), ikiwiki can use it. +"""]] diff --git a/doc/forum/how_to_load_an_external_page_and_still_have_it_under_ikiwiki_template.mdwn b/doc/forum/how_to_load_an_external_page_and_still_have_it_under_ikiwiki_template.mdwn new file mode 100644 index 000000000..a747911a5 --- /dev/null +++ b/doc/forum/how_to_load_an_external_page_and_still_have_it_under_ikiwiki_template.mdwn @@ -0,0 +1,3 @@ +OK, probably title is bit confusing. Basically I'd like to be able to keep my left hand side menu, which is part of the template, and at the same time load let's say forum on the right hand side, which sits on a separate domain. Is it possible then to construct template that for some special links it runs as lets say in *frameset* mode? + +> I think I'll have to use [[ikiwiki/directive/pagetemplate]] and this <http://stackoverflow.com/questions/153152/resizing-an-iframe-based-on-content> solution diff --git a/doc/forum/how_to_login_as_admin.mdwn b/doc/forum/how_to_login_as_admin.mdwn new file mode 100644 index 000000000..807f82501 --- /dev/null +++ b/doc/forum/how_to_login_as_admin.mdwn @@ -0,0 +1,18 @@ +I even managed to set up ikiwiki so it works fine with git; but how on earth do +I log in as an administrator? In the .setup file the admin user is set to +'zimek' but when I go and register 'zimek' on the web it appears as normal user +not the administrator. What am I missing? + +> That's really all there is to it. The [[automatic_setup|setup]] script +> registers the admin user for you before the wiki goes live. If you didn't +> use it, registering the right account name will get you the admin account. +> +> The name is case sensative, perhaps you really spelled one of them `Zimek`? +> +> Or maybe you're the admin, and don't know it? Everything looks the same for the admin, +> except they can edit even locked pages, and can access the websetup interface from their +> Preferences page, if you have that plugin enabled. --[[Joey]] + +>> Maybe I'm indeed. I know that I've disabled all the plugins while installing ikiwiki. Checking it now ;-) + +>> Yup, I'm the God of my ikiwiki. (Thanks) diff --git a/doc/forum/how_to_setup_ikiwiki_on_a_remote_host.mdwn b/doc/forum/how_to_setup_ikiwiki_on_a_remote_host.mdwn new file mode 100644 index 000000000..1c0f8f561 --- /dev/null +++ b/doc/forum/how_to_setup_ikiwiki_on_a_remote_host.mdwn @@ -0,0 +1,35 @@ +Hi all! +I really like ikiwiki and i tested it on my local machine but i have one question that i can't answer reading documentation (my fault of course)... +I have an account and some space on a free hosting service. 
+Now, I want to put my ikiwiki on this remote web space so that I can browse it from wherever I want. +I have my source dir and my git dir on my local machine. +How can I upload my ikiwiki to the remote host and manage it via git as I do when I test it locally? +Where is this specified? Where can I find documentation about it? + +Thanks in advance! + +Pab + +> There are several ways to accomplish this, depending on what you really +> want to do. +> +> If your goal is to continue generating the site locally, but then +> transfer it to the remote host for serving, you could use the +> [[plugins/rsync]] plugin. +> +> If your goal is to install and run the ikiwiki software on the remote host, +> then you would follow a similar path to the ones described in these tips: +> [[tips/nearlyfreespeech]] [[tips/DreamHost]]. Or even [[install]] ikiwiki +> from a regular package if you have that kind of access. Then you could +> push changes from your local git to git on the remote host to update the +> wiki. [[tips/Laptop_wiki_with_git]] explains one way to do that. +> --[[Joey]] + +Thanks a lot for your answer. +The rsync plugin would be perfect but... how would I manage blog posts? +I mean... is it possible to manage an ikiwiki blog too with the rsync plugin in the way you told me? --Pab + +> If you want to allow people to make comments on your blog, no, the rsync plugin will not help, since it will upload a completely static site where nobody can make comments. Comments require a full IkiWiki setup with CGI enabled, so that people can add content (comments) from the web. --[[KathrynAndersen]] + +Ok, I understand, thanks. +Is there any hosting service that permits a full installation of ikiwiki, or am I forced to get a VPS or to maintain a personal server for that? --Pab diff --git a/doc/forum/html_source_pages_in_version_3.20100704.mdwn b/doc/forum/html_source_pages_in_version_3.20100704.mdwn new file mode 100644 index 000000000..7a620fd57 --- /dev/null +++ b/doc/forum/html_source_pages_in_version_3.20100704.mdwn @@ -0,0 +1,8 @@ +Is this different from using the html/rawhtml plugins? + +> I suppose you're talking about this: + + * po: Added support for .html source pages. (intrigeri) + +> That means the [[plugins/po]] plugin is able to translate html pages +> used by the [[plugins/html]] plugin. --[[Joey]] diff --git a/doc/forum/ikiwiki_and_big_files.mdwn b/doc/forum/ikiwiki_and_big_files.mdwn new file mode 100644 index 000000000..cd41d9fce --- /dev/null +++ b/doc/forum/ikiwiki_and_big_files.mdwn @@ -0,0 +1,102 @@ +My website has 214 hand-written html files, 1500 pictures and a few, err sorry, 114 +video files. All this takes around 1.5 GB of disk space at the moment. +Plain html files take 1.7 MB and fit naturally into git. + +But what about the picture and video files? + +Pictures are mostly static and rarely need to be edited after the first upload, +so wasting a megabyte or two after an edit while having them in git doesn't really matter. +Videos on the other hand are quite large, from megabytes to hundreds. Sometimes +I re-encode them from the original source with better codec parameters and just +replace the files under the html root so they are accessible from the same URL. +So having a way to delete a 200 MB file and upload a new one with the same name and access URL +is what I need. And it appears git has trouble erasing commits from history, or requires +some serious gitfoo and good backups of the original repository. + +So which ikiwiki backend could handle piles of large binary files?
Or should I go for a separate +data/binary blob directory next to ikiwiki content? + +Further complication is my intention to keep URL compatibility with old handwritten and ikiwiki +based site. Sigh, tough job but luckily just a hobby. + +[-Mikko](http://mcfrisk.kapsi.fi) + +ps. here's how to calculate space taken by html, picture and video files: + + ~/www$ unset sum; for size in $( for ext in htm html txt xml log; \ + do find . -iname "*$ext" -exec stat -c "%s" \{\} \; ; done | xargs ); \ + do sum=$(( $sum + $size )); done ; echo $sum + 1720696 + ~/www$ unset sum; for size in $( for ext in jpg gif jpeg png; \ + do find . -iname "*$ext" -exec stat -c "%s" \{\} \; ; done | xargs ); \ + do sum=$(( $sum + $size )); done ; echo $sum + 46032184 + ~/www$ unset sum; for size in $( for ext in avi dv mpeg mp4; \ + do find . -iname "*$ext" -exec stat -c "%s" \{\} \; ; done | xargs ); \ + do sum=$(( $sum + $size )); done ; echo $sum + 1351890888 + +> One approach is to use the [[plugins/underlay]] plugin to +> configure a separate underlay directory, and put the large +> files in there. Those files will then be copied to the generated +> wiki, but need not be kept in revision control. (Or could be +> revision controlled in a separate repository -- perhaps one using +> a version control system that handles large files better than git; +> or perhaps one that you periodically blow away the old history to +> in order to save space.) +> +> BTW, the `hardlink` setting is a good thing to enable if you +> have large files, as it saves both disk space and copying time. +> --[[Joey]] + +Can underlay plugin handle the case that source and destination directories +are the same? I'd rather have just one copy of these underlay files on the server. + +> No, but enabling hardlinks accomplishes the same effect. --[[Joey]] + +And did I goof in the setup file since I got this: + + $ ikiwiki -setup blog.setup -rebuild --verbose + Can't use string ("/home/users/mcfrisk/www/blog/med") as an ARRAY ref while + "strict refs" in use at + /home/users/mcfrisk/bin/share/perl/5.10.0/IkiWiki/Plugin/underlay.pm line 41. + $ grep underlay blog.setup + add_plugins => [qw{goodstuff websetup comments blogspam html sidebar underlay}], + underlaydir => '/home/users/mcfrisk/bin/share/ikiwiki/basewiki', + # underlay plugin + # extra underlay directories to add + add_underlays => '/home/users/mcfrisk/www/blog/media', + $ egrep "(srcdir|destdir)" blog.setup + srcdir => '/home/users/mcfrisk/blog', + destdir => '/home/users/mcfrisk/www/blog', + # allow symlinks in the path leading to the srcdir (potentially insecure) + allow_symlinks_before_srcdir => 1, + # directory in srcdir that contains directive descriptions + +-Mikko + +> The plugin seems to present a bad default value in the setup file. +> (Fixed in git.) A correct configuration would be: + + add_underlays => ['/home/users/mcfrisk/www/blog/media'], + +Umm, doesn't quite fix this yet: + + $ ikiwiki -setup blog.setup -v + Can't use an undefined value as an ARRAY reference at /home/users/mcfrisk/bin/share/perl/5.10.0/IkiWiki + /Plugin/underlay.pm line 44. 
+ + $ grep underlay blog.setup + add_plugins => [qw{goodstuff websetup comments blogspam html sidebar underlay}], + underlaydir => '/home/users/mcfrisk/bin/share/ikiwiki/basewiki', + # underlay plugin + # extra underlay directories to add + add_underlays => ['/home/users/mcfrisk/www/blog/media'], + $ ikiwiki --version + ikiwiki version 3.20091032 + +-Mikko + +> Yeah, I've fixed that in git, but you can work around it with this: +> --[[Joey]] + + templatedirs => [], diff --git a/doc/forum/ikiwiki_development_environment_tips.mdwn b/doc/forum/ikiwiki_development_environment_tips.mdwn new file mode 100644 index 000000000..91ccc6d6e --- /dev/null +++ b/doc/forum/ikiwiki_development_environment_tips.mdwn @@ -0,0 +1,44 @@ +I haven't settled on a comfortable/flexible/quick development environment for hacking on ikiwiki. The VM I host my web pages on is not fast enough to use for RAD and ikiwiki. For developing plugins, it seems a bit heavy-weight to clone the entire ikiwiki repository. I haven't managed to get into a habit of running a cloned version of ikiwiki from its own dir, rather than installing it (if that's even possible). The ikiwiki site source (source ./doc) is quite large and not a great testbed for hacking (e.g. if you are working on a plugin you need a tailored test suite for that plugin). + +Does anyone have a comfortable setup or tips they would like to share? -- [[Jon]] + +> I've just been setting `libdir` in an existing wiki's setup file. When the plugin's in a decent state, I copy it over to a git checkout and commit. For the plugins I've been working on (auth and VCS), this has been just fine. Are you looking for something more? --[[schmonz]] + +>> I think this suffers from two problems. Firstly, unless you are tracking git +>> master in your existing wiki, there's the possibility that your plugin will +>> not work with a more modern version of ikiwiki (or that it would benefit +>> from using a newly added utility subroutine or similar). + +>>> Unlikely. I don't make changes to the plugin interface that break +>>> existing plugins. (Might change non-exported `IkiWiki::` things +>>> from time to time.) --[[Joey]] + +>> Second, sometimes I +>> find that even writing a plugin can involve making minor changes outside of +>> the plugin code (bug fixes, or moving functionality about). So, I think +>> having some kind of environment built around a git checkout is best. +>> +>> However, this does not address the issue of the tedium of writing/maintaining a +>> setup file for testing things. +>> +>> I think I might personally benefit from a more consistent environment (I +>> move from machine-to-machine frequently). -- [[Jon]] + +> If you set `libdir` to point to a checkout of ikiwiki's git repository, +> it will override use of the installed version of ikiwiki, so ikiwiki will +> immediately use any changed or new `.pm` files (with the exception of +> IkiWiki.pm), and you can use git to manage it all without an installation +> step. If I am modifying IkiWiki.pm, I generally symlink it from +> `/usr/share/perl5/IkiWiki.pm` to my git repository. Granted, not ideal. +> +> I often use my laptop's local version of my personal wiki for testing. +> It has enough stuff that I can easily test most things, and if I need +> a test page I just dump test cases on the sandbox. I can make +> any changes necessary during testing and then `git reset --hard +> origin/master` to avoid publishing them.
+> +> If the thing I'm testing involves templates, or underlays, +> I will instead use ikiwiki's `docwiki.setup` for testing, modifying it as +> needed, since it is preconfigured to use the templates and underlays +> from ikiwiki's source repository. +> --[[Joey]] diff --git a/doc/forum/ikiwiki_over_database__63__.wiki b/doc/forum/ikiwiki_over_database__63__.wiki index ff123e98d..a70f9c989 100644 --- a/doc/forum/ikiwiki_over_database__63__.wiki +++ b/doc/forum/ikiwiki_over_database__63__.wiki @@ -1 +1,11 @@ Is there here any possibility to modifying ikiwiki (via plugin) for store pages in database. I'm thinking about storing pages in sqlite or mysql for serving it much faster. The idea is from sputnik.org [http://sputnik.freewisdom.org/] but with perl ;-). Could we integrate the sputnik code in ikiwiki as a solution? + +----- + +ikiwiki generates static pages in a filesystem. It's responsible for editing and regenerating them, but they're served by any old web server. If you go to the trouble of stuffing the generated pages into a database, you'll need to go to further trouble to serve them back out somehow: write your own web server, perhaps, or a module for a particular web server. Either way you'll have sacrificed ikiwiki's interoperability, and it's not at all clear (since you're adding, in the best case, one layer of indirection reading the generated files) you'll have gained any improved page-serving performance. If it's source pages you want to store in a database, then you lose the ability to do random Unixy things to source pages, including managing them in a revision control system. + +Static HTML pages in a filesystem and the ability to do random Unixy things are two of the uniquely awesome features of ikiwiki. It's probably possible to do what you want, but it's unlikely that you really want it. I'd suggest you either get to know ikiwiki better, or choose one of the many wiki implementations that already works as you describe. --[[Schmonz]] + +--- + +Thanks, [[Schmonz]]. You clarify me much things,.... Xan. diff --git a/doc/forum/ikiwiki_vim_syntaxfile.mdwn b/doc/forum/ikiwiki_vim_syntaxfile.mdwn new file mode 100644 index 000000000..e6942cd2d --- /dev/null +++ b/doc/forum/ikiwiki_vim_syntaxfile.mdwn @@ -0,0 +1,26 @@ +See the new syntax file [[here|tips/vim_syntax_highlighting]]. It fixes both of +the problems reported below. + +---- + +Hi all, + +I'm teaching myself how to write syntax files for vim by fixing several issues +(and up to certain extent, taking over the maintenance) of the vim syntax +(highlighting) file for ikiwiki. + +I'd like you to document here which problems you have found, so I can hunt them +and see if I can fix them. + +## Problems Found + + * Arguments of directives with a value of length 1 cause the following text to + be highlighted incorrectly. Example: + + [[!directive param1="val1" param2="1"]] more text ... + + * A named wikilink in a line, followed by text, and then another wikilink, + makes the text in between the links to be incorrectly highlighted. Example: + + \[[a link|alink]] text that appears incorrectly .... \[[link]] + diff --git a/doc/forum/installation_and_setup_questions.mdwn b/doc/forum/installation_and_setup_questions.mdwn new file mode 100644 index 000000000..50633ec3f --- /dev/null +++ b/doc/forum/installation_and_setup_questions.mdwn @@ -0,0 +1,52 @@ +[[!meta date="2007-03-02 00:57:08 +0000"]] + +Ikiwiki creates a .ikiwiki directory in my wikiwc working directory. Should I +"svn add .ikiwiki" or add it to svn:ignore? 
+ +> `.ikiwiki` is used by ikiwiki to store internal state. You can add it to +> svn:ignore. --[[Joey]] +> > Thanks a lot. + +Is there an easy way to log via e-mail to some webmaster address, instead +of via syslog? + +> Not sure why you'd want to do that, but couldn't you use a tool like +> logwatch to mail selected lines from the syslog? --[[Joey]] + +> > The reason is that I'm not logged in on the web server regularly to +> > check the log files. I'll see whether I can install a logwatch instance. + +I'm trying to install from scratch on a CentOS 4.6 system. I installed perl 5.8.8 from source and then added all the required modules via CPAN. When I build ikiwiki from the tarball, I get this message: + + rendering todo/calendar_--_archive_browsing_via_a_calendar_frontend.mdwn + *** glibc detected *** double free or corruption (!prev): 0x0922e478 *** + make: *** [extra_build] Aborted + +I'm kind of at a loss how to track this down or work around it. Any suggestions? --Monty + +> All I can tell you is that it looks like a problem with your C library or +> perl. Little perl programs like ikiwiki should only be able to trigger +> such bugs, not contain them. :-) Sorry I can't be of more help. +> --[[Joey]] + +> I had a similar problem after upgrading to the latest version of +> Text::Markdown from CPAN. You might try either looking for a Markdown +> package for CentOS or using the latest version of John Gruber's +> Markdown.pl: +> <http://daringfireball.net/projects/downloads/Markdown_1.0.2b8.tbz> +> --[[JasonBlevins]], April 1, 2008 18:22 EDT + +>> Unfortunately I couldn't find a CentOS package for markdown, and I +>> couldn't quite figure out how to use John Gruber's version instead. +>> I tried copying it to site_perl, etc., but the build doesn't pick +>> it up. For now I can just play with it on my Ubuntu laptop for which +>> the debian package installed flawlessly. I'll probably wait for an +>> updated version of Markdown to see if this is fixed in the future. +>> --Monty + +>I suggest that you pull an older version of Text::Markdown from CPAN. I am using <http://backpan.perl.org/authors/id/B/BO/BOBTFISH/Text-Markdown-1.0.5.tar.gz> and that works just fine. +>There is a step change in version and size between this version (dated 11Jan2008) and the next version (1.0.12 dated 18Feb2008). I shall have a little look to see why, in due course. +>Ubuntu Hardy Heron has a debian package now, but that does not work either. +> --Dirk 22Apr2008 + +> This might be related to [Text::Markdown bug #37297](http://rt.cpan.org/Public/Bug/Display.html?id=37297).--ChapmanFlack 9Jul2008 diff --git a/doc/forum/installation_as_non-root_user.mdwn b/doc/forum/installation_as_non-root_user.mdwn new file mode 100644 index 000000000..4997af2ac --- /dev/null +++ b/doc/forum/installation_as_non-root_user.mdwn @@ -0,0 +1,7 @@ +I'd like to install ikiwiki as a non-root user. I can plow through getting all the +perl dependencies installed because that's well documented in the perl world, +but I don't know how to tell ikiwiki to install somewhere other than / --BrianWilson + +> Checkout the tips section for [[tips/DreamHost]]. It should do the trick. 
--MattReynolds + +[[!meta date="2008-01-13 16:02:52 -0500"]] diff --git a/doc/forum/installation_of_selected_docs.mdwn b/doc/forum/installation_of_selected_docs.mdwn new file mode 100644 index 000000000..81dd1ee00 --- /dev/null +++ b/doc/forum/installation_of_selected_docs.mdwn @@ -0,0 +1,29 @@ +[[!meta date="2007-09-06 19:47:23 +0000"]] + +# Installation of selected docs (html) + +The latest release has around 560 files (over 2MB) in html. + +Any suggestions or ideas on limiting what html is installed? + +For example, I don't see value in every ikiwiki install out there to also install personal "users" ikiwiki pages. + +For now I copy ikiwiki.setup. And then use pax with -L switch to copy the targets of the symlinks of the basewiki. + +I was thinking of making a list of desired documents from the html directory to install. + +--JeremyReed + +> You don't need any of them, unless you want to read ikiwiki's docs locally. +> +> I don't understand why you're installing the basewiki files manually; +> ikiwiki has a Makefile that will do this for you. --[[Joey]] + +>> The Makefile's install doesn't do what I want so I use different installer for it. +>> It assumes wrong location for man pages for me. (And it should consider using INSTALLVENDORMAN1DIR and +>> MAN1EXT but I don't know about section 8 since I don't know of perl value for that.) +>> I don't want w3m cgi installed; it is optional for my package. +>> I will just patch for that instead of using my own installer. +>> Note: I am working on the pkgsrc package build specification for this. This is for creating +>> packages for NetBSD, DragonFly and other systems that use pkgsrc package system. +>> --JeremyReed diff --git a/doc/forum/link_autocompletion_in_vim.mdwn b/doc/forum/link_autocompletion_in_vim.mdwn new file mode 100644 index 000000000..7d3ed8b02 --- /dev/null +++ b/doc/forum/link_autocompletion_in_vim.mdwn @@ -0,0 +1,17 @@ +I extended the functionality of the [ikiwiki-nav plugin](http://www.vim.org/scripts/script.php?script_id=2968) +(see [[here|tips/follow_wikilinks_from_inside_vim]]) to allow completion of +wikilinks from inside vim, through the omnicompletion mechanism. + +It still has some bugs, but is usable, and will not destroy your data. It can +only complete links whose definition (text) is on a single line, and still can't +handle "named links" (`\[\[text|link\]\]`). + +I'd love to hear suggestions for improvement for it, and bug reports ;) For +example, regarding how are sorted and presented the available completions +(dates, alphabetically, etc). + +You can find a tarball for it +[here](http://devnull.li/~jerojasro/ikiwiki-nav-dev.tar.gz). To install it, +extract the tarball contents in your `.vim` directory. + +--[[jerojasro]] diff --git a/doc/forum/link_to_an_image_inside_the_wiki_without_inlining_it.mdwn b/doc/forum/link_to_an_image_inside_the_wiki_without_inlining_it.mdwn new file mode 100644 index 000000000..e92cc1b1c --- /dev/null +++ b/doc/forum/link_to_an_image_inside_the_wiki_without_inlining_it.mdwn @@ -0,0 +1,69 @@ +how can I create a link to an image which is part of the wiki, without having it inserted in my page? + +I tought this: + + \[[look at this|img/lolcat.png]] + +would work, but it doesn't. + +Any hints? --[[jerojasro]] + +> Well, currently the syntax above will display the image +> inline with the specified link text used as an alt attribute. Although +> that does not seem to be documented anywhere. 
+> +> A few places that use that (found with `git grep '\[\[' | egrep 'png|gif|jpeg|jpg' |grep \|`): +> +> * [[logo]] uses it to provide useful alt texts for the logos. (This +> could easily be changed to use [[ikiwiki/directive/img]] though.) +> * The `change.tmpl` template uses it to display +> the [[diff|wikiicons/diff.png]] with a very useful "diff" alt text. +> Using [[ikiwiki/directive/img]] here would mean that the +> [[plugins/recentchanges]] plugin would depend upon the img +> plugin. +> +> I do like your suggestion; it makes more sense than the current behavior. +> I'm not sure the transition pain to get from here to there is worth it, +> though. +> +> More broadly, if I were writing ikiwiki now, I might choose to leave out the +> auto-inlining of images altogether. In practice, it has added a certain level +> of complexity to ikiwiki, with numerous plugins needing to specify +> `noimageinline` to avoid accidentally inlining an image. And there has not +> been a lot of payoff from having the auto-inlining feature implicitly +> available in most places. And the img directive allows much needed control over +> display, so it would be better for users to not have to worry about its +> lesser cousin. But the transition from here to *there* would be another order +> of pain. +> +> Anyway, the cheap and simple answer to your question is to use html +> or markdown instead of a [[ikiwiki/wikilink]]. Ie, +> `[look at this](img/lolcat.jpg)`. --[[Joey]] + +> > thanks a lot, that's quite a straightforward solution. I actually wrote a +> > broken plugin to do that, and now I can ditch it --[[jerojasro]] + +>>> The plugin approach is not a bad idea if you want either the ability +>>> to: +>>> +>>> * Have things that are wikilink-aware (like [[plugins/brokenlinks]]) +>>> treat your link to the image as a wikilink. +>>> * Use standard wikilink path stuff (and not have to worry about +>>> a relative html link breaking if the page it's on is inlined, for +>>> example). +>>> +>>> I can help you bang that plugin into shape if need be. --[[Joey]] + +>>>> both my plugin and your suggestion yield broken html links when inlining the page (although probably that's what is expected from your suggestion (`[]()`)) +>>>> +>>>> I thought using the `bestlink` function would take care of that, but alas, it doesn't. +>>>> Get the "plugin" [here](http://devnull.li/~jerojasro/files/linktoimgonly.pm), see the broken +>>>> links generated [here](http://devnull.li/~jerojasro/blog/posts/job_offers/) and the source +>>>> file for that page [here](http://git.devnull.li/cgi-bin/gitweb.cgi?p=blog-jerojasro.git;a=blob;f=posts/job_offers.mdwn;hb=HEAD) --[[jerojasro]] + +>>>>> Use this --[[Joey]] + + return htmllink($params{page}, $params{destpage}, $params{"img"}, + linktext => $params{text}, + noimageinline => 1); + diff --git a/doc/forum/missing_pages_redirected_to_search-SOLVED.mdwn b/doc/forum/missing_pages_redirected_to_search-SOLVED.mdwn new file mode 100644 index 000000000..3af83396c --- /dev/null +++ b/doc/forum/missing_pages_redirected_to_search-SOLVED.mdwn @@ -0,0 +1,36 @@ +Is it possible to have any missing pages (404s) redirected to the search (omega)? +So if someone comes to my site with http://example.com/foo_was_here it would result in 'foo_was_here' being passed as a search parameter to omega? --[Mick](http://www.lunix.com.au) + +## DONE + +I use nginx instead of apache.
+Just add the following to the `server` block outside of any location block in nginx.conf +You must also make sure you have setup and enabled the search plugin(omega) + + error_page 404 /ikiwiki.cgi?P=$uri; + + +My full nginx.conf + + server { + listen [::]:80; #IPv6 capable + server_name www.lunix.com.au; + access_log /var/log/nginx/www.lunix.com.au-access.log main; + error_log /var/log/nginx/www.lunix.com.au-error.log warn; + error_page 404 /ikiwiki.cgi?P=$uri; + + location / { + root /home/lunix/public_html/lunix; + index index.html index.htm; + } + + location ~ ikiwiki\.cgi$ { + root /home/lunix/public_html/lunix; + include /etc/nginx/fastcgi_params.cgi; + + fastcgi_pass 127.0.0.1:9999; + fastcgi_param SCRIPT_FILENAME /home/lunix/public_html/lunix$fastcgi_script_name; # same path as above + } + } + + diff --git a/doc/forum/multi-user_setup_of_ikiwiki__44___gitosis_and_apache2_in_Debian_Sid.mdwn b/doc/forum/multi-user_setup_of_ikiwiki__44___gitosis_and_apache2_in_Debian_Sid.mdwn index fa8b5010e..b8e28e0a3 100644 --- a/doc/forum/multi-user_setup_of_ikiwiki__44___gitosis_and_apache2_in_Debian_Sid.mdwn +++ b/doc/forum/multi-user_setup_of_ikiwiki__44___gitosis_and_apache2_in_Debian_Sid.mdwn @@ -65,3 +65,32 @@ Could you please enlighten me. It should be possible seeing for example this sit Thanks in advance, --[[PaulePanter]] + +--- + +## Current Working Notes + +I've spent a little time getting this working and I wanted to share my notes on this, for those that come after me. + +### My Setup + +- Debian Lenny + +- Gitosis (hand compiled, for no good reason, but it's the same version as in the repository) + +- Ikiwiki 3.12 installed using packages from Sid + +### What Works + +- Everything needs to be chowned git:git (or gitosis:gitosis) by whatever gitosis runs with. This includes: + - the bare repository (as always) + - the srcdir + - the destdir + +- Ikiwiki needs to run in gitosis' group (eg. git in my case, but probably gitosis in yours) + +- ikiwiki.cgi needs be set with the wrapper mode 6755. + +-- [[tychoish]] + + diff --git a/doc/forum/multi_domain_setup_possible__63__.mdwn b/doc/forum/multi_domain_setup_possible__63__.mdwn new file mode 100644 index 000000000..01b31aafb --- /dev/null +++ b/doc/forum/multi_domain_setup_possible__63__.mdwn @@ -0,0 +1,15 @@ +Hi! I am searching for a replacement of my blog and webpages made off static HTML with just some custom PHP around it for years already. ikiwiki seems to be one of the hot candidates, since it uses a RCS. + +I would like to have a multi domain setup like this: + +- myname.private.de => more of a personal page +- professional.de => more of my professional work related page +- and possibly others + +Now when I write a blog entry about some Linux, Debian or KDE stuff, I possibly would like to have it shown on my private and my professional domain. + +And I might like to use some kind of inter wiki links now and then. + +Is such a setup possible? I thought about have a big wiki with Apache serving sub directories from it under different domains, but then wiki links like would not work. + +Maybe having the same blog entry, same content on several domains is not such a hot idea, but as long as I do not see a problem with it, I'd like to do it. 
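
> (A partial note on the inter-wiki links part, sketched here only as a possibility: the [[plugins/shortcut]] plugin lets you define link shortcuts on the wiki's shortcuts page, for example

    \[[!shortcut name=private url="http://myname.private.de/%s"]]

> after which `\[[!private some/page]]` expands into a link to that page on the other domain. The domain is just the example one from above; adjust to taste.)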
diff --git a/doc/forum/multi_domain_setup_possible__63__/comment_1_43f5df30d09046ccc4f7c44703979a11._comment b/doc/forum/multi_domain_setup_possible__63__/comment_1_43f5df30d09046ccc4f7c44703979a11._comment new file mode 100644 index 000000000..5b00272b3 --- /dev/null +++ b/doc/forum/multi_domain_setup_possible__63__/comment_1_43f5df30d09046ccc4f7c44703979a11._comment @@ -0,0 +1,17 @@ +[[!comment format=mdwn + username="http://joey.kitenet.net/" + nickname="joey" + subject="branches" + date="2010-08-15T16:06:43Z" + content=""" +This is where the git backend (or bzr if you prefer) shines. Make a site, and then branch it to a second site, and put your personal type stuff on the branch. cherry-pick or merge changes from one branch to another. + +The possibility to do this kind of thing is why our recently launched Ikiwiki hosting service is called +[Branchable.com](http://branchable.com). It makes it easy to create branches of a Ikiwiki site hosted +there: <http://www.branchable.com/tips/branching_an_existing_site/> +(Merging between branches need manual git, for now.) + +BTW, for links between the branched wikis you can just use the [[plugins/shortcut]] plugin. + +--[[Joey]] +"""]] diff --git a/doc/forum/multi_domain_setup_possible__63__/comment_2_75d6581f81b71fb8acbe3561047ea759._comment b/doc/forum/multi_domain_setup_possible__63__/comment_2_75d6581f81b71fb8acbe3561047ea759._comment new file mode 100644 index 000000000..473f52f5c --- /dev/null +++ b/doc/forum/multi_domain_setup_possible__63__/comment_2_75d6581f81b71fb8acbe3561047ea759._comment @@ -0,0 +1,16 @@ +[[!comment format=mdwn + username="http://claimid.com/helios" + nickname="helios" + subject="branches" + date="2010-08-15T16:18:35Z" + content=""" +So I I just put a blog entry, which is just a file on both branches. Seems I have to learn cherry-picking and merging only some changes. + +Still I am duplicating files then and when I edit one file I have to think to also edit the other one or merge the change to it. I thought of a way to tag a blog entry on which site it should appear. And then I just have to edit one file and contents changes on all sites that share it. + +But then I possibly can do some master blog / shared content branch, so that shared content is only stored once. Then I need to find a way to automatically replicate the changes there to all sites it belongs too. But how do I store it. + +I also thought about just using symlinks for files. Can I have two sites in one repository and symlink shared files stuff around? I know bzr can version control symlinks. + +Hmmm, I think I better read more about branching, cherry-picking and merging before I proceed. I used bzr and git, but from the user interface side of things prefer bzr, which should be fast enough for this use case. +"""]] diff --git a/doc/forum/navigation_of_wiki_pages_on_local_filesystem_with_vim.mdwn b/doc/forum/navigation_of_wiki_pages_on_local_filesystem_with_vim.mdwn new file mode 100644 index 000000000..7bfcf3088 --- /dev/null +++ b/doc/forum/navigation_of_wiki_pages_on_local_filesystem_with_vim.mdwn @@ -0,0 +1,130 @@ +**UPDATE** I have created a [[page|tips/follow_wikilinks_from_inside_vim]] in +the tips section about the plugin, how to get it, install it and use it. Check +that out. --[[jerojasro]] + +I wrote a vim function to help me navigate the wiki when I'm editing it. It extends the 'gf' (goto file) functionality. Once installed, you place the cursor on a wiki page name and press 'gf' (without the quotes); if the file exists, it gets loaded. 
+ +This function takes into account the ikiwiki linking rules when deciding which file to go to. + +> 'gf' gets in the way when there are directories with the same name of a wiki page. The +> function below doesn't implement the linking rules properly (test the link (ignoring case), +> if there is no match ascend the dir. hierarchy and start over, until we reach the root of +> the wiki). I'm rewriting it to follow these rules properly +> +> I think the page for [[LinkingRules|ikiwiki/subpage/linkingrules]] should say that ikiwiki **ascends** +> the dir. hierarchy when looking for a wikilink, not that it **descends** it. Am I correct? --[[jerojasro]] + +>> Conventionally, the root directory is considered to be lower than other +>> directories, so I think the current wording is correct. --[[Joey]] + +let me know what you think + +> " NOTE: the root of the wiki is considered the first directory that contains a +> " .ikiwiki folder, except $HOME/.ikiwiki (the usual ikiwiki libdir) +> +> That's not going to work in all situations; for example, with an ikiwiki which uses git as the backend, the normal setup is that one has +> +> * a bare git repository +> * a git repository which ikiwiki builds the wiki from (which has a .ikiwiki directory in it) +> * an *additional* git repository cloned from the bare repository, which is used for making changes from the command-line rather than the web. It is this repository in which one would be editing files with vim, and *this* repository does not have a .ikiwiki directory in it. It does have a .git directory in the root, however, so I suppose you could use that as a method of detection of a root directory, but of course that would only work for git repositories. +> +> -- [[KathrynAndersen]] +> +>> You are completely right; all of my wikis are compiled both locally and +>> remotely, and so the local repo also has a `.ikiwiki` folder. And that's not the +>> "usual" setup. +>> +>> checking for a `.git` dir would not work when the wiki's source files aren't +>> located at the root of the repo. +>> +>> So, besides of doing a `touch .ikiwiki` at the root of the wiki in your local +>> repo, do you see any alternative? +>> +>> -- [[jerojasro]] + +well. I've rewritten the whole thing, to take into account: + + * file matching ignoring case (MyPage matches mypage.mdwn) + * checking all the way down (up) to the root of the wiki (if there is a link `\[[foo]]` on `a/b/page`), + try `a/b/page/foo`, then `a/b/foo`, and so on, up to `foo` + * the alternate name for a page: when looking for the file for `\[[foo]]`, try both `foo.mdwn` and `foo/index.mdwn` + +you can find the file [here](http://git.devnull.li/cgi-bin/gitweb.cgi?p=vim-jerojasro.git;a=blob;f=.vim/ftplugin/ikiwiki_nav.vim;hb=HEAD). To use it, place it in `$HOME/.vim/ftplugin`. After that, hitting `<CR>` (Enter) in normal mode over a wikilink will take you to that page, if it exists. + +the plugin has, as of now, two problems: + + * doesn't work with wikilinks that take more than one line (though this isn't really that bad) + * it assumes that the root of the wiki is the first directory down the filesystem hierarchy that + has a `.ikiwiki` folder on it. If your copy of the wiki doesn't have it, you must create it for + the plugin to work + +-- [[jerojasro]] + +> Interesting. I was at one point looking at "potwiki.vim", which implements a local wiki and follows CamelCase links, creating new files where necessary etc., to see if it could be adapted for ikiwiki (See [[tips/vim syntax highlighting/discussion]]). 
I didn't get anywhere. -- [[Jon]] + +>> when I wrote the plugin I also considered the possibility of creating files (and their dirs, if necessary) +>> from new wikilinks; the changes needed to get that working are fairly small -- [[jerojasro]] + +> Seems about ready for me to think about pulling it into ikiwiki +> alongside [[tips/vim_syntax_highlighting/ikiwiki.vim]]. If you'll +> please slap a license on it. :) --[[Joey]] +> +>> GPL version 2 or later (if that doesn't cause any problems here). I'll add it +>> to the file --[[jerojasro]] +>> +>>> I see you've put the plugin on vim.org. Do you think it makes sense to +>>> also include a copy in ikiwiki? --[[Joey]] +>>> +>>>> mmm, no. There would be two copies of it, and the git repo. I'd rather have +>>>> a unique place for the "official" version (vim.org), and another for the dev +>>>> version (its git repo). +>>>> +>>>> actually, I would also suggest to upload the [[`ikiwiki.vim`|tips/vim_syntax_highlighting]] file to vim.org --[[jerojasro]] +>>>>> +>>>>> If you have any interest in maintaining the syntax highlighting +>>>>> plugin and putting it there, I'd be fine with that. I think it needs +>>>>> some slight work to catch up with changes to ikiwiki's directives +>>>>> (!-prefixed now), and wikilinks (able to have spaces now). --[[Joey]] + +<a id='syn-maintenance'> + +>>>>> +>>>>>> I don't really know too much about syntax definitions in vim. But I'll give it a stab. I know it fails when there are 2 \[[my text|link]] wikilinks in the same page. +>>>>>> I'm not promising anything, though ;) --[[jerojasro]] +> +> Also, I have a possible other approach for finding ikiwiki's root. One +> could consider that any subdirectory of an ikiwiki wiki is itself +> a standalone wiki, though probably one missing a toplevel index page. +> The relative wikilinks work such that this assumption makes sense; +> you can build any subdirectory with ikiwiki and probably get something +> reasonable with links that work, etc. +> +> So, if that's the case, then one could say that the directory that the +> user considers to be the toplevel of their wiki is really also a subwiki, +> enclosed in a succession of parents that go all the way down to the root +> directory (or alternatively, to the user's home directory). I think that +> logically makes some sense. +> +> And if that's the case, you can resolve an absolute link by looking for +> the page closest to the root that matches the link. +> +>> I like your idea; it doesn't alter the matching of the relative links, and +>> should work fine with absolute links too. I'll implement it, though I see +>> some potential (but small) issues with it --[[jerojasro]] +> +> It may even make sense to change ikiwiki's own handling of "absolute" +> links to work that way. But even without changing ikiwiki, I think it +> would be a reasonable thing for vim to do. It would only fail in two +> unusual circumstances: +> +> 1. There is a file further down, outside what the user considers +> the wiki, that matches. Say a `$HOME/index.mdwn` +> 2. An absolute link is broken in that the page linked to does +> not exist in the root of the wiki. But it does exist in a subdir, +> and vim would go to that file. +> +> --[[Joey]] +> +>> your approach will add more noise when the plugin grows the page-creation +>> feature, since there will be no real root to limit the possible locations for +>> the new page. 
But it is far better than demanding for a `.ikiwiki` dir --[[jerojasro]] diff --git a/doc/forum/recentchanges_dir_should_be_under_control_of_RCS__63__.mdwn b/doc/forum/recentchanges_dir_should_be_under_control_of_RCS__63__.mdwn new file mode 100644 index 000000000..2fe97366b --- /dev/null +++ b/doc/forum/recentchanges_dir_should_be_under_control_of_RCS__63__.mdwn @@ -0,0 +1,105 @@ +Hello Joey, + +I noticed that my Ikiwiki started to rebuild pages very slowly after my last changes +when I upgraded Ikiwiki to version 3.20100623. Now I have the latest release 3.20100704, +but it doesn't help me. + +I started to debug the problem and I found that I can see a lot of messages +like below when I try to rebuild my wiki manually: + + svn: '/path/to/ikiwiki/trunk/pages/ostatnie_zmiany' is not a working copy + svn: Can't open file '/path/to/ikiwiki/trunk/pages/ostatnie_zmiany/.svn/entries': No such file or directory + svn log exited 256 + +"ostatnie_zmiany" is a value of `recentchangespage` parameter in my +`ikiwiki.setup` file. It is not under control Subversion I use for Ikiwiki: + + $ svn status pages/ostatnie_zmiany + ? pages/ostatnie_zmiany + + $ ls pages/ostatnie_zmiany/*._change |wc -l + 100 + +`recentchangesnum` parameter has value 100 for me and I noticed that my Ikiwiki +takes a lot of time to parse all `._change` files. Finally it doesn't refresh +/ostatnie_zmiany.html page. + +Do you think I should add `ostatnie_zmiany` directory under control of my +Subversion repo? If it's not necessary, could you please give me any hint +to find a reason of problem with my Ikiwiki? + +My best regards, +Pawel + +> No, the recentchanges pages are automatically generated and should not +> themselves be in revision control. +> +> Ikiwiki has recently started automatically enabing `--gettime`, but +> it should not do it every time, but only on the initial build +> of a wiki. It will print "querying svn for file creation and modification +> times.." when it does this. If it's doing it every time, something +> is wrong. (Specifically, `.ikiwiki/indexdb` must be missing somehow.) +> +> The support for svn with --gettime is rather poor. (While with git it is +> quite fast.) But as it's only supposed to happen on the first build, +> I haven't tried to speed it up. It would be hard to do it fast with svn. +> It would be possible to avoid the warning message above, or even skip +> processing files in directories not checked into svn -- but I'd much +> rather understand why you are seeing this happen on repeated builds. +> --[[Joey]] + +>> Thanks a lot for your reply! I've just checked my `rebuild-pages.sh` +>> script and discovered that it contains +>> `/usr/bin/ikiwiki --setup ikiwiki.setup --gettime` command... :D +>> The warnings disappeared when I removed `--gettime` parameter. +>> Sorry for confusing! :) +>> +>> I have `.ikiwiki/indexdb` file here, but I noticed that it has been +>> modified about 1 minute **after** last Subversion commit: +>> +>> $ LANG=C svn up +>> At revision 5951. 
+>> +>> $ LANG=C svn log -r 5951 +>> ------------------------------------------------------------------------ +>> r5951 | svn | 2010-07-06 09:02:30 +0200 (Tue, 06 Jul 2010) | 1 line +>> +>> web commit by xahil +>> ------------------------------------------------------------------------ +>> +>> $ LANG=C stat pages/.ikiwiki/indexdb +>> File: `pages/.ikiwiki/indexdb' +>> Size: 184520 Blocks: 368 IO Block: 131072 regular file +>> Device: 2bh/43d Inode: 1931145 Links: 1 +>> Access: (0644/-rw-r--r--) Uid: ( 1005/ svn) Gid: ( 1005/ svn) +>> Access: 2010-07-06 12:06:24.000000000 +0200 +>> Modify: 2010-07-06 09:03:38.000000000 +0200 +>> Change: 2010-07-06 09:03:38.000000000 +0200 +>> +>> I believe it's the time I have to wait to see that my wiki page has been rebuilt. +>> Do you have any idea how to find a reason of that delay? --[[Paweł|ptecza]] + +>>> Well, I hope that your svn post-commit hook is not running your +>>> `rebuild-pages.sh`. That script rebuilds everything, rather than just +>>> refreshing what's been changed. +>>> +>>> Using subversion is not asking for speed. Especially if your svn +>>> repository is on a remote host. You might try disabling +>>> recentchanges and see if that speeds up the refreshes (it will avoid +>>> one `svn log`). +>>> +>>> Otherwise, take a look at [[tips/optimising_ikiwiki]] +>>> for some advice on things that can make ikiwiki run slowly. --[[Joey]] + +>>>> Thanks for the hints! I don't understand it, but it seems that refreshing +>>>> all pages has resolved the problem and now my wiki works well again :) +>>>> +>>>> No, I use `rebuild-pages.sh` script only when I want to rebuild +>>>> my wiki manually, for example when you release new Ikiwiki version +>>>> then I need to update my templates. Some of them have been translated +>>>> to Polish by me. +>>>> +>>>> Fortunately my wiki and its Subversion repo are located on the same host. +>>>> We have a lot of Subversion repos for our projects and I don't want to +>>>> change only wiki repo for better performance. I'm rather satisfied with +>>>> its speed. --[[Paweł|ptecza]] diff --git a/doc/forum/remove_css__63__.mdwn b/doc/forum/remove_css__63__.mdwn new file mode 100644 index 000000000..80da621d6 --- /dev/null +++ b/doc/forum/remove_css__63__.mdwn @@ -0,0 +1,5 @@ +I removed a local.css file and pushed the changes to git but the 'compiled' wiki still shows the same css. +Is this a bug or you are supposed to remove the css by hand? +ikiwiki version 3.20100705 + +> It's a [[bug|bugs/underlaydir_file_expose]]. --[[Joey]] diff --git a/doc/forum/report_pagination.mdwn b/doc/forum/report_pagination.mdwn new file mode 100644 index 000000000..d609fd502 --- /dev/null +++ b/doc/forum/report_pagination.mdwn @@ -0,0 +1,11 @@ +I am thinking of adding pagination to the [[plugins/contrib/report]] plugin, but I'm not sure which is the best approach to take. (By "pagination" I mean breaking up a report into multiple pages with N entries per page.) + +Approaches: + +1. generate additional HTML files on the fly which are placed in the sub-directory for the page the report is on. These are not "pages", they are not under revision control, they aren't in the %pagesources hash etc. But using the `will_render` mechanism assures that they will be removed when they are no longer needed. + +2. create new pages which each have a report directive which shows a subset of the full result; add them to revision control, treat them as full pages. 
Problems with this are: (a) trying to figure out when to create these new pages and when not to, (b) whether or not these pages can be deleted automatically. + +3. some other approach I haven't thought of. + +I'm afraid that whatever approach I take, it will end up being a kludge. diff --git a/doc/forum/speeding_up_ikiwiki.mdwn b/doc/forum/speeding_up_ikiwiki.mdwn index 0b2164238..799186cf8 100644 --- a/doc/forum/speeding_up_ikiwiki.mdwn +++ b/doc/forum/speeding_up_ikiwiki.mdwn @@ -56,7 +56,7 @@ number is still too large to really visualize: the graphviz PNG and PDF output engines segfault for me, the PS one works but I can't get any PS software to render it without exploding. -Now, the relations in the links hash are not the same thing as IkiWiki's notion of dependencies. Can anyone point me at that data structure / where I might be able to add some debugging foo to generate a graph of it? +Now, the relations in the links hash are not the same thing as Ikiwiki's notion of dependencies. Can anyone point me at that data structure / where I might be able to add some debugging foo to generate a graph of it? Once I've figured out that I might be able to optimize some pagespecs. I understand pagespecs are essentially translated into sequential perl code. I @@ -77,15 +77,14 @@ wrapper. > Dependencies go in the `%IkiWiki::depends` hash, which is not exported. It > can also be dumped out as part of the wiki state - see [[tips/inside_dot_ikiwiki]]. > -> It's a map from page name to increasingly complex pagespec, although -> the `optimize-depends` branch in my git repository changes that to a -> map from a page name to a *list* of pagespecs which are automatically -> or'd together for use (this at least means duplicates can be weeded out). -> -> See [[todo/should_optimise_pagespecs]] for more on that. +> Nowadays, it's a hash of pagespecs, and there +> is also a `IkiWiki::depends_simple` hash of simple page names. > > I've been hoping to speed up IkiWiki too - making a lot of photo albums > with my [[plugins/contrib/album]] plugin makes it pretty slow. > > One thing that I found was a big improvement was to use `quick=yes` on all > my `archive=yes` [[ikiwiki/directive/inline]]s. --[[smcv]] + +> Take a look at [[tips/optimising_ikiwiki]] for lots of helpful advice. +> --[[Joey]] diff --git a/doc/forum/square_brackets_inside_backticks_generates_incorrect_html___40__interpreted_as_wikilinks__41____63__.html b/doc/forum/square_brackets_inside_backticks_generates_incorrect_html___40__interpreted_as_wikilinks__41____63__.html new file mode 100644 index 000000000..fe20a05b1 --- /dev/null +++ b/doc/forum/square_brackets_inside_backticks_generates_incorrect_html___40__interpreted_as_wikilinks__41____63__.html @@ -0,0 +1,19 @@ +the following markdown code +`foo \[["bar"]]` generates the following output: + +foo <span class="createlink"><a href="http://euler/~dabd/wiki/ikiwiki.cgi?page=__34__bar__34__&from=foo&do=create" rel="nofollow">?</a>"bar"</span> + +Perhaps this is a bug in the markdown processor? + +<blockquote> +This has nothing to do with markdown; wikilinks and directives +are not part of markdown, and just get expanded to html before +markdown processing. + +There is a bug open about this: +[[bugs/wiki_links_still_processed_inside_code_blocks]] + +Note that escaping the wikilink with a slash will avoid it being expanded +to html. 
+--[[Joey]] +</blockquote> diff --git a/doc/forum/suppressing_output_of_pages_included_only_for_their_side_effects.mdwn b/doc/forum/suppressing_output_of_pages_included_only_for_their_side_effects.mdwn new file mode 100644 index 000000000..99784cdd2 --- /dev/null +++ b/doc/forum/suppressing_output_of_pages_included_only_for_their_side_effects.mdwn @@ -0,0 +1,16 @@ +In particular, it's kind of annoying that using the sidebar plugin results in the creation of a free-standing sidebar.html (which in the simplest case of course includes a copy of *its own content* as a sidebar). It would be nice if there were a way to tell Ikiwiki to treat a file like sidebar.mdwn as "inline only": allow its content to be inlined but not to render it separately nor allow linking to it. + +In reading through the code and associated docs, it appears that the ideal method is for the file to be removed from the $changed array by a plugin's "needsbuild" hook. Either the sidebar plugin could define such a hook, or perhaps a more general solution is the creation of a meta variable or config file regexp that would handle it according to the user's wishes. + +I'm about ready to code up such a change but want to find out if I'm thinking along the right lines. --[[blipvert]] + +> Internal pages should be able to be used for this, as they are used for +> comments. So you'd have +> `sidebar._mdwn`. However, mdwn would need to be changed to register an +> htmlize hook for the `_mdwn` extension for that to really work. +> +> But, if there's no rendered sidebar page, how can users easily edit the page +> in the web interface? In the specific case of the sidebar, it seems +> better to have the page display something different when built standalone +> than when built as the sidebar. +> --[[Joey]] diff --git a/doc/forum/tag_plugin:_rebuilding_autocreated_pages.mdwn b/doc/forum/tag_plugin:_rebuilding_autocreated_pages.mdwn new file mode 100644 index 000000000..114837031 --- /dev/null +++ b/doc/forum/tag_plugin:_rebuilding_autocreated_pages.mdwn @@ -0,0 +1,11 @@ +I have a bunch of tag pages autogenerated by the tag plugin. As part of a redesign for my wiki, I have changed the `autotag.tmpl` template, but IkiWiki refuses to rebuild existing tag pages using the updated template. I understand that existing tag pages are not rebuilt because they have been marked as "created" in `.ikiwiki/indexdb`. Is there a way to purge all tag pages from `indexdb`? --dkobozev + +> Well, you can delete the indexdb and rebuild, and that will do it. +> The tag plugin is careful not to replace existing pages or even recreate +> tag pages if you delete them, which does cause a problem if you need to +> update them. --[[Joey]] + +>> Thanks. I thought about deleting `indexdb`, but hesitated to do that. According to [[tips/inside dot ikiwiki]], `indexdb` stores "all persistent state about pages". I didn't know if it's harmless to lose all persistent state. --dkobozev + +>>> The persistent state is best thought of as a cache, +>>> so it's safe to delete it. --[[Joey]] diff --git a/doc/forum/transition_from_handwritten_html_to_ikiwiki.mdwn b/doc/forum/transition_from_handwritten_html_to_ikiwiki.mdwn new file mode 100644 index 000000000..a8d04a0ad --- /dev/null +++ b/doc/forum/transition_from_handwritten_html_to_ikiwiki.mdwn @@ -0,0 +1,90 @@ +I'm trying to convert a hand-written html site to ikiwiki and maintain url compatibility.
html plugin with indexpages=1 converts all dir_name/index.html correctly to dir_name urls with wiki/css based content, but somedir/somefile.html files are only accessible as somedir/somefile/. Non .html files seem to accessible with their full paths, for example somedir/pic.jpg from hand written html can be accessed by same path under ikiwiki. + +How to make somedir/somefile.html accessible as somedir/somefile.html under ikiwiki? + +Thanks, + +-Mikko + +> Hello! The options you need to investigate are `--usedirs` and +> `--no-usedirs`. The default `--usedirs` takes any source page foo +> (regardless of its format, be it markdown or html) and converts it into a +> destination page foo/index.html (URL foo/). By comparison, `--no-usedirs` +> maps the source file onto a destination file directly: src/foo.html becomes +> dest/foo.html, src/bar.mdwn becomes dest/bar.html, etc. +> +> It sounds like you want `--no-usedirs`, or the corresponding `usedirs => 0,` +> option in your setup file. See [[usage]] for more information. -- [[Jon]] + +Thanks, usedirs seems to be just the thing I need. + +-Mikko + +Actually usedirs didn't do exactly what I want. The old site contains both +somedir/index.html and somedir/somename.html files. With html plugin and +indexpages=1 the somedir/index.html pages are accessed correctly but +somedir/somefile.html files not. + +With usedirs => 0, somedir/somename.html pages are accessed correctly but +somedir/index.html pages are not. Actually the handwritten somedir/index.html +files were removed on a rebuild: + + $ ikiwiki -setup blog.setup -rebuild -v + ... + removing test2/index.html, no longer built by test2 + +Is there a way for both index.html and somename.html raw html files to show up through ikiwki? + +-Mikko + +> I think you want usedirs => 0 and indexpages => 0? +> +> What IkiWiki does is to map the source filename to an abstract page name +> (indexpages alters how this is done), then map the abstract page name +> to an output filename (usedirs alters how this is done). +> +> The three columns here are input, abstract page name, output: +> +> usedirs => 0, indexpages => 0: +> a/index.html -> a/index -> a/index.html +> a/b.html -> a/b -> a/b.html +> usedirs => 1, indexpages => 0: +> a/index.html -> a/index -> a/index/index.html +> a/b.html -> a/b -> a/b/index.html +> usedirs => 0, indexpages => 1: +> a/index.html -> a -> a.html +> a/b.html -> a/b -> a/b.html +> usedirs => 1, indexpages => 1: +> a/index.html -> a -> a/index.html +> a/b.html -> a/b -> a/b/index.html +> +> The abstract page name is what you use in wikilinks and pagespecs. +> +> What I would suggest you do instead, though, is break your URLs once +> (but put in Apache redirections), to get everything to be consistent; +> I strongly recommend usedirs => 1 and indexpages => 0, then always +> advertising URLs that look like <http://www.example.com/a/b/>. This is +> what ikiwiki.info itself does, for instance. --[[smcv]] + +Thanks for the explanation. usedirs => 0 and indexpages => 0 does the trick, +but I'll try to setup mod_rewrite from foo/bar.html to foo/bar in the final +conversion. + +-Mikko + +> That's roughly what I do, but you can do it with `Redirect` and `RedirectMatch` from `mod_alias`, rather than fire up rewrite. Mind you I don't write a generic rule, I have a finite set of pages to redirect which I know. -- [[Jon]] + +I'm getting closer. Now with usedirs => 1 and raw html pages, ikiwiki transforms foo/index.html to foo/index/index.html. 
+Can ikiwiki be instructed map foo/index.html to page foo instead that foo/index? + +-Mikko + +> If you don't already have a foo.html in your source, why not just rename foo/index.html to foo.html? With usedirs, it will then map to foo/index.html. Before, you had 'foo/' and 'foo/index.html' as working URLS, and they will work after too. +> +> If you did have a foo.html and a foo/index.html, hmm, that's a tricky one. -- [[Jon]] + +> We may be going round in circles - that's what indexpages => 1 does :-) +> See the table I constructed above, which explains the mapping from input +> files to abstract page names, and then the mapping from abstract page +> names to output files. (I personally think that moving your source pages +> around like Jon suggested is a better solution, though. --[[smcv]] diff --git a/doc/forum/understanding_filter_hooks.mdwn b/doc/forum/understanding_filter_hooks.mdwn index 061d6d295..e6ddc91cc 100644 --- a/doc/forum/understanding_filter_hooks.mdwn +++ b/doc/forum/understanding_filter_hooks.mdwn @@ -7,3 +7,11 @@ but right now I have to have a look at the content, which I don't like so much. Is there a better hook to use for this? I need to transform the input before preprocessing. [[DavidBremner]] + +>You can check the type of the page without having to look at the content of the page: + + my $page_file=$pagesources{$page}; + my $page_type=pagetype($page_file); + +>Then you can check whether `$page_type` is "tex". +>--[[KathrynAndersen]] diff --git a/doc/forum/upgrade_steps.mdwn b/doc/forum/upgrade_steps.mdwn new file mode 100644 index 000000000..1c85e6402 --- /dev/null +++ b/doc/forum/upgrade_steps.mdwn @@ -0,0 +1,147 @@ +[[!meta date="2007-08-27 21:52:18 +0000"]] + +I upgrades from 1.40 to 2.6.1. I ran "ikiwiki --setup" using my existing ikiwiki.setup configuration. +I had many errors like: + + /home/bsdwiki/www/wiki/wikilink/index.html independently created, not overwriting with version from wikilink + BEGIN failed--compilation aborted at (eval 5) line 129. + +and: + + failed renaming /home/bsdwiki/www/wiki/smileys.ikiwiki-new to /home/bsdwiki/www/wiki/smileys: Is a directory + BEGIN failed--compilation aborted at (eval 5) line 129. + +Probably about six errors like this. I worked around this by removing the files and directories it complained about. +Finally it finished. + +> As of version 2.0, ikiwiki enables usedirs by default. See +> [[tips/switching_to_usedirs]] for details. --[[Joey]] + +>> I read the config wrong. I was thinking that it showed the defaults even though commented out +>> (like ssh configs do). I fixed that part. --JeremyReed + +My next problem was that ikiwiki start letting me edit without any password authentication. It used to prompt +me for a password but now just goes right into the "editing" mode. +The release notes for 2.0 say password auth is still on by default. + +> It sounds like you have the anonok plugin enabled? + +>> Where is the default documented? My config doesn't have it uncommented. + +The third problem is that when editing my textbox is empty -- no content. + +This is using my custom rcs.pm which has been used thousands of times. + +> Have you rebuilt the cgi wrapper since you upgraded ikiwiki? AFAIK I +> fixed a bug that could result in the edit box always being empty back in +> version 2.3. The only other way it could happen is if ikiwiki does not +> have saved state about the page that it's editing (in .ikiwiki/index). + +>> Rebuilt it several times. 
Now that I think of it, I think my early problem of having +>> no content in the textbox was before I rebuilt the cgi. And after I rebuilt the whole webpage was empty. + +Now I regenerated my ikiwiki.cgi again (no change to my configuration, +and I just get an empty HTML page when attempting editing or "create". + +> If the page is completly empty then ikiwiki is crashing before it can +> output anything, though this seems unlikely. Check the webserver logs. + +Now I see it created directories for my data. I fixed that by setting +usedirs (I see that is in the release notes for 2.0) and rerunning ikiwiki --setup +but I still have empty pages for editing (no textbox no html at all). + +> Is IkiWiki crashing? If so, it would probably leave error text in the apache logs. --[[TaylorKillian]] + +>> Not using apache. Nothing useful in logs other thn the HTTP return codes are "0" and bytes is "-" +>> on the empty ikiwiki.cgi output (should say " 200 " followed by bytes). + +>>> You need to either figure out what your web server does with stderr +>>> from cgi programs, or run ikiwiki.cgi at the command line with an +>>> appropriate environment so it thinks it's being called from a web +>>> server, so you can see how it's failing. --[[Joey]] + +(I am posting this now, but will do some research and post some more.) + +Is there any webpage with upgrade steps? + +> Users are expected to read [[news]], which points out any incompatible +> changes or cases where manual action is needed. + +>> I read it but read the usedirs option wrong :(. +>> Also it appears to be missing the news from between 1.40 to 2.0 unless they dont' exist. +>> If they do exist maybe they have release notes I need? + +>>> All the old ones are in the NEWS file. --[[Joey]] + +--JeremyReed + +My followup: I used a new ikiwiki.setup based on the latest version. But no changes for me. + +Also I forgot to mention that do=recentchanges works good for me. It uses my +rcs_recentchanges in my rcs perl module. + +The do=prefs does nothing though -- just a blank webpage. + +> You need to figure out why ikiwiki is crashing. The webserver logs should +> tell you. + +I also set verbose => 1 and running ikiwiki --setup was verbose, but no changes in running CGI. +I was hoping for some output. + +I am guessing that my rcs perl module stopped working on the upgrade. I didn't notice any release notes +on changes to revision control modules. Has something changed? I will also look. + +> No, the rcs interface has not needed to change in a long time. Also, +> nothing is done with the rcs for do=prefs. + +>> Thanks. I also checked differences between 1.40 Rcs plugins and didn't notice anything significant. + +--JeremyReed + +Another Followup: I created a new ikiwiki configuration and did the --setup to +create an entirely different website. I have same problem there. No prompt for password +and empty webpage when using the cgi. +I never upgraded any perl modules so maybe a new perl module is required but I don't see any errors so I don't know. + +The only errors I see when building and installing ikiwiki are: + + Can't exec "otl2html": No such file or directory at IkiWiki/Plugin/otl.pm line 66. + + gettext 0.14 too old, not updating the pot file + +I don't use GNU gettext on here. + +I may need to revert back to my old ikiwiki install which has been used to thousands of times (with around +1000 rcs commits via ikiwiki). + +--JeremyReed + +I downgraded to version 1.40 (that was what I had before I wrote wrong above). 
+Now ikiwiki is working for me again (but using 1.40). I shouldn't have tested on production system :) + +--JeremyReed + +I am back. On a different system, I installed ikiwiki 2.6.1. Same problem -- blank CGI webpage. + +So I manually ran with: + + REQUEST_METHOD=GET QUERY_STRING='do=create&page=jcr' kiwiki.cgi + +And clearly saw the error: + + [IkiWiki::main] Fatal: Bad template engine CGI::FormBuilder::Template::div: Can't locate CGI/FormBuilder/Template/div.pm + +So I found my version was too old and 3.05 is the first to provide "Div" support. I upgraded my p5-CGI-FormBuilder to 3.0501. +And ikiwiki CGI started working for me. + +The Ikiwiki docs about this requirement got removed in Revision 4367. There should be a page that lists the requirements. +(I guess I could have used the debian/control file.) + +> There is a page, [[install]] documents that 3.05 is needed. + +>> Sorry, I missed that. With hundreds of wikipages it is hard to read all of them. +>> I am updating the download page now to link to it. + +I am now using ikiwiki 2.6.1 on my testing system. + +--JeremyReed diff --git a/doc/forum/use_php-markdown-extra_with_ikiwiki__63__.mdwn b/doc/forum/use_php-markdown-extra_with_ikiwiki__63__.mdwn new file mode 100644 index 000000000..86ed70fd2 --- /dev/null +++ b/doc/forum/use_php-markdown-extra_with_ikiwiki__63__.mdwn @@ -0,0 +1,3 @@ +Is it possible to use php-markdown-extra with ikiwiki? + +Thanks. diff --git a/doc/forum/use_php-markdown-extra_with_ikiwiki__63__/comment_1_66d48218361caa4c07bd714b82ed0021._comment b/doc/forum/use_php-markdown-extra_with_ikiwiki__63__/comment_1_66d48218361caa4c07bd714b82ed0021._comment new file mode 100644 index 000000000..af60ecbdb --- /dev/null +++ b/doc/forum/use_php-markdown-extra_with_ikiwiki__63__/comment_1_66d48218361caa4c07bd714b82ed0021._comment @@ -0,0 +1,8 @@ +[[!comment format=mdwn + username="http://kerravonsen.dreamwidth.org/" + ip="60.241.8.244" + subject="PHP != Perl" + date="2010-07-10T12:44:15Z" + content=""" +Er, why? IkiWiki is written in Perl. Presumably php-markdown-extra is written in PHP, which is a completely different language. +"""]] diff --git a/doc/forum/use_php-markdown-extra_with_ikiwiki__63__/comment_2_f2ee0a4dce571d329f795e52139084c0._comment b/doc/forum/use_php-markdown-extra_with_ikiwiki__63__/comment_2_f2ee0a4dce571d329f795e52139084c0._comment new file mode 100644 index 000000000..ce60f4b3a --- /dev/null +++ b/doc/forum/use_php-markdown-extra_with_ikiwiki__63__/comment_2_f2ee0a4dce571d329f795e52139084c0._comment @@ -0,0 +1,8 @@ +[[!comment format=mdwn + username="https://www.google.com/accounts/o8/id?id=AItOawlzADDUvepOXauF4Aq1VZ4rJaW_Dwrl6xE" + nickname="Dário" + subject="comment 2" + date="2010-07-10T21:48:13Z" + content=""" +Because php-markdown-extra extends the basic markdown language (footnotes, etc.) 
+"""]] diff --git a/doc/forum/use_php-markdown-extra_with_ikiwiki__63__/comment_3_e388714f457ccb6ef73630179914558c._comment b/doc/forum/use_php-markdown-extra_with_ikiwiki__63__/comment_3_e388714f457ccb6ef73630179914558c._comment new file mode 100644 index 000000000..72ce7bb6f --- /dev/null +++ b/doc/forum/use_php-markdown-extra_with_ikiwiki__63__/comment_3_e388714f457ccb6ef73630179914558c._comment @@ -0,0 +1,9 @@ +[[!comment format=mdwn + username="http://kerravonsen.dreamwidth.org/" + ip="202.173.183.92" + subject="I still don't get it" + date="2010-07-11T07:18:35Z" + content=""" +But if you need the \"extra\" features of Markdown, all you have to do is turn on the \"multimarkdown\" option in your configuration. It makes no sense to try to use PHP with Perl. + +"""]] diff --git a/doc/forum/utf8_warnings_for___34____92__xAB__34__.mdwn b/doc/forum/utf8_warnings_for___34____92__xAB__34__.mdwn new file mode 100644 index 000000000..72f2d38e0 --- /dev/null +++ b/doc/forum/utf8_warnings_for___34____92__xAB__34__.mdwn @@ -0,0 +1,47 @@ +# getting Warnings about UTF8-Chars. + +I'm getting multiple warnings: + + utf8 "\xAB" does not map to Unicode at /usr/share/perl5/IkiWiki.pm line 774, <$in> chunk 1. + + +I'm assuming this is once per File, but even in verbose mode, it doesn't tell me which file is a problem. +It first reads all the files, and afterwards when parsing/compiling them, it outputs the warning, so I can't +deduce the offending files. + +Is there a way to have ikiwiki output the position, where it encounters the character? + +Probably all this has to do with locale-settings, and usage of mixed locales in a distributed setup ... +I'd rather cleanup some of the file(name)s of unexpected characters. --[[jwalzer]] + +-------- + +**Update** : So I took the chance to insert debug into ikiwiki.pm: + + root@novalis:/usr/share/perl5# diff -p /tmp/IkiWiki.orig.pm IkiWiki.pm + *** /tmp/IkiWiki.orig.pm Sun Feb 14 15:16:08 2010 + --- IkiWiki.pm Sun Feb 14 15:16:28 2010 + *************** sub readfile ($;$$) { + *** 768,773 **** + --- 768,774 ---- + } + + local $/=undef; + + debug("opening File: $file:"); + open (my $in, "<", $file) || error("failed to read $file: $!"); + binmode($in) if ($binary); + return \*$in if $wantfd; + + +But what I see now is not quite helpful, as it seems, STDERR and DEBUG are asyncronous, so they mix up in a way, that I can't really see, whats the problem ... Maybe I'm better off for troubleshooting, to insert an printf to strerr to have it in the same stream.. --[[jwalzer]] + + +---- + +**Update:** The "print STDERR $file;"-Trick did it .. I was able to find a mdwn-file, that (was generated by a script of me) had \0xAB in it. + +Nevertheless I still wonder if this should be a problem. This character happend to be in an *\[\[meta title='$CHAR'\]\]-tag* and an *\[$CHAR\]http://foo)-Link* + +Should this throw an warning? Maybe this warning could be catched an reported inclusively the containing filename? maybe even with an override, if one knows that it is correct that way? --[[jwalzer]] + +[[!tag solved]] diff --git a/doc/forum/web_service_API__44___fastcgi_support.mdwn b/doc/forum/web_service_API__44___fastcgi_support.mdwn new file mode 100644 index 000000000..4a78fb932 --- /dev/null +++ b/doc/forum/web_service_API__44___fastcgi_support.mdwn @@ -0,0 +1,13 @@ +This is a half-baked thought of mine so I thought I would post it in forum for discussion. 
+ +There are some things that ikiwiki.cgi is asked to do which do not involve changing the repository: these include form generation, handling logins, the "goto" from [[recentchanges]], edit previews, etc. + +For one thing I am working on slowly ([[todo/interactive todo lists]]), I've hit a situation where I am likely to need to implement doing markup evaluation for a subset of a page. The problem I face is, if a user edits content in the browser, markup, ikiwiki directives etc. need to be expanded. I could possibly do this with a round-trip through edit preview, but that would be for the whole content of a page, and I hit the problem with editing a list item. + +> (slight addendum on this front. I'm planning to split the javascript code for interactive todo lists into two parts: one for handling round trips of content to and from ikiwiki.cgi, and the various failure modes that might occur (permission denied, edit conflicts, login required, etc.) ; then the list-specific stuff can build on top of this. The first chunk might be reusable by others for other AJAXY-edit fu.) + +Anyway - I've realised that a big part of the interactive todo lists stuff is trying to handle round trips to ikiwiki.cgi through javascript. A web services API would make handling the various conditions of this easier (e.g. need to login, login failed, etc.). I'm not sure what else might benefit from a web services API and I have no real experience of either using or writing them so I don't know what pros/cons there are for REST vs SOAP etc. + +Second, and in a way related, I've been mooting hacking fastcgi support into ikiwiki. Essentially one ikiwiki.cgi process would persist and serve CGI-ish requests on stdin/stdout. The initial content-scanning and dependency generation would happen once and not need to be repeated for future requests. Although, all state-changing operations would need to be careful to ensure the in-memory models were accurate. Also, I don't know how suited the data structures would be for persistence, since the current model is build em up, throw em away, they might not be space-efficient enough for persistence. + +If I did attempt this, I would want to avoid restructuring things in a way which would impair ikiwiki's core philosophy of being a static compiler. -- [[Jon]] diff --git a/doc/forum/what_is_the_easiest_way_to_implement_order:_disallow_all__44___allow_chosen__95__few_page_editing_policy__63__.mdwn b/doc/forum/what_is_the_easiest_way_to_implement_order:_disallow_all__44___allow_chosen__95__few_page_editing_policy__63__.mdwn new file mode 100644 index 000000000..fbc5c58e2 --- /dev/null +++ b/doc/forum/what_is_the_easiest_way_to_implement_order:_disallow_all__44___allow_chosen__95__few_page_editing_policy__63__.mdwn @@ -0,0 +1,3 @@ +As in title, I'd like to allow editing only some pages on my wiki. Rest by default is not editable by users except admin. Thanks + +> See [[plugins/lockedit]]. --[[schmonz]] diff --git a/doc/forum/where_are_the_tags.mdwn b/doc/forum/where_are_the_tags.mdwn new file mode 100644 index 000000000..ecb49fe43 --- /dev/null +++ b/doc/forum/where_are_the_tags.mdwn @@ -0,0 +1,9 @@ +Where is the tag cloud/tag listing of all the tags used in this wiki? I know we +have tags enabled. --[[jerojasro]] + +> This wiki does not use one global toplevel set of tags (`tagbase` is not +> set). +> +> There are tags used for the [[plugins]], and a tag cloud of those +> there. [[wishlist]] and [[patch]] are tags too, but I don't see the point +> of a tag cloud for such tags. 
--[[Joey]] diff --git a/doc/forum/wiki_clones_on_dynamic_IPs.mdwn b/doc/forum/wiki_clones_on_dynamic_IPs.mdwn new file mode 100644 index 000000000..f69f6501e --- /dev/null +++ b/doc/forum/wiki_clones_on_dynamic_IPs.mdwn @@ -0,0 +1,10 @@ +OK, this is not really a ikiwiki problem... but ikiwiki makes wiki clones +really easy to setup, and this is related to having a website cloned at +different places and pulling from the servers which are online. + +My setup is like this: I have a server at home and another at my dorm +which will serve as a wiki clone. Each of them has a dynamic IP with DynDNS +set up. Now the problem lies in linking my domain to these two DynDNS addresses. +Multiple CNAMEs are not supported, and I don't know if there's any utility +which can update the A records on the DNS to point directly point to the +two separate IPs. diff --git a/doc/forum/wiki_name_in_page_titles.mdwn b/doc/forum/wiki_name_in_page_titles.mdwn index 01ff8d817..4e9e51835 100644 --- a/doc/forum/wiki_name_in_page_titles.mdwn +++ b/doc/forum/wiki_name_in_page_titles.mdwn @@ -23,4 +23,10 @@ that provides a `IS_HOMEPAGE` template variable? --[[JasonBlevins]] >> few other small plugins brewing so I'll try to put up some contrib >> pages for them soon. --[[JasonBlevins]] -[path]: http://code.jblevins.org/ikiwiki/plugins.git/plain/path.pm +[path]: http://jblevins.org/git/ikiwiki/plugins.git/plain/path.pm + +> I used the following trick in some page.tmpl: +> +> <title><TMPL_VAR WIKINAME><TMPL_IF NAME="PARENTLINKS">: <TMPL_VAR TITLE></TMPL_IF></title> +> +> --[[JeanPrivat]] diff --git a/doc/forum/wishlist-discussion:_Editformular_showing_existing_tags.mdwn b/doc/forum/wishlist-discussion:_Editformular_showing_existing_tags.mdwn new file mode 100644 index 000000000..49c55e20e --- /dev/null +++ b/doc/forum/wishlist-discussion:_Editformular_showing_existing_tags.mdwn @@ -0,0 +1,15 @@ +# How about: + +having a list of all existing tags in the Edit-Formular as a selectionbox? + +Assume I have tagbase=/tags/ and for every tag I have given to articles an existing page there. + +Would it be possible to list all these tags together with the Formular, as selectionbox. +Maybe even with parsing of the content and preselecting the tags, that are given in the article and vice-versa when selecting the fields then also generating the \[\[\!tag\]\]-sourcecode ? + +this would need a bit JS-work and somehow on compiletime we need to put the list of tags somewhere, where the cgi could read them from. +This way, even a pagespec would suffice to determine the usable list of tags and not only the tagbase-variable. + +> I think this would be very hard to achieve with the current tag plugin, due to the nature of its implementation. +> +> I've had a "tag2" plugin on the go for a while which supports this. It's in a very rough stage but I'll try to find it and upload it somewhere. -- [[Jon]] diff --git a/doc/freesoftware.mdwn b/doc/freesoftware.mdwn index 7ac1ac6b4..2243d9b1f 100644 --- a/doc/freesoftware.mdwn +++ b/doc/freesoftware.mdwn @@ -4,7 +4,7 @@ ikiwiki, and this documentation wiki, are licensed under the terms of the GNU [[GPL]], version 2 or later. 
The parts of ikiwiki that become part of your own wiki (the [[basewiki]] -pages (but not the smilies) and the [[templates|wikitemplates]]) are licensed +pages (but not the smilies) and the [[templates]]) are licensed as follows: > Redistribution and use in source and compiled forms, with or without diff --git a/doc/git.mdwn b/doc/git.mdwn index 18fbd238b..88608d4b5 100644 --- a/doc/git.mdwn +++ b/doc/git.mdwn @@ -26,13 +26,16 @@ be browsed, subscribed to etc on its You are of course free to set up your own ikiwiki git repository with your own [[patches|patch]]. If you list it here, the `gitremotes` script will automatically add it to git remotes. Your repo will automatically be pulled -into [[Joey]]'s working tree. This is recommended. :-) +into [[Joey]]'s working repository where he can see your branches and +think about merging them. This is recommended. :-) <!-- Machine-parsed format: * wikilink <git:url> --> * github `git://github.com/joeyh/ikiwiki.git` ([browse](http://github.com/joeyh/ikiwiki/tree/master)) A mirror of the main repo, automatically updated. +* l10n `git://l10n.ikiwiki.info/` + Open push localization branch used for <http://l10n.ikiwiki.info/> * [[smcv]] `git://git.pseudorandom.co.uk/git/smcv/ikiwiki.git` * [[intrigeri]] `git://gaffer.ptitcanardnoir.org/ikiwiki.git` * [[gmcmanus]] `git://github.com/gmcmanus/ikiwiki.git` @@ -46,21 +49,41 @@ into [[Joey]]'s working tree. This is recommended. :-) * [[arpitjain]] `git://github.com/arpitjain11/ikiwiki.git` * [[chrysn]] `git://github.com/github076986099/ikiwiki.git` * [[simonraven]] `git://github.com/kjikaqawej/ikiwiki-simon.git` +* [[schmonz]] `git://github.com/schmonz/ikiwiki.git` +* [[will]] `http://www.cse.unsw.edu.au/~willu/ikiwiki.git` +* [[kaizer]] `git://github.com/engla/ikiwiki.git` +* [[bbb]] `http://git.boulgour.com/bbb/ikiwiki.git` +* [[KathrynAndersen]] `git://github.com/rubykat/ikiplugins.git` +* [[ktf]] `git://github.com/ktf/ikiwiki.git` +* [[tove]] `git://github.com/tove/ikiwiki.git` +* [[GiuseppeBilotta]] `git://git.oblomov.eu/ikiwiki` +* [[roktas]] `git://github.com/roktas/ikiwiki.git` +* [[davrieb|David_Riebenbauer]] `git://git.liegesta.at/git/ikiwiki` + ([browse](http://git.liegesta.at/?p=ikiwiki.git;a=summary)) +* [[GustafThorslund]] `http://gustaf.thorslund.org/src/ikiwiki.git` +* [[peteg]] `git://git.hcoop.net/git/peteg/ikiwiki.git` +* [[privat]] `git://github.com/privat/ikiwiki.git` +* [[blipvert]] `git://github.com/blipvert/ikiwiki.git` +* [[bzed|BerndZeimetz]] `git://git.recluse.de/users/bzed/ikiwiki.git` ## branches +In order to refer to a branch in one of the above git repositories, for +example when submitting a [[patch]], you can use the +[[templates/gitbranch]] template. + Some of the branches included in the main repository include: * `gallery` contains the [[todo/Gallery]] plugin. It's not yet merged due to license issues. Also some bits need to be tweaked to make it work with the current *master* branch again. -* `html` is an unfinished attempt at making ikiwiki output HTML 4.01 - instead of xhtml. * `wikiwyg` adds [[todo/wikiwyg]] support. It is unmerged pending some changes. * `debian-stable` is used for updates to the old version included in Debian's stable release, and `debian-testing` is used for updates to - Debian's testing release. + Debian's testing release. (These and similar branches will be rebased.) +* `ignore` gets various branches merged to it that Joey wishes to ignore + when looking at everyone's unmerged changes. 
* `pristine-tar` contains deltas that [pristine-tar](http://kitenet.net/~joey/code/pristine-tar) can use to recreate released tarballs of ikiwiki diff --git a/doc/ikiwiki-calendar.mdwn b/doc/ikiwiki-calendar.mdwn new file mode 100644 index 000000000..03cbdd86c --- /dev/null +++ b/doc/ikiwiki-calendar.mdwn @@ -0,0 +1,53 @@ +# NAME + +ikiwiki-calendar - create calendar archive pages + +# SYNOPSIS + +ikiwiki-calendar [-f] your.setup [pagespec] [startyear [endyear]] + +# DESCRIPTION + +`ikiwiki-calendar` creates pages that use the [[ikiwiki/directive/calendar]] +directive, allowing the archives to be browsed one month +at a time, with calendar-based navigation. + +You must specify the setup file for your wiki. The pages will +be created inside its `srcdir`, beneath the `archivebase` +directory used by the calendar plugin (default "archives"). + +To control which pages are included on the calendars, +a [[ikiwiki/PageSpec]] can be specified. The default is +all pages, or the pages specified by the `comments_pagespec` +setting in the config file. A pagespec can also be specified +on the command line. To limit it to only posts in a blog, +use something like "posts/* and !*/Discussion". + +It defaults to creating calendar pages for the current +year. If you specify a year, it will create pages for that year. +Specify a second year to create pages for a span of years. + +Existing pages will not be overwritten by this command by default. +Use the `-f` switch to force it to overwrite any existing pages. + +# CRONTAB + +While this command only needs to be run once a year to update +the archive pages for each new year, you are recommended to set up +a cron job to run it daily, at midnight. Then it will also update +the calendars to highlight the current day. + +An example crontab: + + 0 0 * * * ikiwiki-calendar ~/ikiwiki.setup "posts/* and !*/Discussion" + +# TEMPLATES + +This command uses two [[templates]] to generate +the pages, `calendarmonth.tmpl` and `calendaryear.tmpl`. + +# AUTHOR + +Joey Hess <joey@ikiwiki.info> + +Warning: this page is automatically made into ikiwiki-calendar's man page, edit with care diff --git a/doc/ikiwiki-makerepo.mdwn b/doc/ikiwiki-makerepo.mdwn index 13f88dc27..acb1211de 100644 --- a/doc/ikiwiki-makerepo.mdwn +++ b/doc/ikiwiki-makerepo.mdwn @@ -4,7 +4,7 @@ ikiwiki-makerepo - check an ikiwiki srcdir into revision control # SYNOPSIS -ikiwiki-makerepo svn|git|monotone|darcs srcdir repository +ikiwiki-makerepo git|svn|monotone|darcs|cvs srcdir repository ikiwiki-makerepo bzr|mercurial srcdir @@ -24,6 +24,14 @@ ikiwiki wrapper. Note that for monotone, you are assumed to already have run "mtn genkey" to generate a key. + +# EXAMPLE + +`ikiwiki-makerepo git /var/www/wiki /home/user/wiki/` + +The above command creates a new git repo in /home/user/wiki as well as a new git repo in the /var/www/wiki directory. +It then initializes the /home/user/wiki git repo and makes the /var/www/wiki a clone. + # AUTHOR Joey Hess <joey@ikiwiki.info> diff --git a/doc/ikiwiki-makerepo/discussion.mdwn b/doc/ikiwiki-makerepo/discussion.mdwn new file mode 100644 index 000000000..660823fbb --- /dev/null +++ b/doc/ikiwiki-makerepo/discussion.mdwn @@ -0,0 +1 @@ +Not sure how well I described the example. 
:-/ -- Jeremiah diff --git a/doc/ikiwiki.mdwn b/doc/ikiwiki.mdwn index e0a971d96..4d840696c 100644 --- a/doc/ikiwiki.mdwn +++ b/doc/ikiwiki.mdwn @@ -14,3 +14,4 @@ Some documentation on using ikiwiki: * [[ikiwiki/markdown]] * [[ikiwiki/openid]] * [[ikiwiki/searching]] +* [[templates]] diff --git a/doc/ikiwiki/directive/brokenlinks/discussion.mdwn b/doc/ikiwiki/directive/brokenlinks/discussion.mdwn new file mode 100644 index 000000000..34760584d --- /dev/null +++ b/doc/ikiwiki/directive/brokenlinks/discussion.mdwn @@ -0,0 +1,3 @@ +Would it be possible to have such a thing also checking for external links? -- [[user/emptty]] + +> I guess this is not very practical. For internal wiki links, we only need to check if the linked source file exist or not. But for external links, we have to ping every links, which will slow down the build process a lot. --[[weakish]] diff --git a/doc/ikiwiki/directive/calendar.mdwn b/doc/ikiwiki/directive/calendar.mdwn index 8a257d6eb..cb40f884e 100644 --- a/doc/ikiwiki/directive/calendar.mdwn +++ b/doc/ikiwiki/directive/calendar.mdwn @@ -1,5 +1,4 @@ The `calendar` directive is supplied by the [[!iki plugins/calendar desc=calendar]] plugin. -This plugin requires extra setup. See the plugin documentation for details. This directive displays a calendar, similar to the typical calendars shown on some blogs. @@ -12,16 +11,28 @@ some blogs. \[[!calendar type="year" year="2005" pages="blog/* and !*/Discussion"]] +## setup + The calendar is essentially a fancy front end to archives of previous pages, usually used for blogs. It can produce a calendar for a given month, -or a list of months for a given year. +or a list of months for a given year. The month format calendar simply +links to any page posted on each day of the month. The year format calendar +links to archive pages, with names like `archives/2007` (for all of 2007) +and `archives/2007/01` (for January, 2007). + +While you can insert calendar directives anywhere on your wiki, including +in the sidebar, you'll also need to create these archive pages. They +typically use this directive to display a calendar, and also use [[inline]] +to display or list pages created in the given time frame. + +The `ikiwiki-calendar` command can be used to automatically generate the +archive pages. It also refreshes the wiki, updating the calendars to +highlight the current day. This command is typically run at midnight from +cron. + +An example crontab: -The month format calendar simply links to any page posted on each -day of the month. The year format calendar links to archive pages, with -names like `archives/2007` (for all of 2007) and `archives/2007/01` -(for January, 2007). For this to work, you'll need to create these archive -pages. They typically use [[inline]] to display or list pages created in -the given time frame. + 0 0 * * * ikiwiki-calendar ~/ikiwiki.setup "posts/* and !*/Discussion" ## usage @@ -29,18 +40,21 @@ the given time frame. "month" or "year". The default is a month view calendar. * `pages` - Specifies the [[ikiwiki/PageSpec]] of pages to link to from the month calendar. Defaults to "*". -* `archivebase` - Configures the base of the archives hierarchy. The - default is "archives". Note that this default can also be overridden +* `archivebase` - Configures the base of the archives hierarchy. + The default is "archives". Note that this default can also be overridden for the whole wiki by setting `archivebase` in ikiwiki's setup file. 
+ Calendars link to pages under here, with names like "2010/04" and + "2010". These pages can be automatically created using the + `ikiwiki-calendar` program. * `year` - The year for which the calendar is requested. Defaults to the - current year. + current year. Can also use -1 to refer to last year, and so on. * `month` - The numeric month for which the calendar is requested, in the range 1..12. Used only for the month view calendar, and defaults to the - current month. + current month. Can also use -1 to refer to last month, and so on. * `week_start_day` - A number, in the range 0..6, which represents the day of the week that the month calendar starts with. 0 is Sunday, 1 is Monday, and so on. Defaults to 0, which is Sunday. -* `months_per_row` - In the annual calendar, number of months to place in +* `months_per_row` - In the year calendar, number of months to place in each row. Defaults to 3. [[!meta robots="noindex, follow"]] diff --git a/doc/ikiwiki/directive/comment.mdwn b/doc/ikiwiki/directive/comment.mdwn index 21386dfc3..398130e2e 100644 --- a/doc/ikiwiki/directive/comment.mdwn +++ b/doc/ikiwiki/directive/comment.mdwn @@ -29,10 +29,12 @@ metadata of the comment. nearly any format, since it's parsed by [[!cpan TimeDate]] * `username` - Used to record the username (or OpenID) of a logged in commenter. +* `nickname` - Name to display for a logged in commenter. + (Optional; used for OpenIDs.) * `ip` - Can be used to record the IP address of a commenter, if they posted anonymously. * `claimedauthor` - Records the name that the user entered, - if anonmous commenters are allowed to enter their (unverified) + if anonymous commenters are allowed to enter their (unverified) name. [[!meta robots="noindex, follow"]] diff --git a/doc/ikiwiki/directive/commentmoderation.mdwn b/doc/ikiwiki/directive/commentmoderation.mdwn new file mode 100644 index 000000000..8553b5b17 --- /dev/null +++ b/doc/ikiwiki/directive/commentmoderation.mdwn @@ -0,0 +1,9 @@ +The `commentmoderation` directive is supplied by the +[[!iki plugins/comments desc=comments]] plugin, and is used to link +to the comment moderation queue. + +Example: + + \[[!commentmoderation desc="here is the comment moderation queue"]] + +[[!meta robots="noindex, follow"]] diff --git a/doc/ikiwiki/directive/date.mdwn b/doc/ikiwiki/directive/date.mdwn new file mode 100644 index 000000000..b89241e4c --- /dev/null +++ b/doc/ikiwiki/directive/date.mdwn @@ -0,0 +1,16 @@ +The `date` directive is supplied by the [[!iki plugins/date desc=date]] plugin. + +This directive can be used to display a date on a page, using the same +display method that is used to display the modification date in the page +footer, and other dates in the wiki. This can be useful for consistency +of display, or if you want to embed parseable dates into the page source. + +Like the dates used by the [[meta]] directive, the date can be entered in +nearly any format, since it's parsed by [[!cpan TimeDate]]. + +For example, an update to a page with an embedded date stamp could look +like: + + Updated \[[!date "Wed, 25 Nov 2009 01:11:55 -0500"]]: mumble mumble + +[[!meta robots="noindex, follow"]] diff --git a/doc/ikiwiki/directive/edittemplate.mdwn b/doc/ikiwiki/directive/edittemplate.mdwn index d731bdb47..c486e821b 100644 --- a/doc/ikiwiki/directive/edittemplate.mdwn +++ b/doc/ikiwiki/directive/edittemplate.mdwn @@ -21,7 +21,7 @@ something like: Details: The template page can also contain [[!cpan HTML::Template]] directives, -similar to other ikiwiki [[templates]]. 
Currently only one variable is +like other ikiwiki [[templates]]. Currently only one variable is set: `<TMPL_VAR name>` is replaced with the name of the page being created. diff --git a/doc/ikiwiki/directive/flattr.mdwn b/doc/ikiwiki/directive/flattr.mdwn new file mode 100644 index 000000000..5083005ce --- /dev/null +++ b/doc/ikiwiki/directive/flattr.mdwn @@ -0,0 +1,45 @@ +The `flattr` directive is supplied by the [[!iki plugins/flattr desc=flattr]] plugin. + +This directive allows easily inserting Flattr buttons onto wiki pages. + +Flattr supports both static buttons and javascript buttons. This directive +only creates dynamic javascript buttons. If you want to insert a static +Flattr button, you can simply copy the html code for it from Flattr, instead. +Note that this directive inserts javascript code into the page, that +loads more javascript code from Flattr.com. So only use it if you feel +comfortable with that. + +The directive can be used to display a button for a thing you have already +manually submitted to Flattr. In this mode, the only parameter you need to +include is the exact url to the thing that was submitted to Flattr. +(If the button is for the current page, you can leave that out.) For +example, this is the Flattr button for ikiwiki. Feel free to add it to all +your pages. ;) + + \[[!flattr url="http://ikiwiki.info/" button=compact]] + +The directive can also be used to create a button that automatically +submits a page to Flattr when a user clicks on it. In this mode you +need to include parameters to specify your uid, and a title, category, tags, +and description for the page. For example, this is a Flattr button for +a blog post: + + \[[!flattr uid=25634 title="my new blog post" category=text + tags="blog,example" description="This is a post on my blog."]] + +Here are all possible parameters you can pass to the Flattr directive. + +* `button` - Set to "compact" for a small button. +* `url` - The url to the thing to be Flattr'd. If omitted, defaults + to the url of the current page. +* `uid` - Your numeric Flattr userid. Not needed if the flattr plugin + has been configured with a global `flattr_userid`. +* `title` - A short title for the thing, to show on its Flattr page. +* `description` - A description of the thing, to show on its Flattr + page. +* `category` - One of: text, images, video, audio, software, rest. +* `tags` - A list of tags separated by a comma. +* `language` - A language code. +* `hidden` - Set to 1 to hide the button from listings on Flattr.com. + +[[!meta robots="noindex, follow"]] diff --git a/doc/ikiwiki/directive/format.mdwn b/doc/ikiwiki/directive/format.mdwn index 23830e9cd..7d11d225f 100644 --- a/doc/ikiwiki/directive/format.mdwn +++ b/doc/ikiwiki/directive/format.mdwn @@ -22,7 +22,7 @@ Note that if the highlight plugin is enabled, this directive can also be used to display syntax highlighted code. Many languages and formats are supported. For example: - \[[format perl """ + \[[!format perl """ print "hello, world\n"; """]] diff --git a/doc/ikiwiki/directive/img.mdwn b/doc/ikiwiki/directive/img.mdwn index 66efd008e..cda62b58f 100644 --- a/doc/ikiwiki/directive/img.mdwn +++ b/doc/ikiwiki/directive/img.mdwn @@ -18,11 +18,12 @@ making the image smaller than the specified size. You can also specify only the width or the height, and the other value will be calculated based on it: "200x", "x200" -You can also pass `alt`, `title`, `class`, `align` and `id` parameters. 
+You can also pass `alt`, `title`, `class`, `align`, `id`, `hspace`, and +`vspace` parameters. These are passed through unchanged to the html img tag. If you include a `caption` parameter, the caption will be displayed centered beneath the image. -The `link` parameter is used to control whether the scaled down image links +The `link` parameter is used to control whether the scaled image links to the full size version. By default it does; set "link=somepage" to link to another page instead, or "link=no" to disable the link, or "link=http://url" to link to a given url. diff --git a/doc/ikiwiki/directive/inline.mdwn b/doc/ikiwiki/directive/inline.mdwn index 9c55e07c2..c6a23ce3c 100644 --- a/doc/ikiwiki/directive/inline.mdwn +++ b/doc/ikiwiki/directive/inline.mdwn @@ -56,7 +56,7 @@ directive. These are the commonly used ones: * `postform` - Set to "yes" to enable a form to post new pages to a blog. * `postformtext` - Set to specify text that is displayed in a postform. -* `rootpage` - Enable the postform, and allows controling where +* `rootpage` - Enables the postform, and allows controling where newly posted pages should go, by specifiying the page that they should be a [[SubPage]] of. @@ -86,19 +86,15 @@ Here are some less often needed parameters: if raw is set to "yes", the page will be included raw, without additional markup around it, as if it were a literal part of the source of the inlining page. -* `sort` - Controls how inlined pages are sorted. The default, "age" is to - sort newest created pages first. Setting it to "title" will sort pages by - title, and "mtime" sorts most recently modified pages first. If - [[!cpan Sort::Naturally]] is installed, `sort` can be set to "title_natural" - to sort by title with numbers treated as such ("1 2 9 10 20" instead of - "1 10 2 20 9"). +* `sort` - Controls how inlined pages are [[sorted|pagespec/sorting]]. + The default is to sort the newest created pages first. * `reverse` - If set to "yes", causes the sort order to be reversed. * `feedshow` - Specify the maximum number of matching pages to include in the rss/atom feeds. The default is the same as the `show` value above. * `feedonly` - Only generate the feed, do not display the pages inline on the page. * `quick` - Build archives in quick mode, without reading page contents for - metadata. By default, this also turns off generation of any feeds. + metadata. This also turns off generation of any feeds. * `timeformat` - Use this to specify how to display the time or date for pages in the blog. The format string is passed to the strftime(3) function. * `feedpages` - A [[PageSpec]] of inlined pages to include in the rss/atom diff --git a/doc/ikiwiki/directive/inline/discussion.mdwn b/doc/ikiwiki/directive/inline/discussion.mdwn index be0665d04..6a186cd93 100644 --- a/doc/ikiwiki/directive/inline/discussion.mdwn +++ b/doc/ikiwiki/directive/inline/discussion.mdwn @@ -1,3 +1,10 @@ +## Combine inline and toggle + +Is it possible to combine the behaviour of toggle and inline? ie, have it present of list of 'headlines' which are created from seperate subpages which can be clicked to expand to the body of the inlined page. Thanks. + +-- Thiana + +--- ## How do you provide the per post discussion links in your own blog? > That's configured by the "actions" parameter to the inline directive. See @@ -124,3 +131,23 @@ My index page has: Else can you please suggest a smarter way of getting certain data out from pages for a inline index? 
--[[hendry]] + +--- + +## Interaction of `show` and `feedshow` + +Reading the documentation I would think that `feedshow` does not +influence `show`. + + \[[!inline pages="./blog/*" archive=yes quick=yes feedshow=10 sort=title reverse=yes]] + +Only ten pages are listed in this example although `archive` is set to +yes. Removing `feedshow=10` all matching pages are shown. + +Is that behaviour intended? + +> Is something going wrong because `quick="yes"` [[»turns off generation of any feeds«|inline]]? --[[PaulePanter]] + +--[[PaulePanter]] + +>> Bug was that if feedshow was specified without show it limited to it incorrectly. Fixed. --[[Joey]] diff --git a/doc/ikiwiki/directive/linkmap.mdwn b/doc/ikiwiki/directive/linkmap.mdwn index db79a1491..baa6fff61 100644 --- a/doc/ikiwiki/directive/linkmap.mdwn +++ b/doc/ikiwiki/directive/linkmap.mdwn @@ -7,11 +7,7 @@ graph showing the links between a set of pages in the wiki. Example usage: Only links between mapped pages will be shown; links pointing to or from unmapped pages will be omitted. If the pages to include are not specified, -the links between all pages (and other files) in the wiki are mapped. For -best results, only a small set of pages should be mapped, since otherwise -the map can become very large, unwieldy, and complicated. Also, the map is -rebuilt whenever one of the mapped pages is changed, which can make the -wiki a bit slow. +the links between all pages (and other files) in the wiki are mapped. Here are descriptions of all the supported parameters to the `linkmap` directive: @@ -20,5 +16,14 @@ directive: * `height`, `width` - Limit the size of the map to a given height and width, in inches. Both must be specified for the limiting to take effect, otherwise the map's size is not limited. +* `connected` - Controls whether to include pages on the map that link to + no other pages (connected=no, the default), or to only show pages that + link to others (connected=yes). + +For best results, only a small set of pages should be mapped, since +otherwise the map can become very large, unwieldy, and complicated. +If too many pages are included, the map may get so large that graphviz +cannot render it. Using the `connected` parameter is a good way to prune +out pages that clutter the map. [[!meta robots="noindex, follow"]] diff --git a/doc/ikiwiki/directive/map.mdwn b/doc/ikiwiki/directive/map.mdwn index 09c95a0c9..4b6499547 100644 --- a/doc/ikiwiki/directive/map.mdwn +++ b/doc/ikiwiki/directive/map.mdwn @@ -13,6 +13,8 @@ the [[meta]] directive). For example: \[[!map pages="* and !blog/* and !*/Discussion" show=title]] + \[[!map pages="* and !blog/* and !*/Discussion" show=description]] + Hint: To limit the map to displaying pages less than a certain level deep, use a [[ikiwiki/PageSpec]] like this: `pages="* and !*/*/*"` diff --git a/doc/ikiwiki/directive/map/discussion.mdwn b/doc/ikiwiki/directive/map/discussion.mdwn index 062b4267a..b7ac17b1a 100644 --- a/doc/ikiwiki/directive/map/discussion.mdwn +++ b/doc/ikiwiki/directive/map/discussion.mdwn @@ -1,3 +1,14 @@ +### Sorting + +Is there a way to have the generated maps sorted by *title* instead of *filename* when show=title is used? +Thanks + +-- Thiana + +> [[bugs/map_sorts_by_pagename_and_not_title_when_show__61__title_is_used]] --[[Joey]] + +---- + Question: Is there a way to generate a listing that shows *both* title and description meta information? Currently, a \[\[!map ...]] shows only one of the two, but I'd like to generate a navigation that looks like a description list. 
For example: * This is the title meta information. @@ -72,3 +83,15 @@ Is there any way to do that? I don't mind mucking around with `\[[!meta]]` on e >>> I think that the ideas and code in >>> [[todo/tracking_bugs_with_dependencies]] might also handle this case. >>> --[[Joey]] + +---- + +I feel like this should be obvious, but I can't figure out how to sort numerically. + +I have `map pages="./* and !*/Discussion and !*/sidebar"` and a bunch of pages with names like 1, 2, 3, 11, 12, 1/1.1, 12/12.3 etc. I want to sort them numerically. I see lots of conversation implying there's a simple way to do it, but not how. + +> No, you can't: map can't currently use a non-default sort order. If it +> could, then you could use [[plugins/sortnaturally]]. There's a +> [[feature_request|todo/sort_parameter_for_map_plugin_and_directive]]; +> [[a_bug_references_it|bugs/map_sorts_by_pagename_and_not_title_when_show=title_is_used]]. +> --[[smcv]] diff --git a/doc/ikiwiki/directive/meta.mdwn b/doc/ikiwiki/directive/meta.mdwn index 000f461c9..1f5bde964 100644 --- a/doc/ikiwiki/directive/meta.mdwn +++ b/doc/ikiwiki/directive/meta.mdwn @@ -23,6 +23,13 @@ Supported fields: be set to a true value in the template; this can be used to format things differently in this case. + An optional `sortas` parameter will be used preferentially when + [[ikiwiki/pagespec/sorting]] by `meta(title)`: + + \[[!meta title="The Beatles" sortas="Beatles, The"]] + + \[[!meta title="David Bowie" sortas="Bowie, David"]] + * license Specifies a license for the page, for example, "GPL". Can contain @@ -37,14 +44,19 @@ Supported fields: Specifies the author of a page. + An optional `sortas` parameter will be used preferentially when + [[ikiwiki/pagespec/sorting]] by `meta(author)`: + + \[[!meta author="Joey Hess" sortas="Hess, Joey"]] + * authorurl Specifies an url for the author of a page. * description - Specifies a "description" of the page. You could use this to provide - a summary, for example, to be picked up by the [[map]] directive. + Specifies a short description for the page. This will be put in + the html header, and can also be displayed by eg, the [[map]] directive. * permalink @@ -79,7 +91,7 @@ Supported fields: Example: - \\[[!meta openid="http://joeyh.myopenid.com/" + \[[!meta openid="http://joeyh.myopenid.com/" server="http://www.myopenid.com/server" xrds-location="http://www.myopenid.com/xrds?username=joeyh.myopenid.com""]] diff --git a/doc/ikiwiki/directive/pagestats.mdwn b/doc/ikiwiki/directive/pagestats.mdwn index 426f3e4af..d0e0e7be7 100644 --- a/doc/ikiwiki/directive/pagestats.mdwn +++ b/doc/ikiwiki/directive/pagestats.mdwn @@ -4,22 +4,37 @@ This directive can generate stats about how pages link to each other. It can produce either a tag cloud, or a table counting the number of links to each page. -Here's how to use it to create a [[tag]] cloud: +Here's how to use it to create a [[tag]] cloud, with tags sized based +on frequency of use: \[[!pagestats pages="tags/*"]] +Here's how to create a list of tags, sized by use as they would be in a +cloud. + + \[[!pagestats style="list" pages="tags/*"]] + And here's how to create a table of all the pages on the wiki: \[[!pagestats style="table"]] -The optional `among` parameter limits counting to pages that match a -[[ikiwiki/PageSpec]]. For instance, to display a cloud of tags used on blog -entries, you could use: +The optional `among` parameter limits the pages whose outgoing links are +considered. 
For instance, to display a cloud of tags used on blog +entries, while ignoring other pages that use those tags, you could use: \[[!pagestats pages="tags/*" among="blog/posts/*"]] -or to display a cloud of tags related to Linux, you could use: +Or to display a cloud of tags related to Linux, you could use: \[[!pagestats pages="tags/* and not tags/linux" among="tagged(linux)"]] +The optional `show` parameter limits display to the specified number of +pages. For instance, to show a table of the top ten pages with the most +links: + + \[[!pagestats style="table" show="10"]] + +The optional `class` parameter can be used to control the class +of the generated tag cloud `div` or page stats `table`. + [[!meta robots="noindex, follow"]] diff --git a/doc/ikiwiki/directive/pagestats/discussion.mdwn b/doc/ikiwiki/directive/pagestats/discussion.mdwn index 3c9dc7104..99029e88e 100644 --- a/doc/ikiwiki/directive/pagestats/discussion.mdwn +++ b/doc/ikiwiki/directive/pagestats/discussion.mdwn @@ -7,4 +7,12 @@ I would rather not find and create a page for every tag I have created or will c Thanks ----- +> Hello unknown person. + +> I think it would require a different approach to what "tags" are, and/or what "pagestats" are. The pagestats plugin gives statistical information about *pages*, so it requires the pages in question to exist before it can get information about them. The tags plugin creates links to tag *pages*, with the expectation that a human being will create said pages and put whatever content they want on them (such as describing what the tag is about, and a map linking back to the tagged pages). + +> The approach that [PmWiki](http://www.pmwiki.org) takes is that it enables the optional auto-creation of (empty) pages which match a particular "group" (set of sub-pages); thus one could set all the "tags/*" pages to be auto-created, creating a new tags/foo page the first time the \[[!tag foo]] directive is used. See [[todo/auto-create_tag_pages_according_to_a_template]] for more discussion on this idea. +> -- [[KathrynAndersen]] + +> Update: Ikiwiki can auto-create tags now, though it only defaults to +> doing so when tagbase is set. --[[Joey]] diff --git a/doc/ikiwiki/directive/pagetemplate.mdwn b/doc/ikiwiki/directive/pagetemplate.mdwn index 0e4066f34..401b38099 100644 --- a/doc/ikiwiki/directive/pagetemplate.mdwn +++ b/doc/ikiwiki/directive/pagetemplate.mdwn @@ -1,15 +1,13 @@ The `pagetemplate` directive is supplied by the [[!iki plugins/pagetemplate desc=pagetemplate]] plugin. -This directive allows a page to be created using a different wikitemplates. +This directive allows a page to be displayed using a different +[[template|templates]] than the default `page.tmpl` template. + The page text is inserted into the template, so the template controls the overall look and feel of the wiki page. This is in contrast to the [[ikiwiki/directive/template]] directive, which allows inserting templates _into_ the body of a page. -This directive can only reference templates that are already installed -by the system administrator, typically into the -`/usr/share/ikiwiki/templates` directory. Example: - \[[!pagetemplate template="my_fancy.tmpl"]] [[!meta robots="noindex, follow"]] diff --git a/doc/ikiwiki/directive/postsparkline.mdwn b/doc/ikiwiki/directive/postsparkline.mdwn index a7cb6822a..318512eef 100644 --- a/doc/ikiwiki/directive/postsparkline.mdwn +++ b/doc/ikiwiki/directive/postsparkline.mdwn @@ -24,7 +24,7 @@ times of pages matched by the specified `pages` [[ikiwiki/PageSpec]]. 
A maximum of `max` data points will be generated. The `formula` parameter controls the formula used to generate data points. -Available forumlae: +Available formulae: * `interval` - The height of each point represents how long it has been since the previous post. @@ -32,10 +32,10 @@ Available forumlae: many posts were made that day. * `permonth` - Each point represents a month; the height represents how many posts were made that month. -* `peryear` - Each point represents a day; the height represents how +* `peryear` - Each point represents a year; the height represents how many posts were made that year. -The `time` parameter has a default value of "ctime", since forumae use +The `time` parameter has a default value of "ctime", since formulae use the creation times of pages by default. If you instead want them to use the modification times of pages, set it to "mtime". diff --git a/doc/ikiwiki/directive/sidebar.mdwn b/doc/ikiwiki/directive/sidebar.mdwn new file mode 100644 index 000000000..599695d22 --- /dev/null +++ b/doc/ikiwiki/directive/sidebar.mdwn @@ -0,0 +1,20 @@ +The `sidebar` directive is supplied by the [[!iki plugins/sidebar desc=sidebar]] plugin. + +This directive can specify a custom sidebar to display on the page, +overriding any sidebar that is displayed globally. + +If no custom sidebar content is specified, it forces the sidebar page to +be used as the sidebar, even if the `global_sidebars` setting has been +used to disable use of the sidebar page by default. + +## examples + + \[[!sidebar content=""" + This is my custom sidebar for this page. + + \[[!calendar pages="posts/*"]] + """]] + + \[[!sidebar]] + +[[!meta robots="noindex, follow"]] diff --git a/doc/ikiwiki/directive/sparkline.mdwn b/doc/ikiwiki/directive/sparkline.mdwn index 1f691e14f..e5a03f84e 100644 --- a/doc/ikiwiki/directive/sparkline.mdwn +++ b/doc/ikiwiki/directive/sparkline.mdwn @@ -2,7 +2,7 @@ The `sparkline` directive is supplied by the [[!iki plugins/sparkline desc=spark This directive allows for embedding sparklines into wiki pages. A sparkline is a small word-size graphic chart, that is designed to be -displayes alongside text. +displayed alongside text. # examples diff --git a/doc/ikiwiki/directive/table.mdwn b/doc/ikiwiki/directive/table.mdwn index e27a94b7c..a6692f92c 100644 --- a/doc/ikiwiki/directive/table.mdwn +++ b/doc/ikiwiki/directive/table.mdwn @@ -6,8 +6,8 @@ or DSV (delimiter-separated values) format. ## examples \[[!table data=""" - Customer|Amount - Fulanito|134,34 + Customer |Amount + Fulanito |134,34 Menganito|234,56 Menganito|234,56 """]] @@ -42,4 +42,9 @@ cells. For example: as the table header. Set it to "no" to make a table without a header, or "column" to make the first column be the header. +For tab-delimited tables (often obtained by copying and pasting from HTML +or a spreadsheet), `delimiter` must be set to a literal tab character. These +are difficult to type in most web browsers - copying and pasting one from +the table data is likely to be the easiest way. + [[!meta robots="noindex, follow"]] diff --git a/doc/ikiwiki/directive/tag.mdwn b/doc/ikiwiki/directive/tag.mdwn index 64736f8cd..c8d9b9816 100644 --- a/doc/ikiwiki/directive/tag.mdwn +++ b/doc/ikiwiki/directive/tag.mdwn @@ -19,7 +19,8 @@ instead: Note that if the wiki is configured to use a tagbase, then the tags will be located under a base directory, such as "tags/". This is a useful way to avoid having to write the full path to tags, if you want to keep them -grouped together out of the way. 
+grouped together out of the way. Also, since ikiwiki then knows where to put +tags, it will automatically create tag pages when new tags are used. Bear in mind that specifying a tagbase means you will need to incorporate it into the `link()` [[ikiwiki/PageSpec]] you use: e.g., if your tagbase is @@ -28,7 +29,7 @@ into the `link()` [[ikiwiki/PageSpec]] you use: e.g., if your tagbase is If you want to override the tagbase for a particular tag, you can use something like this: - \[[!tag ./foo]] + \[[!tag /foo]] \[[!taglink /foo]] [[!meta robots="noindex, follow"]] diff --git a/doc/ikiwiki/directive/template.mdwn b/doc/ikiwiki/directive/template.mdwn index d538b69b1..6c50fa32e 100644 --- a/doc/ikiwiki/directive/template.mdwn +++ b/doc/ikiwiki/directive/template.mdwn @@ -1,18 +1,87 @@ The `template` directive is supplied by the [[!iki plugins/template desc=template]] plugin. -[[Templates]] are files that can be filled out and inserted into pages in the -wiki, by using the template directive. The directive has an `id` parameter +The template directive allows wiki pages to be used as templates. +These templates can be filled out and inserted into other pages in the +wiki using the directive. The [[templates]] page lists templates +that can be used with this directive. + +The directive has an `id` parameter that identifies the template to use. The remaining parameters are used to fill out the template. -Example: +## Example \[[!template id=note text="""Here is the text to insert into my note."""]] This fills out the `note` template, filling in the `text` field with the specified value, and inserts the result into the page. -For a list of available templates, and details about how to create more, -see the [[templates]] page. +## Using a template + +Generally, a value can include any markup that would be allowed in the wiki +page outside the template. Triple-quoting the value even allows quotes to +be included in it. Combined with multi-line quoted values, this allows for +large chunks of marked up text to be embedded into a template: + + \[[!template id=foo name="Sally" color="green" age=8 notes=""" + * \[[Charley]]'s sister. + * "I want to be an astronaut when I grow up." + * Really 8 and a half. + """]] + +## Creating a template + +The template is a regular wiki page, located in the `templates/` +subdirectory inside the source directory of the wiki. + +(Alternatively, templates can be stored in a directory outside the wiki, +as files with the extension ".tmpl". +By default, these are searched for in `/usr/share/ikiwiki/templates`; +the `templatedir` setting can be used to make another directory be searched +first.) + +The template uses the syntax used by the [[!cpan HTML::Template]] perl +module, which allows for some fairly complex things to be done. Consult its +documentation for the full syntax, but all you really need to know are a +few things: + +* Each parameter you pass to the template directive will generate a + template variable. There are also some pre-defined variables like PAGE + and BASENAME. +* To insert the value of a variable, use `<TMPL_VAR variable>`. Wiki markup + in the value will first be converted to html. +* To insert the raw value of a variable, with wiki markup not yet converted + to html, use `<TMPL_VAR raw_variable>`. +* To make a block of text conditional on a variable being set use + `<TMPL_IF variable>text</TMPL_IF>`. 
+* To use one block of text if a variable is set and a second if it's not, + use `<TMPL_IF variable>text<TMPL_ELSE>other text</TMPL_IF>` + +Here's a sample template: + + <span class="infobox"> + Name: \[[<TMPL_VAR raw_name>]]<br /> + Age: <TMPL_VAR age><br /> + <TMPL_IF color> + Favorite color: <TMPL_VAR color><br /> + <TMPL_ELSE> + No favorite color.<br /> + </TMPL_IF> + <TMPL_IF notes> + <hr /> + <TMPL_VAR notes> + </TMPL_IF> + </span> + +The filled out template will be formatted the same as the rest of the page +that contains it, so you can include WikiLinks and all other forms of wiki +markup in the template. Note though that such WikiLinks will not show up as +backlinks to the page that uses the template. + +Note the use of "raw_name" inside the [[ikiwiki/WikiLink]] generator in the +example above. This ensures that if the name contains something that might +be mistaken for wiki markup, it's not converted to html before being +processed as a [[ikiwiki/WikiLink]]. + [[!meta robots="noindex, follow"]] diff --git a/doc/ikiwiki/directive/toc.mdwn b/doc/ikiwiki/directive/toc.mdwn index bf504dafc..bb1afa1ac 100644 --- a/doc/ikiwiki/directive/toc.mdwn +++ b/doc/ikiwiki/directive/toc.mdwn @@ -14,6 +14,12 @@ the `levels` parameter: The toc directive will take the level of the first header as the topmost level, even if there are higher levels seen later in the file. +To create a table of contents that only shows headers starting with a given +level, use the `startlevel` parameter. For example, to show only h2 and +smaller headers: + + \[[!toc startlevel=2]] + The table of contents will be created as an ordered list. If you want an unordered list instead, you can change the list-style in your local style sheet. diff --git a/doc/ikiwiki/formatting.mdwn b/doc/ikiwiki/formatting.mdwn index 2ed5cc26f..befbce9aa 100644 --- a/doc/ikiwiki/formatting.mdwn +++ b/doc/ikiwiki/formatting.mdwn @@ -7,7 +7,7 @@ called [[MarkDown]], and it works like this: Leave blank lines between paragraphs. -You can \**emphasise*\* or \*\***strongly emphasise**\*\* text by placing it +You can *\*emphasise\** or **\*\*strongly emphasise\*\*** text by placing it in single or double asterisks. To create a list, start each line with an asterisk: diff --git a/doc/ikiwiki/openid.mdwn b/doc/ikiwiki/openid.mdwn index a79655284..2fa972ede 100644 --- a/doc/ikiwiki/openid.mdwn +++ b/doc/ikiwiki/openid.mdwn @@ -9,16 +9,10 @@ that allows you to have one login that you can use on a growing number of websites. -To sign up for an OpenID, visit one of the following identity providers: +If you have an account with some of the larger web service providers, +you might already have an OpenID. +[Directory of OpenID providers](http://openiddirectory.com/openid-providers-c-1.html) -* [MyOpenID](https://www.myopenid.com/) -* [GetOpenID](https://getopenid.com/) -* [Videntity](http://videntity.org/) -* [LiveJournal](http://www.livejournal.com/openid/) -* [TrustBearer](https://openid.trustbearer.com/) -* or any of the [many others out there](http://openiddirectory.com/openid-providers-c-1.html) - -Your OpenID is the URL that you are given when you sign up. [[!if test="enabled(openid)" then=""" To sign in to this wiki using OpenID, just enter it in the OpenID field in the signin form. 
You do not need to give this wiki a password or go through any diff --git a/doc/ikiwiki/pagespec.mdwn b/doc/ikiwiki/pagespec.mdwn index 5f0f44e2e..c66395f84 100644 --- a/doc/ikiwiki/pagespec.mdwn +++ b/doc/ikiwiki/pagespec.mdwn @@ -24,31 +24,35 @@ match all pages except for Discussion pages and the SandBox: Some more elaborate limits can be added to what matches using these functions: +* "`glob(someglob)`" - matches pages and other files that match the given glob. + Just writing the glob by itself is actually a shorthand for this function. +* "`page(glob)`" - like `glob()`, but only matches pages, not other files * "`link(page)`" - matches only pages that link to a given page (or glob) * "`tagged(tag)`" - matches pages that are tagged or link to the given tag (or tags matched by a glob) * "`backlink(page)`" - matches only pages that a given page links to -* "`creation_month(month)`" - matches only pages created on the given month +* "`creation_month(month)`" - matches only files created on the given month * "`creation_day(mday)`" - or day of the month * "`creation_year(year)`" - or year -* "`created_after(page)`" - matches only pages created after the given page +* "`created_after(page)`" - matches only files created after the given page was created -* "`created_before(page)`" - matches only pages created before the given page +* "`created_before(page)`" - matches only files created before the given page was created -* "`glob(someglob)`" - matches pages that match the given glob. Just writing - the glob by itself is actually a shorthand for this function. * "`internal(glob)`" - like `glob()`, but matches even internal-use pages that globs do not usually match. * "`title(glob)`", "`author(glob)`", "`authorurl(glob)`", - "`license(glob)`", "`copyright(glob)`" - match pages that have the given - metadata, matching the specified glob. + "`license(glob)`", "`copyright(glob)`", "`guid(glob)`" + - match pages that have the given metadata, matching the specified glob. * "`user(username)`" - tests whether a modification is being made by a user with the specified username. If openid is enabled, an openid can also - be put here. + be put here. Glob patterns can be used in the username. For example, + to match all openid users, use `user(*://*)` * "`admin()`" - tests whether a modification is being made by one of the wiki admins. * "`ip(address)`" - tests whether a modification is being made from the specified IP address. +* "`comment(glob)`" - matches comments to a page matching the glob. +* "`comment_pending(glob)`" - matches unmoderated, pending comments. * "`postcomment(glob)`" - matches only when comments are being posted to a page matching the specified glob diff --git a/doc/ikiwiki/pagespec/attachment.mdwn b/doc/ikiwiki/pagespec/attachment.mdwn index 419f00ee4..fa2bc5867 100644 --- a/doc/ikiwiki/pagespec/attachment.mdwn +++ b/doc/ikiwiki/pagespec/attachment.mdwn @@ -7,11 +7,12 @@ If attachments are enabled, the wiki admin can control what types of attachments will be accepted, via the `allowed_attachments` configuration setting. 
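The PageSpec goes directly into the setup file as the value of `allowed_attachments`. A minimal sketch, assuming the usual Perl-style setup syntax and purely illustrative limits:

    allowed_attachments => 'mimetype(image/*) and maxsize(50kb)',
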
-For example, to limit arbitrary files to 50 kilobytes, but allow -larger mp3 files to be uploaded by joey into a specific directory, and -check all attachments for viruses, something like this could be used: +For example, to limit most users to uploading small images, and nothing else, +while allowing larger mp3 files to be uploaded by joey into a specific +directory, and check all attachments for viruses, something like this could be +used: - virusfree() and ((user(joey) and podcast/*.mp3 and mimetype(audio/mpeg) and maxsize(15mb)) or (!ispage() and maxsize(50kb))) + virusfree() and ((user(joey) and podcast/*.mp3 and mimetype(audio/mpeg) and maxsize(15mb)) or (mimetype(image/*) and maxsize(50kb))) The regular [[ikiwiki/PageSpec]] syntax is expanded with the following additional tests: diff --git a/doc/ikiwiki/pagespec/discussion.mdwn b/doc/ikiwiki/pagespec/discussion.mdwn index 4eed3722c..4c553925a 100644 --- a/doc/ikiwiki/pagespec/discussion.mdwn +++ b/doc/ikiwiki/pagespec/discussion.mdwn @@ -92,3 +92,14 @@ does not seem suitable for this, as > \[[!map pages="./*"]] also lists the current page and all its siblings. + +--- + +I am a little lost. I want to match the start page `/index.mdwn`. So I use + + \[[!inline pages="/index"]] + +which does not work though. I also tried it in this Wiki. Just take a look at the end of the [[SandBox|sandbox]]. --[[PaulePanter]] + +> Unlike wikilinks, pagespecs match relative to the top of the wiki by +> default. So lose the "/" and it will work. --[[Joey]] diff --git a/doc/ikiwiki/pagespec/po.mdwn b/doc/ikiwiki/pagespec/po.mdwn index e0264dd50..f9956404c 100644 --- a/doc/ikiwiki/pagespec/po.mdwn +++ b/doc/ikiwiki/pagespec/po.mdwn @@ -11,6 +11,13 @@ wiki: specified as a ISO639-1 (two-letter) language code. * "`currentlang()`" - tests whether a page is written in the same language as the current page. +* "`needstranslation()`" - tests whether a page needs translation + work. Only slave pages match this PageSpec. A minimum target + translation percentage can optionally be passed as an integer + parameter: "`needstranslation(50)`" matches only pages less than 50% + translated. Note that every non-po page is considered to be written in `po_master_language`, as specified in `ikiwiki.setup`. + +[[!meta robots="noindex, follow"]] diff --git a/doc/ikiwiki/pagespec/sorting.mdwn b/doc/ikiwiki/pagespec/sorting.mdwn new file mode 100644 index 000000000..ccd7f7eaa --- /dev/null +++ b/doc/ikiwiki/pagespec/sorting.mdwn @@ -0,0 +1,26 @@ +Some [[directives|ikiwiki/directive]] that use +[[PageSpecs|ikiwiki/pagespec]] allow +specifying the order that matching pages are shown in. The following sort +orders can be specified. + +* `age` - List pages from the most recently created to the oldest. + +* `mtime` - List pages with the most recently modified first. + +* `title` - Order by title (page name). +[[!if test="enabled(sortnaturally)" then=""" +* `title_natural` - Orders by title, but numbers in the title are treated + as such, ("1 2 9 10 20" instead of "1 10 2 20 9") +"""]] +[[!if test="enabled(meta)" then=""" +* `meta(title)` - Order according to the `\[[!meta title="foo" sortas="bar"]]` + or `\[[!meta title="foo"]]` [[ikiwiki/directive]], or the page name if no + full title was set. `meta(author)`, `meta(date)`, `meta(updated)`, etc. + also work. +"""]] + +In addition, you can combine several sort orders and/or reverse the order of +sorting, with a string like `age -title` (which would sort by age, then by +title in reverse order if two pages have the same age). 
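As a quick illustration (a hypothetical `inline` call, not part of the original page), such a combined order can be passed to a directive's `sort` parameter:

    \[[!inline pages="blog/*" sort="age -title" show=10]]
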
+ +[[!meta robots="noindex, follow"]] diff --git a/doc/ikiwiki/searching.mdwn b/doc/ikiwiki/searching.mdwn index 539e7193d..4c1287933 100644 --- a/doc/ikiwiki/searching.mdwn +++ b/doc/ikiwiki/searching.mdwn @@ -5,7 +5,7 @@ then="This wiki has searching **enabled**." else="This wiki has searching **disabled**."]] If searching is enabled, you can enter search terms in the search field, -as you'd expect. There are a few special things you can do to constuct +as you'd expect. There are a few special things you can do to construct more powerful searches. * To match a phrase, enclose it in double quotes. diff --git a/doc/ikiwiki/subpage.mdwn b/doc/ikiwiki/subpage.mdwn index e047b860c..862f45ec1 100644 --- a/doc/ikiwiki/subpage.mdwn +++ b/doc/ikiwiki/subpage.mdwn @@ -5,8 +5,8 @@ this page, [[SubPage]] has some related pages placed under it, like wiki rather than just having a great big directory full of pages. To add a SubPage, just make a subdirectory and put pages in it. For -example, this page is SubPage.mdwn in this wiki's source, and there is also -a SubPage subdirectory, which contains SubPage/LinkingRules.mdwn. Subpages +example, this page is subpage.mdwn in this wiki's source, and there is also +a subpage subdirectory, which contains subpage/linkingrules.mdwn. Subpages can be nested as deeply as you'd like. Linking to and from a SubPage is explained in [[LinkingRules]]. diff --git a/doc/ikiwiki/wikilink.mdwn b/doc/ikiwiki/wikilink.mdwn index f561d5850..cf3b89c76 100644 --- a/doc/ikiwiki/wikilink.mdwn +++ b/doc/ikiwiki/wikilink.mdwn @@ -9,9 +9,6 @@ wikilink, just prefix it with a `\`, like `\\[[WikiLink]]`. There are some special [[SubPage/LinkingRules]] that come into play when linking between [[SubPages|SubPage]]. -Also, if the file linked to by a WikiLink looks like an image, it will -be displayed inline on the page. - WikiLinks are matched with page names in a case-insensitive manner, so you don't need to worry about getting the case the same, and can capitalise links at the start of a sentence, and so on. @@ -23,14 +20,10 @@ page, but the link will appear like this: [[foo_bar|SandBox]]. To link to an anchor inside a page, you can use something like `\[[WikiLink#foo]]` . -## Directives and WikiLinks - -ikiwiki has two syntaxes for -[[directives|directive]]. The older syntax -used spaces to distinguish between directives and -wikilinks; as a result, with that syntax in use, you cannot use spaces -in WikiLinks, and must replace spaces with underscores. The newer -syntax, enabled with the `prefix_directives` option in an ikiwiki -setup file, prefixes directives with `!`, and thus does not prevent -links with spaces. Future versions of ikiwiki will turn this option -on by default. +If the file linked to by a WikiLink looks like an image, it will +be displayed inline on the page. + +--- + +You can also put an url in a WikiLink, to link to an external page. +Email addresses can also be used to generate a mailto link. diff --git a/doc/ikiwiki/wikilink/discussion.mdwn b/doc/ikiwiki/wikilink/discussion.mdwn index b146c9447..89affc502 100644 --- a/doc/ikiwiki/wikilink/discussion.mdwn +++ b/doc/ikiwiki/wikilink/discussion.mdwn @@ -43,6 +43,11 @@ BTW, ikiwiki doesn't displays the #foo anchor in the example >> Fixed that --[[Joey]] +The 'name' attribute of the 'a' element is a depracated way to create a named anchor. The right way to do that is using the 'id' attribute of any element. This is because an anchor may refer to a complete element rather than some point in the page. 
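For illustration only (plain HTML with a made-up anchor name, not something ikiwiki generates for you), the two styles look like this:

    <a name="details"></a>           <!-- deprecated: named anchor -->
    <h2 id="details">Details</h2>    <!-- preferred: id on the element itself -->
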
+ +Standard purity aside, if you define an anchor (using either 'a name' or 'id') to a single point in the document but refer to a complete section, the browser may just show that specific point at the bottom of the page rather than trying to show all the section. +--[[tzafrir]] + --- Considering a hierarchy like `foo/bar/bar`, I had the need to link from the @@ -79,3 +84,8 @@ Is it possible to refer to a page, say \[[foobar]], such that the link text is t > Not yet. :-) Any suggestion for a syntax for it? Maybe something like \[[|foobar]] ? --[[Joey]] I like your suggestion because it's short and conscise. However, it would be nice to be able to refer to more or less arbitrary meta tags in links, not just "title". To do that, the link needs two parameters: the page name and the tag name, i.e. \[[pagename!metatag]]. Any sufficiently weird separater can be used instead of '!', of course. I like \[[pagename->metatag]], too, because it reminds me of accessing a data member of a structure (which is what referencing a meta tag is, really). --Peter + +> I dislike \[[pagename->metatag]] because other wikis use that as their normal link/label syntax. +> I'm not sure that it is a good idea to refer to arbitrary meta tags in links in the first place - what other meta tags would you really be interested in? Description? Author? It makes sense to me to refer to the title, because that is a "label" for a page. +> As for syntax, I do like the \[[|foobar]] idea, or perhaps something like what <a href="http://www.pmwiki.org">PmWiki</a> does - they have their links the other way around, so they go \[[page|label]] and for link-text-as-title, they have \[[page|+]]. So for IkiWiki, that would be \[[+|page]] I guess. +> --[[KathrynAndersen]] diff --git a/doc/ikiwiki_is_so_sexy__33__.mdwn b/doc/ikiwiki_is_so_sexy__33__.mdwn deleted file mode 100644 index f59b664c2..000000000 --- a/doc/ikiwiki_is_so_sexy__33__.mdwn +++ /dev/null @@ -1 +0,0 @@ -Yes it is! diff --git a/doc/ikiwikiusers.mdwn b/doc/ikiwikiusers.mdwn index 72bdbf3d8..2f70c67b7 100644 --- a/doc/ikiwikiusers.mdwn +++ b/doc/ikiwikiusers.mdwn @@ -1,10 +1,21 @@ +Ikiwiki Hosting +=============== + +* [Branchable](http://branchable.com/) + Projects & Organizations ======================== * [This wiki](http://ikiwiki.info) (of course!) +<!-- * [NetBSD wiki](http://wiki.netbsd.org) --> +* The [GNU Hurd](http://www.gnu.org/software/hurd/) +* [DragonFly BSD](http://www.dragonflybsd.org/) +* [Monotone](http://monotone.ca/wiki/FrontPage/) +* The [Free Software Foundation](http://fsf.org) uses it for their internal wiki, with subversion. +* The [cairo graphics library](http://cairographics.org/) website. +* The [Portland State Aerospace Society](http://psas.pdx.edu) website. Converted from a combination of TWiki and MoinMoin to ikiwiki, including full history ([[rcs/Git]] backend). * [Planet Debian upstream](http://updo.debian.net/) * [Debian Mentors wiki](http://jameswestby.net/mentors/) -* The [Sparse wiki](http://kernel.org/pub/linux/kernel/people/josh/sparse). 
* [The BSD Associate Admin Book Project](http://bsdwiki.reedmedia.net/) * The [maildirman wiki](http://svcs.cs.pdx.edu/maildirman) * The [linuxbierwanderung wiki/homepage](http://www.linuxbierwanderung.org) @@ -13,8 +24,6 @@ Projects & Organizations * [Braawi Ltd](http://braawi.com/) and the community site [Braawi.org](http://braawi.org/) * [Webconverger](http://webconverger.org/) (a Web only linux distribution) with a [blog](http://webconverger.org/blog/) * [debian-community.org](http://debian-community.org/) -* The [cairo graphics library](http://cairographics.org/) website. -* The [Portland State Aerospace Society](http://psas.pdx.edu) website. Converted from a combination of TWiki and MoinMoin to ikiwiki, including full history ([[rcs/Git]] backend). * [DebTorrent](http://debtorrent.alioth.debian.org) * The [netconf project](http://netconf.alioth.debian.org) * The [Debian Packaging Handbook project](http://packaging-handbook.alioth.debian.org/wiki/) @@ -27,7 +36,6 @@ Projects & Organizations * [Enemies of Carlotta](http://www.e-o-c.org/) * [vcs-pkg](http://vcs-pkg.org) * [vcs-home](http://vcs-home.madduck.net) -* [GNU Hurd wiki](http://www.bddebian.com/~wiki/) * [Query Object Framework](http://qof.alioth.debian.org/) * [Estron - Object Relational Mapping interpreter](http://estron.alioth.debian.org/) * [Public Domain collection of Debian related tips & tricks](http://dabase.com/tips/) - please add any tips too @@ -37,19 +45,26 @@ Projects & Organizations * [Chaos Computer Club Düsseldorf](https://www.chaosdorf.de) * [monkeysphere](http://web.monkeysphere.info/) * [The Walden Effect](http://www.waldeneffect.org/) -* [Monotone](http://monotone.ca/wiki/FrontPage/) -* The support pages for [Trinity Centre for High Performance Computing](http://www.tchpc.tcd.ie/support/) * [St Hugh of Lincoln Catholic Primary School in Surrey](http://www.sthugh-of-lincoln.surrey.sch.uk/) -* [Pigro Network](http://www.pigro.net) is running a hg based ikiwiki. (And provides ikiwiki hosting for $10/m.) * [Cosin Homepage](http://cosin.ch) uses an Ikiwiki with a subversion repository. * [Bosco Free Orienteering Software](http://bosco.durcheinandertal.ch) -* The [GNU Hurd](http://www.gnu.org/software/hurd/)'s web pages -* The [Free Software Foundation](http://fsf.org) uses it for their internal wiki, with subversion. +* [MIT Student Information Processing Board](http://sipb.mit.edu/) +* [Tinc VPN](http://tinc-vpn.org/) +* [The XCB library](http://xcb.freedesktop.org/) +* [The Philolexian Society of Columbia University](http://www.columbia.edu/cu/philo/) +* [Fachschaft Informatik HU Berlin](http://fachschaft.informatik.hu-berlin.de/) +* [Wetknee Books](http://www.wetknee.com/) +* [IPOL Image Processing On Line](http://www.ipol.im) +* [Debian Costa Rica](http://cr.debian.net/) +* [Fvwm Wiki](http://fvwmwiki.xteddy.org) +* [Serialist](http://serialist.net/)'s static pages (documentation, blog). We actually have ikiwiki generate its static content as HTML fragments using a modified page.tmpl template, and then the FastCGI powering our site grabs those fragments and embeds them in the standard dynamic site template. 
+* [Apua IT](http://apua.se/) +* [PDFpirate Community](http://community.pdfpirate.org/) +* [Banu](https://www.banu.com/) Personal sites and blogs ======================== -* [Skirv's Wiki](http://wiki.killfile.org) - formerly Skirv's Homepage * [[Joey]]'s [homepage](http://kitenet.net/~joey/), including his weblog * [Kyle's MacLea Genealogy wiki](http://kitenet.net/~kyle/family/wiki) and [Livingstone and MacLea Emigration Registry](http://kitenet.net/~kyle/family/registry) * [Ulrik's personal web page](http://kaizer.se/wiki/) @@ -77,7 +92,6 @@ Personal sites and blogs * [Proper Treatment 正當作法](http://conway.rutgers.edu/~ccshan/wiki/) * [lost scraps](http://web.mornfall.net), pages/blog of Petr Ročkai aka mornfall * [Ronan Le Hy's blog](http://bayesien.org), in French. -* <http://iainmclaren.com>. * [formorers blog and website](http://www.formorer.de/webwiki/) * [Mark Jaroski's blog](http://movemearound.org/) * I keep my personal project notes and specs in a private ikiwiki - it's the perfect tool for this task. - [the daniel](http://neoglam.com) @@ -113,7 +127,7 @@ Personal sites and blogs * [[Adam_Trickett|ajt]]'s home intranet/sanbox system ([Internet site & blog](http://www.iredale.net/) -- not ikiwiki yet) * [[Simon_McVittie|smcv]]'s [website](http://www.pseudorandom.co.uk/) and [blog](http://smcv.pseudorandom.co.uk/) -* Svend's [website](http://www.ciffer.net/~svend/) and [blog](http://www.ciffer.net/~svend/blog/) +* Svend's [website](http://ciffer.net/~svend/) and [blog](http://ciffer.net/~svend/blog/) * [muammar's site](http://muammar.me) * [Per Bothner's blog](http://per.bothner.com/blog/) * [Bernd Zeimetz (bzed)](http://bzed.de/) @@ -123,10 +137,34 @@ Personal sites and blogs * [Natalian - Kai Hendry's personal blog](http://natalian.org/) * [Mick Pollard aka \_lunix_ - Personal sysadmin blog and wiki](http://www.lunix.com.au) * [tumashu's page](http://tumashu.github.com) This is my personal site in github created with ikiwiki and only a page,you can get the [source](http://github.com/tumashu/tumashu/tree/master) +* [Skirv's Wiki](http://wiki.killfile.org) - formerly Skirv's Homepage +* [Jimmy Tang - personal blog and wiki](http://www.sgenomics.org/~jtang) +* [Nico Schottelius' homepage](http://www.nico.schottelius.org) +* [Andreas Zwinkaus homepage](http://beza1e1.tuxen.de) +* [Salient Dream](http://salient.dre.am) +* [Walden Effect](http://waldeneffect.org) +* [Avian Aqua Miser](http://www.avianaquamiser.com/) +* [Cosmic Cookout](http://www.cosmiccookout.com/) +* [Backyard Deer](http://www.backyarddeer.com/) +* [Alex Ghitza homepage and blog](http://aghitza.org/) +* [Andreas's homepage](http://0x7.ch/) - Ikiwiki, Subversion and CSS template +* [Chris Dombroski's boring bliki](https://www.icanttype.org/) +* [Josh Triplett's homepage](http://joshtriplett.org/) - Git backend with the CGI disabled, to publish a static site with the convenience of ikiwiki. +* [Gustaf Thorslund's blog](http://blog.thorslund.org) +* [Ertug Karamatli](http://pages.karamatli.com) +* [Jonatan Walck](http://jonatan.walck.i2p/) a weblog + wiki over [I2P](http://i2p2.de/). Also [mirrored](http://jonatan.walck.se/) to the Internet a few times per day. +* [Daniel Wayne Armstrong](http://circuidipity.com/) +* [Mukund](https://www.mukund.org/) +* [Nicolas Schodet](http://ni.fr.eu.org/) +* [weakish](http://weakish.github.com) +* [Thomas Kane](http://planetkane.org/) +* [Marco Silva](http://marcot.eti.br/) a weblog + wiki using the [darcs](http://darcs.net) backend Please feel free to add your own ikiwiki site! 
See also: [Debian ikiwiki popcon graph](http://qa.debian.org/popcon.php?package=ikiwiki) and [google search for ikiwiki powered sites](http://www.google.com/search?q=%22powered%20by%20ikiwiki%22). -While nothing makes me happier than knowing that ikiwiki has happy users, dropping some change in the [[TipJar]] is a nice way to show extra appreciation. +While nothing makes me happier than knowing that ikiwiki has happy users, +dropping some change in the [[TipJar]] is a nice way to show extra +appreciation. diff --git a/doc/ikiwikiusers/discussion.mdwn b/doc/ikiwikiusers/discussion.mdwn index 39a9bb921..2c211b097 100644 --- a/doc/ikiwikiusers/discussion.mdwn +++ b/doc/ikiwikiusers/discussion.mdwn @@ -33,3 +33,7 @@ Hopefully I will be one of the ikiwiki users one day :) cheers --[[Chao]] ---- Are there automated hosting sites for ikiwiki yet? If you know one, can you add one in a new section on [[ikiwikiusers]] please? If you don't know any and you're willing to pay to set one up (shouldn't be much more expensive than a single ikiwiki IMO), [contact me](http://www.ttllp.co.uk/contact.html) and let's talk. -- MJR + +---- + +People who have interests in getting a webhost for ikiwiki may have a look at [this site](http://www.pigro.net). -- weakish diff --git a/doc/index.mdwn b/doc/index.mdwn index 93526c42c..06acc9cec 100644 --- a/doc/index.mdwn +++ b/doc/index.mdwn @@ -20,8 +20,9 @@ ikiwiki [[!version ]]. ## developer resources The [[RoadMap]] describes where the project is going. +The [[forum]] is open for discussions. [[Bugs]], [[TODO]] items, [[wishlist]] items, and [[patches|patch]] can be submitted and tracked using this wiki. -ikiwiki is developed by [[Joey]] and many contributors, +Ikiwiki is developed by [[Joey]] and many contributors, and is [[FreeSoftware]]. diff --git a/doc/index/discussion.mdwn b/doc/index/discussion.mdwn index d52cb0a2d..749042910 100644 --- a/doc/index/discussion.mdwn +++ b/doc/index/discussion.mdwn @@ -1,464 +1 @@ -Seems like there should be a page for you to post your thoughts about -ikiwiki, both pro and con, anything that didn't work, ideas, or whatever. -Do so here.. - -Note that for more formal bug reports or todo items, you can also edit the -[[bugs]] and [[todo]] pages. - -[[!toc ]] - -# Installation/Setup questions - -Ikiwiki creates a .ikiwiki directory in my wikiwc working directory. Should I -"svn add .ikiwiki" or add it to svn:ignore? - -> `.ikiwiki` is used by ikiwiki to store internal state. You can add it to -> svn:ignore. --[[Joey]] -> > Thanks a lot. - -Is there an easy way to log via e-mail to some webmaster address, instead -of via syslog? - -> Not sure why you'd want to do that, but couldn't you use a tool like -> logwatch to mail selected lines from the syslog? --[[Joey]] - -> > The reason is that I'm not logged in on the web server regularly to -> > check the log files. I'll see whether I can install a logwatch instance. - -I'm trying to install from scratch on a CentOS 4.6 system. I installed perl 5.8.8 from source and then added all the required modules via CPAN. When I build ikiwiki from the tarball, I get this message: - - rendering todo/calendar_--_archive_browsing_via_a_calendar_frontend.mdwn - *** glibc detected *** double free or corruption (!prev): 0x0922e478 *** - make: *** [extra_build] Aborted - -I'm kind of at a loss how to track this down or work around it. Any suggestions? --Monty - -> All I can tell you is that it looks like a problem with your C library or -> perl. 
Little perl programs like ikiwiki should only be able to trigger -> such bugs, not contain them. :-) Sorry I can't be of more help. -> --[[Joey]] - -> I had a similar problem after upgrading to the latest version of -> Text::Markdown from CPAN. You might try either looking for a Markdown -> package for CentOS or using the latest version of John Gruber's -> Markdown.pl: -> <http://daringfireball.net/projects/downloads/Markdown_1.0.2b8.tbz> -> --[[JasonBlevins]], April 1, 2008 18:22 EDT - ->> Unfortunately I couldn't find a CentOS package for markdown, and I ->> couldn't quite figure out how to use John Gruber's version instead. ->> I tried copying it to site_perl, etc., but the build doesn't pick ->> it up. For now I can just play with it on my Ubuntu laptop for which ->> the debian package installed flawlessly. I'll probably wait for an ->> updated version of Markdown to see if this is fixed in the future. ->> --Monty - ->I suggest that you pull an older version of Text::Markdown from CPAN. I am using <http://backpan.perl.org/authors/id/B/BO/BOBTFISH/Text-Markdown-1.0.5.tar.gz> and that works just fine. ->There is a step change in version and size between this version (dated 11Jan2008) and the next version (1.0.12 dated 18Feb2008). I shall have a little look to see why, in due course. ->Ubuntu Hardy Heron has a debian package now, but that does not work either. -> --Dirk 22Apr2008 - -> This might be related to [Text::Markdown bug #37297](http://rt.cpan.org/Public/Bug/Display.html?id=37297).--ChapmanFlack 9Jul2008 - ----- - -# Installation of selected docs (html) - -The latest release has around 560 files (over 2MB) in html. - -Any suggestions or ideas on limiting what html is installed? - -For example, I don't see value in every ikiwiki install out there to also install personal "users" ikiwiki pages. - -For now I copy ikiwiki.setup. And then use pax with -L switch to copy the targets of the symlinks of the basewiki. - -I was thinking of making a list of desired documents from the html directory to install. - ---JeremyReed - -> You don't need any of them, unless you want to read ikiwiki's docs locally. -> -> I don't understand why you're installing the basewiki files manually; -> ikiwiki has a Makefile that will do this for you. --[[Joey]] - ->> The Makefile's install doesn't do what I want so I use different installer for it. ->> It assumes wrong location for man pages for me. (And it should consider using INSTALLVENDORMAN1DIR and ->> MAN1EXT but I don't know about section 8 since I don't know of perl value for that.) ->> I don't want w3m cgi installed; it is optional for my package. ->> I will just patch for that instead of using my own installer. ->> Note: I am working on the pkgsrc package build specification for this. This is for creating ->> packages for NetBSD, DragonFly and other systems that use pkgsrc package system. ->> --JeremyReed - -# Installation as non-root user - -I'd like to install ikiwiki as a non-root user. I can plow through getting all the -perl dependencies installed because that's well documented in the perl world, -but I don't know how to tell ikiwiki to install somewhere other than / --BrianWilson - -> Checkout the tips section for [[tips/DreamHost]]. It should do the trick. --MattReynolds - ----- - -# Upgrade steps - -I upgrades from 1.40 to 2.6.1. I ran "ikiwiki --setup" using my existing ikiwiki.setup configuration. 
-I had many errors like: - - /home/bsdwiki/www/wiki/wikilink/index.html independently created, not overwriting with version from wikilink - BEGIN failed--compilation aborted at (eval 5) line 129. - -and: - - failed renaming /home/bsdwiki/www/wiki/smileys.ikiwiki-new to /home/bsdwiki/www/wiki/smileys: Is a directory - BEGIN failed--compilation aborted at (eval 5) line 129. - -Probably about six errors like this. I worked around this by removing the files and directories it complained about. -Finally it finished. - -> As of version 2.0, ikiwiki enables usedirs by default. See -> [[tips/switching_to_usedirs]] for details. --[[Joey]] - ->> I read the config wrong. I was thinking that it showed the defaults even though commented out ->> (like ssh configs do). I fixed that part. --JeremyReed - -My next problem was that ikiwiki start letting me edit without any password authentication. It used to prompt -me for a password but now just goes right into the "editing" mode. -The release notes for 2.0 say password auth is still on by default. - -> It sounds like you have the anonok plugin enabled? - ->> Where is the default documented? My config doesn't have it uncommented. - -The third problem is that when editing my textbox is empty -- no content. - -This is using my custom rcs.pm which has been used thousands of times. - -> Have you rebuilt the cgi wrapper since you upgraded ikiwiki? AFAIK I -> fixed a bug that could result in the edit box always being empty back in -> version 2.3. The only other way it could happen is if ikiwiki does not -> have saved state about the page that it's editing (in .ikiwiki/index). - ->> Rebuilt it several times. Now that I think of it, I think my early problem of having ->> no content in the textbox was before I rebuilt the cgi. And after I rebuilt the whole webpage was empty. - -Now I regenerated my ikiwiki.cgi again (no change to my configuration, -and I just get an empty HTML page when attempting editing or "create". - -> If the page is completly empty then ikiwiki is crashing before it can -> output anything, though this seems unlikely. Check the webserver logs. - -Now I see it created directories for my data. I fixed that by setting -usedirs (I see that is in the release notes for 2.0) and rerunning ikiwiki --setup -but I still have empty pages for editing (no textbox no html at all). - -> Is IkiWiki crashing? If so, it would probably leave error text in the apache logs. --[[TaylorKillian]] - ->> Not using apache. Nothing useful in logs other thn the HTTP return codes are "0" and bytes is "-" ->> on the empty ikiwiki.cgi output (should say " 200 " followed by bytes). - ->>> You need to either figure out what your web server does with stderr ->>> from cgi programs, or run ikiwiki.cgi at the command line with an ->>> appropriate environment so it thinks it's being called from a web ->>> server, so you can see how it's failing. --[[Joey]] - -(I am posting this now, but will do some research and post some more.) - -Is there any webpage with upgrade steps? - -> Users are expected to read [[news]], which points out any incompatible -> changes or cases where manual action is needed. - ->> I read it but read the usedirs option wrong :(. ->> Also it appears to be missing the news from between 1.40 to 2.0 unless they dont' exist. ->> If they do exist maybe they have release notes I need? - ->>> All the old ones are in the NEWS file. --[[Joey]] - ---JeremyReed - -My followup: I used a new ikiwiki.setup based on the latest version. But no changes for me. 
- -Also I forgot to mention that do=recentchanges works good for me. It uses my -rcs_recentchanges in my rcs perl module. - -The do=prefs does nothing though -- just a blank webpage. - -> You need to figure out why ikiwiki is crashing. The webserver logs should -> tell you. - -I also set verbose => 1 and running ikiwiki --setup was verbose, but no changes in running CGI. -I was hoping for some output. - -I am guessing that my rcs perl module stopped working on the upgrade. I didn't notice any release notes -on changes to revision control modules. Has something changed? I will also look. - -> No, the rcs interface has not needed to change in a long time. Also, -> nothing is done with the rcs for do=prefs. - ->> Thanks. I also checked differences between 1.40 Rcs plugins and didn't notice anything significant. - ---JeremyReed - -Another Followup: I created a new ikiwiki configuration and did the --setup to -create an entirely different website. I have same problem there. No prompt for password -and empty webpage when using the cgi. -I never upgraded any perl modules so maybe a new perl module is required but I don't see any errors so I don't know. - -The only errors I see when building and installing ikiwiki are: - - Can't exec "otl2html": No such file or directory at IkiWiki/Plugin/otl.pm line 66. - - gettext 0.14 too old, not updating the pot file - -I don't use GNU gettext on here. - -I may need to revert back to my old ikiwiki install which has been used to thousands of times (with around -1000 rcs commits via ikiwiki). - ---JeremyReed - -I downgraded to version 1.40 (that was what I had before I wrote wrong above). -Now ikiwiki is working for me again (but using 1.40). I shouldn't have tested on production system :) - ---JeremyReed - -I am back. On a different system, I installed ikiwiki 2.6.1. Same problem -- blank CGI webpage. - -So I manually ran with: - - REQUEST_METHOD=GET QUERY_STRING='do=create&page=jcr' kiwiki.cgi - -And clearly saw the error: - - [IkiWiki::main] Fatal: Bad template engine CGI::FormBuilder::Template::div: Can't locate CGI/FormBuilder/Template/div.pm - -So I found my version was too old and 3.05 is the first to provide "Div" support. I upgraded my p5-CGI-FormBuilder to 3.0501. -And ikiwiki CGI started working for me. - -The Ikiwiki docs about this requirement got removed in Revision 4367. There should be a page that lists the requirements. -(I guess I could have used the debian/control file.) - -> There is a page, [[install]] documents that 3.05 is needed. - ->> Sorry, I missed that. With hundreds of wikipages it is hard to read all of them. ->> I am updating the download page now to link to it. - -I am now using ikiwiki 2.6.1 on my testing system. - ---JeremyReed - ----- -# Excellent - how do I translate a TWiki site? - -I just discovered ikiwiki quite by chance, I was looking for a console/terminal -menu system and found pdmenu. So pdmenu brought me to here and I've found ikiwiki! -It looks as if it's just what I've been wanting for a long time. I wanted something -to create mostly text web pages which, as far as possible, have source which is human -readable or at least in a standard format. ikiwiki does this twice over by using -markdown for the source and producing static HTML from it. - -I'm currently using TWiki and have a fair number of pages in that format, does -anyone have any bright ideas for translating? I can knock up awk scripts fairly -easily, perl is possible (but I'm not strong in perl). 
- -> Let us know if you come up with something to transition from the other -> format. Another option would be writing a ikiwiki plugin to support the -> TWiki format. --[[Joey]] - -> Jamey Sharp and I have a set of scripts in progress to convert other wikis to ikiwiki, including history, so that we can migrate a few of our wikis. We already have support for migrating MoinMoin wikis to ikiwiki, including conversion of the entire history to Git. We used this to convert the [XCB wiki](http://xcb.freedesktop.org/wiki/) to ikiwiki; until we finalize the conversion and put the new wiki in place of the old one, you can browse the converted result at <http://xcb.freedesktop.org/ikiwiki>. We already plan to add support for TWiki (including history, since you can just run parsecvs on the TWiki RCS files to get Git), so that we can convert the [Portland State Aerospace Society wiki](http://psas.pdx.edu) (currently in Moin, but with much of its history in TWiki, and with many of its pages still in TWiki format using Jamey's TWiki format for MoinMoin). -> -> Our scripts convert by way of HTML, using portions of the source wiki's code to render as HTML (with some additional code to do things like translate MoinMoin's `\[[TableOfContents]]` to ikiwiki's `\[[!toc ]]`), and then using a modified [[!cpan HTML::WikiConverter]] to turn this into markdown and ikiwiki. This produces quite satisfactory results, apart from things that don't have any markdown equivalent and thus remain HTML, such as tables and definition lists. Conversion of the history occurs by first using another script we wrote to translate MoinMoin history to Git, then using our git-map script to map a transformation over the Git history. -> -> We will post the scripts as soon as we have them complete enough to convert our wikis. -> -> -- [[JoshTriplett]] - ->> Thanks for an excellent Xmas present, I will appreciate the additional ->> users this will help switch to ikiwiki! --[[Joey]] - - ->> Sounds great indeed. Learning from [here](http://www.bddebian.com/~wiki/AboutTheTWikiToIkiwikiConversion/) that HTML::WikiConverter needed for your conversion was not up-to-date on Debian I have now done an unofficial package, including your proposed Markdown patches, apt-get'able at <pre>deb http://debian.jones.dk/ sid wikitools</pre> ->> -- [[JonasSmedegaard]] - - ->>I see the "We will post the scripts ...." was committed about a year ago. A current site search for "Moin" does not turn them up. Any chance of an appearance in the near (end of year) future? ->> ->> -- [[MichaelRasmussen]] - ->>> It appears the scripts were never posted? I recently imported my Mediawiki site into Iki. If it helps, my notes are here: <http://iki.u32.net/Mediawiki_Conversion> --[[sabr]] - ->>>>> The scripts have been posted now, see [[joshtriplett]]'s user page, ->>>>> and I've pulled together all ways I can find to [[convert]] other ->>>>> systems into ikiwiki. --[[Joey]] - ----- - -# LaTeX support? - -Moved to [[todo/latex]] --[[Joey]] - ----- - -# Using with CVS? - -Moved to a [[todo_item|todo/CVS_backend]]. --[[JoshTriplett]] - ----- - -# Show differences before saving page? - -Moved to the existing [[todo_item|todo/preview_changes]]. --[[JoshTriplett]] - ----- - -# Max submit size? - -Any setting for limiting how many kilobytes can be submitted via the "edit" form? --- [[JeremyReed]] - ->>> See [[todo/fileupload]] for an idea on limiting page size. --[[Joey]] - ----- - -# Editing the style sheet. 
- -It would be nice to be able to edit the stylesheet by means of the cgi. Or is this possible? I wasn't able to achieve it. -Ok, that's my last 2 cents for a while. --[Mazirian](http://mazirian.com) - -> I don't support editing it, but if/when ikiwiki gets [[todo/fileupload]] support, -> it'll be possible to upload a style sheet. (If .css is in the allowed -> extensions list.. no idea how safe that would be, a style sheet is -> probably a great place to put XSS attacks and evil javascript that would -> be filtered out of any regular page in ikiwiki). --[[Joey]] - ->> I hadn't thought of that at all. It's a common feature and one I've ->> relied on safely, because the wikis I am maintaining at the moment ->> are all private and restricted to trusted users. Given that the whole ->> point of ikiwiki is to be able to access and edit via the shell as ->> well as the web, I suppose the features doesn't add a lot. By the ->> way, the w3m mode is brilliant. I haven't tried it yet, but the idea ->> is great. - ----- - -# Should not create an existing page - -This might be a bug, but will discuss it here first. -Clicking on an old "?" or going to a create link but new Markdown content exists, should not go into "create" mode, but should do a regular "edit". - -> I belive that currently it does a redirect to the new static web page. -> At least that's the intent of the code. --[[Joey]] - ->> Try at your site: `?page=discussion&from=index&do=create` ->> It brings up an empty textarea to start a new webpage -- even though it already exists here. --reed - ->>> Ah, right. Notice that the resulting form allows saving the page as ->>> discussion, or users/discussion, but not index/discussion, since this ->>> page already exists. If all the pages existed, it would do the redirect ->>> thing. --[[Joey]] - ----- - -# Spaces in WikiLinks? - -Hello Joey, - -I've just switched from ikiwiki 2.0 to ikiwiki 2.2 and I'm really surprised -that I can't use the spaces in WikiLinks. Could you please tell me why the spaces -aren't allowed in WikiLinks now? - -My best regards, - ---[[PaweB|ptecza]] - -> See [[bugs/Spaces_in_link_text_for_ikiwiki_links]] - ----- - -# Build in OpenSolaris? - -Moved to [[bugs/build_in_opensolaris]] --[[Joey]] - ----- - -# Various ways to use Subversion with ikiwiki - -I'm playing around with various ways that I can use subversion with ikiwiki. - -* Is it possible to have ikiwiki point to a subversion repository which is on a different server? The basic checkin/checkout functionality seems to work but there doesn't seem to be any way to make the post-commit hook work for a non-local server? - -> This is difficult to do since ikiwiki's post-commit wrapper expects to -> run on a machine that contains both the svn repository and the .ikiwiki -> state directory. However, with recent versions of ikiwiki, you can get -> away without running the post-commit wrapper on commit, and all you lose -> is the ability to send commit notification emails. - -> (And now that [[recentchanges]] includes rss, you can just subscribe to -> that, no need to worry about commit notification emails anymore.) - -* Is it possible / sensible to have ikiwiki share a subversion repository with other data (either completely unrelated files or another ikiwiki instance)? This works in part but again the post-commit hook seems problematic. - ---[[AdamShand]] - -> Sure, see ikiwiki's subversion repository for example of non-wiki files -> in the same repo. 
If you have two wikis in one repository, you will need -> to write a post-commit script that calls the post-commit wrappers for each -> wiki. - ----- - -# Regex for Valid Characters in Filenames - -I'm sure that this is documented somewhere but I've ransacked the wiki and I can't find it. :-( What are the allowed characters in an ikiwiki page name? I'm writing a simple script to make updating my blog easier and need to filter invalid characters (so far I've found that # and , aren't allowed ;-)). Thanks for any pointers. -- [[AdamShand]] - -> The default `wiki_file_regexp` matches filenames containing only -> `[-[:alnum:]_.:/+]` -> -> The titlepage() function will convert freeform text to a valid -> page name. See [[todo/should_use_a_standard_encoding_for_utf_chars_in_filenames]] -> for an example. --[[Joey]] - ->> Perfect, thanks! ->> ->> In the end I decided that I didn't need any special characters in filenames and replaced everything but alphanumeric characters with underscores. In addition to replacing bad characters I also collapse multiple underscores into a single one, and strip off trailing and leading underscores to make tidy filenames. If it's useful to anybody else here's a sed example: ->> ->> # echo "++ Bad: ~@#$%^&*()_=}{[];,? Iki: +_-:./ Num: 65.5 ++" | sed -e 's/[^A-Za-z0-9_]/_/g' -e 's/__*/_/g' -e 's/^_//g' -e 's/_$//g' ->> Bad_Iki_Num_65_5 ->> ->>--[[AdamShand]] - -# Upgrade steps from RecentChanges CGI to static page? - -Where are the upgrade steps for RecentChanges change from CGI to static feed? -I run multiple ikiwiki-powered sites on multiple servers, but today I just upgraded one to 2.32.3. -Please have a look at -<http://bsdwiki.reedmedia.net/wiki/recentchanges.html> -Any suggestions? - -> There are no upgrade steps required. It does look like you need to enable -> the meta plugin to get a good recentchanges page though.. --[[Joey]] - -# News site where articles are submitted and then reviewed before posting? - -I am considering moving a news site to Ikiwiki. I am hoping that Ikiwiki has a feature where anonymous posters can submit a form that moderators can review and then accept for it to be posted on a news webpage (like front page of the website). - -> Well, you can have one blog that contains unreviewed articles, and -> moderators can then add a tag that makes the article show up in the main -> news feed. There's nothing stopping someone submitting an article -> pre-tagged though. If you absolutely need to lock that down, you could -> have one blog with unreviewed articles in one subdirectory, and reviewers -> then move the file over to another subdirectory when they're ready to -> publish it. (This second subdirectory would be locked to prevent others -> from writing to it.) --[[Joey]] - -Also it would be good if the news page would keep maybe just the latest 10 entries with links to an archive that make it easy to browse to old entries by date. (Could have over a thousand news articles.) - -> The inline plugin allows setting up things like this. - -Plus users be able to post feedback to news items. If anonymous, they must be approved first. I'd prefer to not use normal "wiki" editor for feedback. - -Any thoughts or examples on this? Any links to examples of news sites or blogs with outside feedback using ikiwiki? - -Thanks --[[JeremyReed]] - +All discussion that used to be here has moved to the [[forum]]. 
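A note on the two-wikis-in-one-repository case discussed above: the repository's real post-commit hook can simply be a small script that chains each wiki's compiled post-commit wrapper. A minimal sketch, assuming wrappers have already been generated at the paths shown (adjust to whatever each wiki's setup file specifies):

    #!/bin/sh
    # Repository post-commit hook: call each wiki's ikiwiki-generated wrapper
    # in turn so that both wikis refresh from the shared repository.
    /usr/local/bin/wiki1-post-commit "$@"
    /usr/local/bin/wiki2-post-commit "$@"

Each wrapper is still built from its own setup file as usual; the hook only sequences them.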
diff --git a/doc/news/code_swarm/discussion.mdwn b/doc/news/code_swarm/discussion.mdwn new file mode 100644 index 000000000..3ecc81b86 --- /dev/null +++ b/doc/news/code_swarm/discussion.mdwn @@ -0,0 +1,3 @@ +Looks like ImageMagick isn't install on the new server! :-) -- AdamShand + +> Thanks for pointing out problem, fixed now. --[[Joey]] diff --git a/doc/news/discussion.mdwn b/doc/news/discussion.mdwn index 351e39c62..d6a548f8b 100644 --- a/doc/news/discussion.mdwn +++ b/doc/news/discussion.mdwn @@ -1,3 +1,9 @@ +## 3.20091017 news item removed? +Hi! Why have you [removed](http://git.ikiwiki.info/?p=ikiwiki;a=blobdiff;f=doc/news/version_3.20091017.mdwn;h=0000000000000000000000000000000000000000;hp=aba830a82f881bd97d11fe644eb2c78b99c2258d;hb=9fdd9af2db2bd21e543fa0f5f4bfa85b56b8dd5c;hpb=b74dceb884a60f6f7be395378a009ee414726d0b) the item for +3.20091017? Perhaps, it's an error, isn't it? The corresponding code AFAIU is still there. --Ivan Z. + +> I always remove old news items when making a new release. The info is still there in the changelog if needed. --[[Joey]] + ## Ikiwiki 3.12 Joey, what about news for Ikiwiki 3.12? The changelog says is has been released diff --git a/doc/news/ikiwiki-hosting.mdwn b/doc/news/ikiwiki-hosting.mdwn new file mode 100644 index 000000000..092530a14 --- /dev/null +++ b/doc/news/ikiwiki-hosting.mdwn @@ -0,0 +1,16 @@ +ikiwiki-hosting is an interface on top of Ikiwiki to allow easy management +of lots of ikiwiki sites. I developed it for +[Branchable](http://www.branchable.com/), an Ikiwiki hosting provider. +It has a powerful, scriptable command-line interface, and also +includes special-purpose ikiwiki plugins for things like a user control +panel. + +To get a feel for it, here are some examples: + + ikisite create foo.ikiwiki.net --admin http://joey.kitenet.net/ + ikisite branch foo.ikiwiki.net bar.ikiwiki.net + ikisite backup bar.ikiwiki.net --stdout | ssh otherhost 'ikisite restore bar.ikiwiki.net --stdin' + +ikiwiki-hosting is free software, released under the AGPL. Its website: +<http://ikiwiki-hosting.branchable.com/> +--[[Joey]] diff --git a/doc/news/openid.mdwn b/doc/news/openid.mdwn index cc13a6548..4f1ee7bf7 100644 --- a/doc/news/openid.mdwn +++ b/doc/news/openid.mdwn @@ -10,4 +10,4 @@ log back in, try out the OpenID signup process if you don't already have an OpenID, and see how OpenID works for you. And let me know your feelings about making such a switch. --[[Joey]] -[[!poll 62 "Accept only OpenID for logins" 19 "Accept only password logins" 36 "Accept both"]] +[[!poll 64 "Accept only OpenID for logins" 21 "Accept only password logins" 36 "Accept both"]] diff --git a/doc/news/openid/discussion.mdwn b/doc/news/openid/discussion.mdwn index aa9f3f0be..e611fa77b 100644 --- a/doc/news/openid/discussion.mdwn +++ b/doc/news/openid/discussion.mdwn @@ -15,6 +15,8 @@ as well. Also have I just created an account on this wiki as well? > can configure it to eg, subscribe your email address to changes to pages. > --[[Joey]] +OK, my openid login works too. One question though, is there a setup parameter which controls whether new registrations are permitted at all? For instance, I'm thinking that I'd like to use the wiki format for content, but I don't want it editable by anyone who isn't already set up. Does this work? --[[Tim Lavoie]] + ---- # How to ban an IP address? 
diff --git a/doc/news/server_move_2009.mdwn b/doc/news/server_move_2009.mdwn new file mode 100644 index 000000000..8be5debe1 --- /dev/null +++ b/doc/news/server_move_2009.mdwn @@ -0,0 +1,6 @@ +[[!meta title="server move"]] + +The ikiwiki.info domain has been moved to a new server. If you can see +this, your DNS has already caught up and you are using the new server. +By the way, the new server should be somewhat faster. +--[[Joey]] diff --git a/doc/news/version_3.14.mdwn b/doc/news/version_3.14.mdwn deleted file mode 100644 index 83c2b9188..000000000 --- a/doc/news/version_3.14.mdwn +++ /dev/null @@ -1,13 +0,0 @@ -ikiwiki 3.14 released with [[!toggle text="these changes"]] -[[!toggleable text=""" - * highlight: New plugin supporting syntax highlighting of pretty much - anything. - * debian/control: Add suggests for libhighlight-perl, although - that package is not yet created by Debian's highlight source package. - (See #529869) - * format: Provide a htmlizefallback hook that other plugins - can use to handle formats that are not suitable for general-purpose - htmlize hooks. Used by highlight. - * Fix test suite to not rely on an installed copy of ikiwiki after - underlaydir change. Closes: #[530502](http://bugs.debian.org/530502) - * Danish translation update. Closes: #[530877](http://bugs.debian.org/530877)"""]]
\ No newline at end of file diff --git a/doc/news/version_3.141.mdwn b/doc/news/version_3.141.mdwn deleted file mode 100644 index ac76ce023..000000000 --- a/doc/news/version_3.141.mdwn +++ /dev/null @@ -1,28 +0,0 @@ -ikiwiki 3.141 released with [[!toggle text="these changes"]] -[[!toggleable text=""" - * comment: Make comment directives no longer use the internal "\_comment" - form, and document the comment directive syntax. - * Avoid relying on translators preserving the case when translating - "discussion", which caused Discussion pages to get unwanted Discussion - links. - * Tighten up matching of bare words inside directives; do not - allow an unterminated triple string to be treated as a series - of bare words. Fixes runaway regexp recursion/backtracking - in strange situations. - * Setup automator: Check that each plugin added to the generated - setup file can be loaded and that its config is ok. If a plugin - fails for any reason, disable it in the generated file. - Closes: [532001](http://bugs.debian.org/532001) - * pagecount: Fix broken optimisation for * pagespec. - * goto: Support being passed a page title that is not a valid page - name, to support several cases including mercurial's long user - names on the RecentChanges page, and urls with spaces being handled - by the 404 plugin. - * Optimise use of gettext, and avoid ugly warnings if Locale::gettext - is not available. Closes: #[532285](http://bugs.debian.org/532285) - * meta: Add openid delegate parameter to allow delegating only - openid or openid2. - * Disable the Preferences link if no plugin with an auth hook is enabled. - * Updated French translation. Closes: #[532654](http://bugs.debian.org/532654) - * aggregate: Fix storing of changed md5. - * aggregate: Avoid resetting ctime when an item md5 changes."""]] diff --git a/doc/news/version_3.141/discussion.mdwn b/doc/news/version_3.141/discussion.mdwn deleted file mode 100644 index 1f5f39282..000000000 --- a/doc/news/version_3.141/discussion.mdwn +++ /dev/null @@ -1,16 +0,0 @@ -Version 3.141!? Is it not a mistake? Maybe you meant 3.14.1 or 3.15? ---[[Paweł|users/ptecza]] - -> I suspect the next version will be 3.1415 ;) -- [[Jon]] - ->> And next 3.14159, 3.141592, etc. :) I think that version schema ->> should be patented by Joey ;) --[[Paweł|users/ptecza]] - ->>> That's not exactly new; quoting from <http://www-cs-faculty.stanford.edu/~knuth/abcde.html>: ->>> ->>>> The latest and best TeX is currently version 3.1415926 (and plain.tex is version 3.141592653); METAFONT is currently version 2.718281 (and plain.mf is version 2.71). My last will and testament for TeX and METAFONT is that their version numbers ultimately become $\pi$ and $e$, respectively. At that point they will be completely error-free by definition. ->>> ->>> --[[tschwinge]] - ->>>> Thanks for the info, Thomas! I didn't know about it. Sorry Joey, ->>>> but Don Knuth was faster. What a pity... ;) --[[Paweł|users/ptecza]] diff --git a/doc/news/version_3.1415.mdwn b/doc/news/version_3.1415.mdwn deleted file mode 100644 index 93310bc64..000000000 --- a/doc/news/version_3.1415.mdwn +++ /dev/null @@ -1,7 +0,0 @@ -ikiwiki 3.1415 released with [[!toggle text="these changes"]] -[[!toggleable text=""" - * img: Fix extra double quote with alt text. (smcv) - * Updated French debconf templates translation. Closes: #[535103](http://bugs.debian.org/535103) - * openid: Support Net::OpenID 2.x when pretty-printing - openids. (smcv) - * highlight: Fix utf-8 encoding bug. Closes: #[535028](http://bugs.debian.org/535028)"""]]
\ No newline at end of file diff --git a/doc/news/version_3.14159.mdwn b/doc/news/version_3.14159.mdwn deleted file mode 100644 index 21f91fdb4..000000000 --- a/doc/news/version_3.14159.mdwn +++ /dev/null @@ -1,5 +0,0 @@ -ikiwiki 3.14159 released with [[!toggle text="these changes"]] -[[!toggleable text=""" - * svn: Fix rcs\_rename to properly scope call to dirname. - * img: Pass the align parameter through to the generated img tag. - * Move OpenID pretty-printing from openid plugin to core (smcv)"""]]
\ No newline at end of file diff --git a/doc/news/version_3.20100610.mdwn b/doc/news/version_3.20100610.mdwn new file mode 100644 index 000000000..ee7c86d72 --- /dev/null +++ b/doc/news/version_3.20100610.mdwn @@ -0,0 +1,18 @@ +ikiwiki 3.20100610 released with [[!toggle text="these changes"]] +[[!toggleable text=""" + * creation\_day() etc use local time, not gmtime. To match calendars, which + use local time. + * img: Fill in missing height or width when scaling image. + * Remove example blog tag pages; allow autotag creation to create them + when used. + * Fix support for globbing in tagged() pagespecs. + * Fix display of sidebar when previewing page edit. (Thanks, privat) + * relativedate: Fix problem with localised dates not working. + * editpage: Avoid storing accidental state changes when previewing pages. + * page.tmpl: Add a div around the page content, and comments, to aide in + sidebar styling. + * style.css: Improvements to make floating sidebar fit much better on + pages with inlines. + * calendar: Shorten day names, and improve styling of month calendar. + * style.css: Reduced sidebar width back to 20ex from 30; the month calendar + will now fit in the smaller width, and 30 was feeling too large."""]]
\ No newline at end of file diff --git a/doc/news/version_3.20100623.mdwn b/doc/news/version_3.20100623.mdwn new file mode 100644 index 000000000..684217e26 --- /dev/null +++ b/doc/news/version_3.20100623.mdwn @@ -0,0 +1,29 @@ +ikiwiki 3.20100623 released with [[!toggle text="these changes"]] +[[!toggleable text=""" + * openid: Add openid\_realm and openid\_cgiurl configuration options, + useful in a few edge case setups. + * attachment: Show files from underlay in attachments list. + * img: Support hspace and vspace attributes. + * editpage: Rename "comments" field to avoid CSS conflict with the + comments div. + * edittemplate: Make silent mode not disable display when the template + page does not exist, so it can be easily created. + * edittemplate: Look for template pages under templates/ like everything + else (still looks in old location for backwards compatibility). + * attachment: When inserting links, insert img directives for images, + if that plugin is enabled. + * websetup: Allow enabling plugins listed in disable\_plugins. + * editpage, comments: Fix broken links in sidebar (due to forcebaseurl). + (Thanks, privat) + * calendar: Tune archive\_pagespec to only match pages, not other files. + * Fix issues with combining unicode srcdirs and source files. + (Workaround bug #586045) + * Make --gettime be honored after initial setup. + * git: Fix --gettime to properly support utf8 filenames. + * attachment: Support Windows paths when taking basename of client-supplied + file name. + * theme: New plugin, allows easily themeing a site via the underlay. + * Added actiontabs theme by Svend Sorensen. + * Added blueview theme by Bernd Zeimetz. + * mercurial: Fix buggy getctime code. Closes: #[586279](http://bugs.debian.org/586279) + * link: Enhanced to handle URLs and email addresses. (Bernd Zeimetz)"""]]
\ No newline at end of file diff --git a/doc/news/version_3.20100704.mdwn b/doc/news/version_3.20100704.mdwn new file mode 100644 index 000000000..9d2792ce6 --- /dev/null +++ b/doc/news/version_3.20100704.mdwn @@ -0,0 +1,26 @@ +ikiwiki 3.20100704 released with [[!toggle text="these changes"]] +[[!toggleable text=""" + * Changes to avoid display of ugly google openids, by displaying + a username taken from openid. + * API: Add new optional field nickname to rcs\_recentchanges. + * API: rcs\_commit and rcs\_commit\_staged are now passed named + parameters. + * openid: Store nickname based on username or email provided from + openid provider. + * git: Record the nickname from openid in the git author email. + * comment: Record the username from openid in the comment page. + * Fixed some confusion and bugginess about whether + rcs\_getctime/rcs\_getmtime were passed absolute or relative filenames. + (Make it relative like everything else.) + * hnb: Fixed broken use of mkstemp that had caused dangling temp files, + and prevented actually rendering hnb files. + * Use comment template on comments page of example blog. + * comment.tmpl: Fix up display when inline uses it to display a non-comment + page. (Such as a discussion page.) + * git: Added git\_wrapper\_background\_command option. Can be used to eg, + make the git wrapper push to github in the background after ikiwiki + runs. + * po: Added needstranslation() pagespec. (intrigeri) + * po: Added support for .html source pages. (intrigeri) + * comment: Fix problem moderating comments of certian pages with utf-8 + in their name."""]]
\ No newline at end of file diff --git a/doc/news/version_3.20100722.mdwn b/doc/news/version_3.20100722.mdwn new file mode 100644 index 000000000..27ed1fd4d --- /dev/null +++ b/doc/news/version_3.20100722.mdwn @@ -0,0 +1,25 @@ +ikiwiki 3.20100722 released with [[!toggle text="these changes"]] +[[!toggleable text=""" + * img: Add a margin around images displayed by this directive. + * comments: Added commentmoderation directive for easy linking to the + comment moderation queue. + * aggregate: Write timestamp next aggregation can happen to + .ikiwiki/aggregatetime, to allow for more sophisticated cron jobs. + * Add --changesetup mode that allows easily changing options in a + setup file. + * openid: Fix handling of utf-8 nicknames. + * Clarified what the filter hook should be passed: Only be the raw, + complete text of a page. Not a snippet, or data read in from an + unrelated file. + * template: Do not pass filled in template through filter hook. + Avoids causing breakage in po plugin. + * color, comments, conditional, cutpaste, more, sidebar, toggle: Also + avoid unnecessary calls to filter hook. + * po: needstranslation() pagespec can have a percent specified. + * Drop Cache-Control must-revalidate (Firefox 3.5.10 does not seem to have + the caching problem that was added to work around). Closes: #[588623](http://bugs.debian.org/588623) + * Made much more robust in cases where multiple source files produce + conflicting files/directories in the destdir. + * Updated French translation from Philippe Batailler. Closes: #[589423](http://bugs.debian.org/589423) + * po: Fix selflink display on tranlsated pages. (intrigeri) + * Avoid showing 'Add a comment' link at the bottom of the comment post form."""]]
\ No newline at end of file diff --git a/doc/news/version_3.20100804.mdwn b/doc/news/version_3.20100804.mdwn new file mode 100644 index 000000000..be85cb949 --- /dev/null +++ b/doc/news/version_3.20100804.mdwn @@ -0,0 +1,14 @@ +ikiwiki 3.20100804 released with [[!toggle text="these changes"]] +[[!toggleable text=""" + * template: Fix dependency tracking. Broken in version 3.20100427. + * po: The po\_slave\_languages setting is now a list, so the order of + translated languages can be controlled. (intrigeri) + * git: Fix gitweb historyurl examples so "diff to current" links work. + (Thanks jrayhawk) + * meta: Allow syntax closer to html meta to be used. + * Add new disable hook, allowing plugins to perform cleanup after they + have been disabled. + * Use Digest::SHA built into perl rather than external Digest::SHA1 + to simplify dependencies. Closes: #[591040](http://bugs.debian.org/591040) + * Fixes a bug that prevented matching deleted pages when using the page() + PageSpec."""]]
\ No newline at end of file diff --git a/doc/news/version_3.20100815.mdwn b/doc/news/version_3.20100815.mdwn new file mode 100644 index 000000000..1073933a1 --- /dev/null +++ b/doc/news/version_3.20100815.mdwn @@ -0,0 +1,4 @@ +ikiwiki 3.20100815 released with [[!toggle text="these changes"]] +[[!toggleable text=""" + * Fix po test suite to not assume ikiwiki's underlay is already installed. + Closes: #[593047](http://bugs.debian.org/593047)"""]]
\ No newline at end of file diff --git a/doc/peteg.mdwn b/doc/peteg.mdwn new file mode 100644 index 000000000..4e2face0e --- /dev/null +++ b/doc/peteg.mdwn @@ -0,0 +1,7 @@ +I'm adding some plugins to Ikiwiki to support a bioacoustic wiki. See here: + +<http://bioacoustics.cse.unsw.edu.au/wiki/> + +Personal home page: + +<http://peteg.org/> diff --git a/doc/plugins.mdwn b/doc/plugins.mdwn index 527568208..0bea33592 100644 --- a/doc/plugins.mdwn +++ b/doc/plugins.mdwn @@ -1,7 +1,7 @@ Most of ikiwiki's [[features]] are implemented as plugins. Many of these plugins are included with ikiwiki. -[[!pagestats pages="plugins/type/* and !plugins/type/slow"]] +[[!pagestats pages="plugins/type/* and !plugins/type/slow" among="plugins/*"]] There's documentation if you want to [[write]] your own plugins, or you can [[install]] plugins [[contributed|contrib]] by others. @@ -13,7 +13,5 @@ will fit most uses of ikiwiki. ## Plugin directory -[[!inline pages="plugins/* and !plugins/type/* and !plugins/write and -!plugins/write/* and !plugins/contrib and !plugins/install and !*/Discussion" -feedpages="created_after(plugins/graphviz)" archive="yes" sort=title -rootpage="plugins/contrib" postformtext="Add a new plugin named:" show=0]] +[[!map pages="plugins/* and !plugins/type/* and !plugins/write and +!plugins/write/* and !plugins/contrib and !plugins/contrib/*/* and !plugins/install and !*/Discussion"]] diff --git a/doc/plugins/404.mdwn b/doc/plugins/404.mdwn index ad332ee04..bf033202a 100644 --- a/doc/plugins/404.mdwn +++ b/doc/plugins/404.mdwn @@ -1,5 +1,5 @@ [[!template id=plugin name=404 author="[[Simon_McVittie|smcv]]"]] -[[!tag type/useful]] +[[!tag type/web]] This plugin lets you use the IkiWiki CGI script as an Apache 404 handler, to give the behaviour of various other wiki engines where visiting a @@ -13,8 +13,4 @@ file: (The path here needs to be whatever the path is to the ikiwiki.cgi from the root of your web server.) -Or put something like this in the wiki's Lighttpd (>=1.4.17) configuration file: - - server.error-handler-404 = "/ikiwiki.cgi" - diff --git a/doc/plugins/404/discussion.mdwn b/doc/plugins/404/discussion.mdwn new file mode 100644 index 000000000..5a8e8ed85 --- /dev/null +++ b/doc/plugins/404/discussion.mdwn @@ -0,0 +1,3 @@ +With Apache, if you have a page foo/bar/baz but no foo/bar, and if you've +disabled `Indexes` option, you'll end up with a `403` response for foo/bar. +The 404 plugin doesn't try to handle that. But it should. -- [[Jogo]] diff --git a/doc/plugins/aggregate.mdwn b/doc/plugins/aggregate.mdwn index e2efcd83f..2925b6fba 100644 --- a/doc/plugins/aggregate.mdwn +++ b/doc/plugins/aggregate.mdwn @@ -1,13 +1,17 @@ [[!template id=plugin name=aggregate author="[[Joey]]"]] -[[!tag type/useful]] +[[!tag type/special-purpose]] This plugin allows content from other feeds to be aggregated into the wiki. To specify feeds to aggregate, use the [[ikiwiki/directive/aggregate]] [[ikiwiki/directive]]. -The [[meta]] and [[tag]] plugins are also recommended. Either the -[[htmltidy]] or [[htmlbalance]] plugin is suggested, since feeds can easily -contain html problems, some of which these plugins can fix. +## requirements + +The [[meta]] and [[tag]] plugins are also recommended to be used with this +one. Either the [[htmltidy]] or [[htmlbalance]] plugin is suggested, since +feeds can easily contain html problems, some of which these plugins can fix. 
+ +## triggering aggregation You will need to run ikiwiki periodically from a cron job, passing it the --aggregate parameter, to make it check for new posts. Here's an example @@ -15,6 +19,11 @@ crontab entry: */15 * * * * ikiwiki --setup my.wiki --aggregate --refresh +The plugin updates a file `.ikiwiki/aggregatetime` with the unix time stamp +when the next aggregation run could occur. (The file may be empty, if no +aggregation is required.) This can be integrated into more complex cron +jobs or systems to trigger aggregation only when needed. + Alternatively, you can allow `ikiwiki.cgi` to trigger the aggregation. You should only need this if for some reason you cannot use cron, and instead want to use a service such as [WebCron](http://webcron.org). To enable diff --git a/doc/plugins/autoindex.mdwn b/doc/plugins/autoindex.mdwn index 03e2d12f3..7c4e40419 100644 --- a/doc/plugins/autoindex.mdwn +++ b/doc/plugins/autoindex.mdwn @@ -1,7 +1,7 @@ [[!template id=plugin name=autoindex core=0 author="[[Joey]]"]] -[[!tag type/useful]] +[[!tag type/special-purpose]] This plugin searches for [[SubPages|ikiwiki/subpage]] with a missing parent page, and generates the parent pages. The generated page content is -controlled by the `autoindex.tmpl` [[template|wikitemplates]], which by +controlled by the `autoindex.tmpl` [[template|templates]], which by default, uses a [[map]] to list the SubPages. diff --git a/doc/plugins/autoindex/discussion.mdwn b/doc/plugins/autoindex/discussion.mdwn index 82e30aab1..2d6b6f1f0 100644 --- a/doc/plugins/autoindex/discussion.mdwn +++ b/doc/plugins/autoindex/discussion.mdwn @@ -5,3 +5,59 @@ The reason being that I have a lot of directories which need to be autoindexed, but I would prefer if the index files didn't clutter up my git repository. even without that feature the plugin is a great help, thanks + + +------ + +If you just don't want to clutter your git repo, below it's a patch does the following: + +* If you set autoindex_commit to 0 in your ikiwiki.setup file, we *do* place auto-generated markdown files in the **wiki source** but *not* in the **repo** + +* If you set autoindex_commit to 1 (this is the default), auto-generated index files will be put in the repo provided you enabled rcs backend. + +<pre> +--- autoindex.pm.orig 2009-10-01 17:13:51.000000000 +0800 ++++ autoindex.pm 2009-10-01 17:21:09.000000000 +0800 +@@ -17,6 +17,13 @@ + safe => 1, + rebuild => 0, + }, ++ autoindex_commit => { ++ type => 'boolean', ++ default => 1, ++ description => 'commit generated autoindex pages into RCS', ++ safe => 0, ++ rebuild => 0, ++ }, + } + + sub genindex ($) { +@@ -25,7 +32,7 @@ + my $template=template("autoindex.tmpl"); + $template->param(page => $page); + writefile($file, $config{srcdir}, $template->output); +- if ($config{rcs}) { ++ if ($config{rcs} and $config{autoindex_commit}) { + IkiWiki::rcs_add($file); + } + } +@@ -94,13 +101,13 @@ + } + + if (@needed) { +- if ($config{rcs}) { ++ if ($config{rcs} and $config{autoindex_commit}) { + IkiWiki::disable_commit_hook(); + } + foreach my $page (@needed) { + genindex($page); + } +- if ($config{rcs}) { ++ if ($config{rcs} and $config{autoindex_commit}) { + IkiWiki::rcs_commit_staged( + gettext("automatic index generation"), + undef, undef); +</pre> + + +Warning: I guess this patch may work, but I *haven't tested it yet*. 
-- [[weakish]] diff --git a/doc/plugins/calendar.mdwn b/doc/plugins/calendar.mdwn index d695762b7..76e718a3b 100644 --- a/doc/plugins/calendar.mdwn +++ b/doc/plugins/calendar.mdwn @@ -1,15 +1,11 @@ [[!template id=plugin name=calendar author="[[ManojSrivastava]]"]] -[[!tag type/chrome]] +[[!tag type/widget]] This plugin provides a [[ikiwiki/directive/calendar]] [[ikiwiki/directive]]. The directive displays a calendar, similar to the typical calendars shown on -some blogs. +some blogs. -Since ikiwiki is a wiki compiler, to keep the calendar up-to-date, -wikis that include it need to be periodically refreshes, typically by cron -at midnight. Example crontab: - - 0 0 * * * ikiwiki -setup ~/ikiwiki.setup -refresh +The [[ikiwiki-calendar]] command is used to keep the calendar up-to-date. ## CSS @@ -18,6 +14,7 @@ customization. * `month-calendar` - The month calendar as a whole. * `month-calendar-head` - The head of the month calendar (ie,"March"). +* `month-calendar-arrow` - Arrow pointing to previous/next month. * `month-calendar-day-head` - A column head in the month calendar (ie, a day-of-week abbreviation). * `month-calendar-day-noday`, `month-calendar-day-link`, @@ -31,6 +28,7 @@ customization. weekends. * `year-calendar` - The year calendar as a whole. * `year-calendar-head` - The head of the year calendar (ie, "2007"). +* `year-calendar-arrow` - Arrow pointing to previous/next year. * `year-calendar-subhead` - For example, "Months". * `year-calendar-month-link`, `year-calendar-month-nolink`, `year-calendar-month-future`, `year-calendar-this-month` - The month diff --git a/doc/plugins/calendar/discussion.mdwn b/doc/plugins/calendar/discussion.mdwn index 9d57b7a1e..bb76a9d8b 100644 --- a/doc/plugins/calendar/discussion.mdwn +++ b/doc/plugins/calendar/discussion.mdwn @@ -1,6 +1,15 @@ It would be nice if the "month" type calendar could collect all of the matching pages on a given date in some inline type way. --[[DavidBremner]] +> I agree, but I have not come up with good html to display them. Seems +> it might need some sort of popup. + Is it possible to get the calendar to link to pages based not on their timestamp (as I understand that it does now, or have I misunderstood this?) and instead on for example their location in a directory hierarchy. That way the calendar could be used as a planning / timeline device which I think would be great. --[[Alexander]] -I would like the ability to specify relative previous months. This way I could have a sidebar with the last three months by specifying no month, then 'month="-1"' and 'month="-2"'. Negative numbers for the month would otherwise be invalid, so this shouldn't produce any conflicts with expected behavior. (Right?) -- [[StevenBlack]] +I would like the ability to specify relative previous months. This way I +could have a sidebar with the last three months by specifying no month, +then 'month="-1"' and 'month="-2"'. Negative numbers for the month would +otherwise be invalid, so this shouldn't produce any conflicts with expected +behavior. (Right?) -- [[StevenBlack]] + +> Great idea! Just implemented that and also relative years. --[[Joey]] diff --git a/doc/plugins/color.mdwn b/doc/plugins/color.mdwn index dbb8b870c..d639bf563 100644 --- a/doc/plugins/color.mdwn +++ b/doc/plugins/color.mdwn @@ -1,5 +1,5 @@ [[!template id=plugin name=color core=0 author="[[ptecza]]"]] -[[!tag type/chrome]] +[[!tag type/widget]] This plugin provides a [[ikiwiki/directive/color]] [[ikiwiki/directive]]. 
The directive can be used to color a piece of text on a page. diff --git a/doc/plugins/comments.mdwn b/doc/plugins/comments.mdwn index 7e2232411..48b6c6ae7 100644 --- a/doc/plugins/comments.mdwn +++ b/doc/plugins/comments.mdwn @@ -1,5 +1,5 @@ [[!template id=plugin name=comments author="[[Simon_McVittie|smcv]]"]] -[[!tag type/useful]] +[[!tag type/web]] This plugin adds "blog-style" comments. Unlike the wiki-style freeform Discussion pages, these comments are posted by a simple form, cannot later @@ -14,8 +14,8 @@ authorship should hopefully be unforgeable by CGI users. The intention is that on a non-wiki site (like a blog) you can lock all pages for admin-only access, then allow otherwise unprivileged (or perhaps even anonymous) users to comment on posts. See the documentation of the -[[lockedit]] and [[anonok]] pages for details on locking down a wiki so -users can only post comments. +[[opendiscussion]], [[lockedit]] and [[anonok]] pages for details on locking +down a wiki so readers can only post comments. Individual comments are stored as internal-use pages named something like `page/comment_1`, `page/comment_2`, etc. These pages internally use a @@ -45,8 +45,11 @@ There are some global options for the setup file: ## comment moderation If you enable the [[blogspam]] plugin, comments that appear spammy will be -held for moderation. Wiki admins can access the comment moderation queue +held for moderation. (Or with the [[moderatedcomments]] plugin, all +comments will be held.) Wiki admins can access the comment moderation queue via a button on their Preferences page. -The comments are stored in `.ikiwiki/comments_pending/`, and can be -deleted, or moved into the wiki's srcdir to be posted. +Comments pending moderation are not checked into revision control. +To find unmoderated comments, `find /your/ikiwiki/srcdir -name '*._comment_pending'` +To manually moderate a comment, just rename the file, removing the +"_pending" from the end, and check it into revision control. diff --git a/doc/plugins/comments/discussion.mdwn b/doc/plugins/comments/discussion.mdwn index 396d1f6d4..7cc8a4b2a 100644 --- a/doc/plugins/comments/discussion.mdwn +++ b/doc/plugins/comments/discussion.mdwn @@ -1,3 +1,9 @@ +## Moderating comments from the CLI + +How do you do this, without using the UI in the Preferences? + +Please put this info on the page. Many thanks --[[Kai Hendry]] + ## Why internal pages? (unresolved) Comments are saved as internal pages, so they can never be edited through the CGI, diff --git a/doc/plugins/conditional.mdwn b/doc/plugins/conditional.mdwn index 95ffb2764..27a99bb7c 100644 --- a/doc/plugins/conditional.mdwn +++ b/doc/plugins/conditional.mdwn @@ -1,5 +1,5 @@ [[!template id=plugin name=conditional core=1 author="[[Joey]]"]] -[[!tag type/format]] +[[!tag type/special-purpose]] This plugin provides the [[ikiwiki/directive/if]] [[ikiwiki/directive]]. With this directive, you can make text be conditionally displayed on a page. diff --git a/doc/plugins/conditional/discussion.mdwn b/doc/plugins/conditional/discussion.mdwn index 629d05940..6e84fdfc1 100644 --- a/doc/plugins/conditional/discussion.mdwn +++ b/doc/plugins/conditional/discussion.mdwn @@ -1,3 +1,28 @@ +## Conditional broken? + +Using \[\[!if test="tagged(plugin)" then="= Tagged as plugin =" else="*No plugins found*"]] on this wiki *should* present the 'Tagged as plugin' heading, instead it emits 'no plugins found'. Is the conditional plugin currently broken for tags or am I misusing it? Thanks. 
+ +-- Thiana + +> This wiki has no page named "plugin", so nothing links to it; tags are a species of link +> so tagging a large number of pages with a tag that doesn't exist (which change has +> been reverted) doesn't make the pagespec match. It would if the tag's page existed. --[[Joey]] + +>> So if I understand this correctly... Assuming the tags Tag_A and Tag_B, the existence of +>> @wiki-home@/tags/Tag_A.creole, and a number of files with a \[\[!tag Tag_A Tag_B]] the +>> following is correct? +>> +>> * \[\[!if test="tagged(Tag_A)" then="OK" else="Fail"]] => OK +>> * \[\[!if test="tagged(Tag_B)" then="OK" else="Fail"]] => Fail +>> * \[\[!if test="tagged(Tag_A) and tagged(Tag_B)" then="OK" else="Fail"]] => Fail +>> +>> Is that the expected behaviour? If so, that's not what I'm seeing here since they all result +>> in a Fail. If not, what exactly is wrong with those conditionals? Thanks. +>> +>> -- Thiana + +---- + Would there be a way for this plugin to emit fewer blank lines (i.e. *none at all*)? For example, having a look at [this page](http://www.bddebian.com/~wiki/Hurd/)'s sidebar. diff --git a/doc/plugins/contrib.mdwn b/doc/plugins/contrib.mdwn index a03e6a95d..d8199a756 100644 --- a/doc/plugins/contrib.mdwn +++ b/doc/plugins/contrib.mdwn @@ -1,6 +1,4 @@ These plugins are provided by third parties and are not currently included in ikiwiki. See [[install]] for installation help. -[[!inline pages="plugins/contrib/* and !*/Discussion" -feedpages="created_after(plugins/contrib/navbar)" archive="yes" -rootpage="plugins/contrib" postformtext="Add a new plugin named:" show=0]] +[[!map pages="plugins/contrib/* and !plugins/contrib/*/* and !*/Discussion"]] diff --git a/doc/plugins/contrib/album.mdwn b/doc/plugins/contrib/album.mdwn index 395c99bce..3cfcb68d4 100644 --- a/doc/plugins/contrib/album.mdwn +++ b/doc/plugins/contrib/album.mdwn @@ -9,9 +9,11 @@ thoughts about this plugin). This plugin formats a collection of images into a photo album, in the same way as many websites: good examples include the PHP application [Gallery](http://gallery.menalto.com/), Flickr, -and Facebook's Photos "application". I've called it `album` -to distinguish it from [[contrib/gallery|plugins/contrib/gallery]], -although `gallery` might well be a better name for this functionality. +and Facebook's Photos "application". + +I've called it `album` to distinguish it from +[[contrib/gallery|plugins/contrib/gallery]], although `gallery` might well be +a better name for this functionality. The web UI I'm trying to achieve consists of one [HTML page of thumbnails](http://www.pseudorandom.co.uk/2008/2008-03-08-panic-cell-gig/) @@ -26,83 +28,129 @@ individual photos can't be bookmarked in a meaningful way, and the best it can do as a fallback for non-Javascript browsers is to provide a direct link to the image.) -## Writing the viewers +<h2 id="album"><code>album</code> directive</h2> + +Each page containing an `album` directive is treated as a photo album. + +Every image attached to an album or its subpages is considered to be part of +the album. A "viewer" page, with the wiki's default page extension, will be +generated to display the image, if there isn't already a page of the same +name as the image: for instance, if `debconf` is an album and +`debconf/tuesday/p100.jpg` exists, then `debconf/tuesday/p100.mdwn` might +be created. + +There's currently a hard-coded list of extensions that are treated as images: +`png`, `gif`, `jpg`, `jpeg` or `mov` files. More image and video types could +be added in future. 
Videos aren't currently handled very well; +ideally, something like totem-video-thumbnailer would be used. + +The `album` directive also produces an [[ikiwiki/directive/inline]] which +automatically includes all the viewers for this album, except those that +will appear in an <a href="#albumsection">albumsection</a> (if every image +is in a section, then the `album` directive won't have any visible effect). - \[[!albumimage image=foo.jpg album=myalbum - title=... - caption=... - copyright=... - size=... - viewertemplate=... - ]] +The `inline` is in `archive` and `quick` mode, but can include some +extra information about the images, including file size and a thumbnail made +using [[ikiwiki/directive/img]]). The default template is `albumitem.tmpl`, +which takes advantage of these things. -Each viewer contains one `\[[!albumimage]]` directive. This -sets the `image` filename, the `album` in which this image appears, -and an optional `caption`, and can override the `size` at which to -display the image and the `viewertemplate` to use to display the -image. +<h2 id="albumsection"><code>albumsection</code> directive</h2> -It can also have `title`, `copyright` and `date` parameters, which -are short-cuts for [[ikiwiki/directive/meta]] directives. +The `albumsection` directive is used to split an album into sections. It can +only appear on a page that also has the <a href="#album">album</a> directive. + +The `filter` parameter is a [[ikiwiki/PageSpec]] against which viewer pages +are matched. The `albumsection` directive displays all the images that match +the filter, and the `album` directive displays any leftover images, like +this: + + # Holiday photos + + \[[!album]] + <!-- replaced with a list of any uncategorized photos, which might be + empty --> -The viewer can also have any other content, but typically the -directive will be the only thing there. + ## People -Eventually, there will be a specialized CGI user interface to -edit all the photos of an album at once, upload a new photo -(which will attach the photo but also write out a viewer page -for it), or mark an already-uploaded photo as a member of an -album (which is done by writing out a viewer page for it). + \[[!albumsection filter="tagged(people)"]] + <!-- replaced with a list of photos tagged 'people', including + any that are also tagged 'landscapes' --> -The `\[[!albumimage]]` directive is replaced by an + ## Landscapes + + \[[!albumsection filter="tagged(landscapes)"]] + <!-- replaced with a list of photos tagged 'landscapes', including + any that are also tagged 'people' --> + +<h2 id="albumimage"><code>albumimage</code> directive</h2> + +Each viewer page produced by the <a href="#album">album</a> directive +contains an `albumimage` directive, which is replaced by an [[ikiwiki/directive/img]], wrapped in some formatting using a -template (by default `albumviewer.tmpl`). The template can (and -should) also include "next photo", "previous photo" and -"up to gallery" links. +template (by default it's `albumviewer.tmpl`). That template can also include +links to the next photo, the previous photo and the album it's in; the default +template has all of these. -The next/previous links are themselves implemented by -[[inlining|ikiwiki/directive/inline]] the next or previous -photo, using a special template (by default `albumnext.tmpl` -or `albumprev.tmpl`), in `archive`/`quick` mode. +The next/previous links are themselves implemented by evaluating a template, +either `albumnext.tmpl` or `albumprev.tmpl` by default. 
-> With hindsight, using an inline here is wrong - I should just -> run hooks and fill in the template within the album plugin. -> inline has some specialized functionality that's overkill -> here, and its delayed HTML substitution breaks the ability -> to have previous/up/next links both above and below the -> photo, for instance. --[[smcv]] +The directive can also have parameters: -## Writing the album +* `title`, `copyright` and `date` are short-cuts for the corresponding + [[ikiwiki/directive/meta]] directives -The album contains one `\[[!album]]` directive. It may also -contain any number of `\[[!albumsection]]` directives, for -example the demo album linked above could look like: +* `caption` sets a caption which is displayed in the album and viewer + pages - \[[!album]] - <!-- replaced with one uncategorized photo --> +The viewer page can also have other contents before or after the actual +image viewer. + +## Bugs + +* The plugin doesn't do anything special to handle albums that are subpages + of each other. If, say, `debconf` and `debconf/monday` are both albums, + then `debconf/monday/p100.jpg` will currently be assigned to one or the + other, arbitrarily. + +* The plugin doesn't do anything special to handle photos with similar names. + If you have `p100.jpg` and `p100.png`, one will get a viewer page called + `p100` and the other will be ignored. + +* If there's no `albumimage` in a viewer page, one should probably be appended + automatically. - ## Gamarra +## TODO - \[[!albumsection filter="link(gamarra)"]] - <!-- all the Gamarra photos --> +* The documentation should mention how to replicate the appearance of + `album` and `albumsection` using an `inline` of viewer pages. - ## Smokescreen +* The documentation should mention all the template variables and + all the parameters. - \[[!albumsection filter="link(smokescreen)"]] - <!-- all the Smokescreen photos --> +* The generated viewer page should include most or all of the possible + parameters to the `albumimage` directive, with empty values, as a + template for editing. - ... +* The generated viewer page should extract as much metadata as possible from + the photo's EXIF tags (creation/modification dates, author, title, caption, + copyright). [[smcv]] has a half-written implementation which runs + `scanimage` hooks, and has an `exiftool` plugin using [[!cpan Image::ExifTool]] + as a reference implementation of that hook. -The `\[[!album]]` directive is replaced by an -[[ikiwiki/directive/inline]] which automatically includes every -page that has an `\[[!albumimage]]` directive linking it to this -album, except those that will appear in an `\[[!albumsection]]`. +* There should be an option to reduce the size of photos and write them into + an underlay, for this workflow: -The `inline` is in `archive`/`quick` mode, but includes some -extra information about the images, including file size and a -thumbnail (again, made using [[ikiwiki/directive/img]]). The -default template is `albumitem.tmpl`, which takes advantage -of these things. 
+ * your laptop's local ikiwiki has two underlays, `photos` and `webphotos` + * `photos` contains full resolution photos with EXIF tags + * for each photo that exists in `photos` but not in `webphotos`, the album + plugin automatically resamples it down to a web-compatible resolution + ([[smcv]] uses up to 640x640), optimizes it with `jpegoptim`, strips out + all EXIF tags, and and writes it into the corresponding location + in `webphotos` + * `webphotos` is what you rsync to the web server + * the web server's ikiwiki only has `webphotos` as an underlay -Each `\[[!albumsection]]` is replaced by a similar inline, which -selects a subset of the photos in the album. +* Eventually, there could be a specialized CGI user interface to batch-edit + all the photos of an album (so for each photo, you get an edit box each for + title, author, copyright etc.) - this would work by making programmatic + edits to all the `albumimage` directives. diff --git a/doc/plugins/contrib/album/discussion.mdwn b/doc/plugins/contrib/album/discussion.mdwn new file mode 100644 index 000000000..0356860d8 --- /dev/null +++ b/doc/plugins/contrib/album/discussion.mdwn @@ -0,0 +1,404 @@ +thanks for this plugin. it might help me in my application, which is to provide album/galleries which can be edited (ie. new images added, taken away, etc.) through web interface. + +> That's my goal eventually, too. Perhaps you can help to +> design/write this plugin? At the moment I'm mostly +> waiting for a design "sanity check" from [[Joey]], +> but any feedback you can provide on the design would +> also be helpful. --[[smcv]] + +i have two challenges: firstly, for installation, i'm not sure what all the files are that need to be downloaded (because of my setup i can't easily pull the repo). so far i have Ikiwiki/Plugins/album.pm; ikiwiki-album; and 4 files in templates/ any others? + +> Those are all the added files; ikiwiki-album isn't strictly +> needed (IkiWiki itself doesn't use that code, but you can +> use it to turn a directory full of images into correct +> input for the album plugin). +> +> You probably also want the album plugin's expanded version of +> style.css (or put its extra rules in your local.css). +> Without that, your albums will be quite ugly. +> +> There aren't currently any other files modified by my branch. +> --[[smcv]] + +secondly: barring the CGI interface for editing the album, which would be great, is there at least a way to use attachment plugin or any other to manually add images and then create viewers for them? + +> Images are just attachments, and viewers are pages (any supported +> format, but .html will be fastest to render). Attach each image, +> then write a page for each image containing the +> \[[!albumimage]] directive (usually it will *only* contain that +> directive). +> +> The script ikiwiki-album can help you to do this in a git/svn/etc. +> tree; doing it over the web will be a lot of work (until I get +> the CGI interface written), but it should already be possible! +> +> The structure is something like this: +> +> * album.mdwn (contains the \[[!album]] directive, and perhaps also +> some \[[!albumsection]] directives) +> * album/a.jpg +> * album/a.html (contains the \[[!albumimage]] directive for a.jpg) +> * album/b.jpg +> * album/b.html (contains the \[[!albumimage]] directive for b.jpg) +> +> Have a look at ikiwiki-album to see how the directives are meant to +> work in practice. 
+> +> --[[smcv]] + +>> In the current version of the branch, the viewer pages are +>> generated automatically if you didn't generate them yourself, +>> so `ikiwiki-album` is no longer needed. --[[smcv]] + +i'm new to ikiwiki, apologies if this is dealt with elsewhere. -brush + +> This plugin is pretty ambitious, and is unfinished, so I'd recommend +> playing with a normal IkiWiki installation for a bit, then trying +> out this plugin when you've mastered the basics of IkiWiki. --[[smcv]] + +---- + +You had wanted my feedback on the design of this. I have not looked at the +code or tried it yet, but here goes. --[[Joey]] + +* Needing to create the albumimage "viewer" pages for each photo + seems like it will become a pain. Everyone will need to come up + with their own automation for it, and then there's the question + of how to automate it when uploading attachments. -J + +> There's already a script (ikiwiki-album) to populate a git +> checkout with skeleton "viewer" pages; I was planning to make a +> specialized CGI interface for albums after getting feedback from +> you (since the requirements for that CGI interface change depending +> on the implementation). I agree that this is ugly, though. -s + +>> Would you accept a version where the albumimage "viewer" pages +>> could be 0 bytes long, at least until metadata gets added? +>> +>> The more I think about the "binaries as first-class pages" approach, +>> the more subtle interactions I notice with other plugins. I +>> think I'm up to needing changes to editpage, comments, attachment +>> and recentchanges, plus adjustments to img and Render (to reduce +>> duplication when thumbnailing an image with a strange extension +>> while simultaneously changing the extension, and to hardlink/copy +>> an image with a strange extension to a differing target filename +>> with the normal extension, respectively). -s + +>>> Now that we have `add_autofile` I can just create viewer pages +>>> whenever there's an image to view. The current version of the +>>> branch does that. -s + +* With each viewer page having next/prev links, I can see how you + were having the scalability issues with ikiwiki's data structures + earlier! -J + +> Yeah, I think they're a basic requirement from a UI point of view +> though (although they don't necessarily have to be full wikilinks). +> -s + +>> I think that with the new dependency types system, the dependencies for +>> these can be presence dependencies, which will probably help with +>> avoiding rebuilds of a page if the next/prev page is changed. +>> (Unless you use img to make the thumbnails for those links, then it +>> would rebuild the thumbnails anyway. Have not looked at the code.) --[[Joey]] + +>>> I do use img. -s + +* And doesn't each viewer page really depend on every other page in the + same albumsection? If a new page is added, the next/prev links + may need to be updated, for example. If so, there will be much + unnecessary rebuilding. -J + +> albumsections are just a way to insert headings into the flow of +> photos, so they don't actually affect dependencies. +> +> One non-obvious constraint of ikiwiki's current design is that +> everything "off-page" necessary to build any page has to happen +> at scan time, which has caused a few strange design decisions, +> like the fact that each viewer controls what album it's in. 
+> +> It's difficult for the contents of the album to just be a +> pagespec, like for inline, because pagespecs can depend on +> metadata, which is gathered in arbitrary order at scan time; +> so the earliest you can safely apply a pagespec to the wiki +> contents to get a concrete list of pages is at rebuild time. +> +> (This stalled my attempt at a trail plugin, too.) -s + +>> Not sure I understand why these need to look at pagespecs at scan time? +>> Also, note that it is fairly doable to detect if a pagespec uses such +>> metadata. Er, I mean, I have a cheezy hack in `add_depends` now that does +>> it to deal with a similar case. --[[Joey]] + +>>> I think I was misunderstanding how early you have to call `add_depends`? +>>> The critical thing I missed was that if you're scanning a page, you're +>>> going to rebuild it in a moment anyway, so it doesn't matter if you +>>> have no idea what it depends on until the rebuild phase. -s + +* One thing I do like about having individual pages per image is + that they can each have their own comments, etc. -J + +> Yes; also, they can be wikilinked. I consider those to be +> UI requirements. -s + +* Seems possibly backwards that the albumimage controls what album + an image appears in. Two use cases -- 1: I may want to make a locked + album, but then anyone who can write to any other page on the wiki can + add an image to it. 2: I may want an image to appear in more than one + album. Think tags. So it seems it would be better to have the album + directive control what pages it includes (a la inline). -J + +> I'm inclined to fix this by constraining images to be subpages of exactly +> one album: if they're subpages of 2+ nested albums then they're only +> considered to be in the deepest-nested one (i.e. longest URL), and if +> they're not in any album then that's a usage error. This would +> also make prev/next links sane. -s + +>> The current version constrains images to be in at most one album, +>> choosing one arbitrarily (dependent on scan order) if albums are +>> nested. -s + +> If you want to reference images from elsewhere in the wiki and display +> them as if in an album, then you can use an ordinary inline with +> the same template that the album would use, and I'll make sure the +> templates are set up so this works. -s + +>> Still needs documenting, I've put it on the TODO list on the main +>> page. -s + +> (Implementation detail: this means that an image X/Y/Z/W/V, where X and +> Y are albums, Z does not exist and W exists but is not an album, +> would have a content dependency on Y, a presence dependency on Z +> and a content dependency on W.) +> +> Perhaps I should just restrict to having the album images be direct +> subpages of the album, although that would mean breaking some URLs +> on the existing website I'm doing all this work for... -s + +>> The current version of the branch doesn't have this restriction; +>> perhaps it's a worthwhile simplification, or perhaps it's too +>> restrictive? I fairly often use directory hierarchies like +>> `a_festival/saturday/foo.jpg` within an album, which makes +>> it very easy to write `albumsection` filters. -s + +* Putting a few of the above thoughts together, my ideal album system + seems to be one where I can just drop the images into a directory and + have them appear in the album index, as well as each generate their own wiki + page. Plus some way I can, later, edit metadata for captions, + etc. (Real pity we can't just put arbitrary metadata into the images + themselves.) 
This is almost pointing toward making the images first-class + wiki page sources. Hey, it worked for po! :) But the metadata and editing + problems probably don't really allow that. -J + +> Putting a JPEG in the web form is not an option from my point of +> view :-) but perhaps there could just be a "web-editable" flag supplied +> by plugins, and things could be changed to respect it. + +>> Replying to myself: would you accept patches to support +>> `hook(type => 'htmlize', editable => 0, ...)` in editpage? This would +>> essentially mean "this is an opaque binary: you can delete it +>> or rename it, and it might have its own special editing UI, but you +>> can never get it in a web form". +>> +>> On the other hand, that essentially means we need to reimplement +>> editpage in order to edit the sidecar files that contain the metadata. +>> Having already done one partial reimplementation of editpage (for +>> comments) I'm in no hurry to do another. +>> +>> I suppose another possibility would be to register hook +>> functions to be called by editpage when it loads and saves the +>> file. In this case, the loading hook would be to discard +>> the binary and use filter() instead, and the saving conversion +>> would be to write the edited content into the metadata sidecar +>> (creating it if necessary). +>> +>> I'd also need to make editpage (and also comments!) not allow the +>> creation of a file of type albumjpg, albumgif etc., which is something +>> I previously missed; and I'd need to make attachment able to +>> upload-and-rename. +>> -s + +>>> I believe the current branch meets your requirements, by having +>>> first-class wiki pages spring into existence using `add_autofile` +>>> to be viewer pages for photos. -s + +> In a way, what you really want for metadata is to have it in the album +> page, so you can batch-edit the whole lot by editing one file (this +> does mean that editing the album necessarily causes each of its viewers +> to be rebuilt, but in practice that happens anyway). -s + +>> Replying to myself: in practice that *doesn't* happen anyway. Having +>> the metadata in the album page is somewhat harmful because it means +>> that changing the title of one image causes every viewer in the album +>> to be rebuilt, whereas if you have a metadata file per image, only +>> the album itself, plus the next and previous viewers, need +>> rebuilding. So, I think a file per image is the way to go. +>> +>> Ideally we'd have some way to "batch-edit" the metadata of all +>> images in an album at once, except that would make conflict +>> resolution much more complicated to deal with; maybe just +>> give up and scream about mid-air collisions in that case? +>> (That's apparently good enough for Bugzilla, but not really +>> for ikiwiki). -s + +>>> This is now in the main page's TODO list; if/when I implement this, +>>> I intend to make it a specialized CGI interface. -s + +>> Yes, [all metadata in one file] would make some sense.. It also allows putting one image in +>> two albums, with different caption etc. (Maybe for different audiences.) +>> --[[Joey]] + +>>> Eek. No, that's not what I had in mind at all; the metadata ends up +>>> in the "viewer" page, so it's necessarily the same for all albums. -s + +>> It would probably be possible to add a new dependency type, and thus +>> make ikiwiki smart about noticing whether the metadata has actually +>> changed, and only update those viewers where it has. But the dependency +>> type stuff is still very new, and not plugin friendly .. 
so only just +>> possible, --[[Joey]] + +---- + +'''I think the "special extension" design is a dead-end, but here's what +happened when I tried to work out how it would work. --[[smcv]]''' + +Suppose that each viewer is a JPEG-or-GIF-or-something, with extension +".albumimage". We have a gallery "memes" with three images, badger, +mushroom and snake. + +> An alternative might be to use ".album.jpg", and ".album.gif" +> etc as the htmlize extensions. May need some fixes to ikiwiki to support +> that. --[[Joey]] + +>> foo.albumjpg (etc.) for images, and foo._albummeta (with +>> `keepextension => 1`) for sidecar metadata files, seems viable. -s + +Files in git repo: + +* index.mdwn +* memes.mdwn +* memes/badger.albumjpg (a renamed JPEG) +* memes/badger/comment_1._comment +* memes/badger/comment_2._comment +* memes/mushroom.albumgif (a renamed GIF) +* memes/mushroom._albummeta (sidecar file with metadata) +* memes/snake.albummov (a renamed video) + +Files in web content: + +* index.html +* memes/index.html +* memes/96x96-badger.jpg (from img) +* memes/96x96-mushroom.gif (from img) +* memes/96x96-snake.jpg (from img, hacked up to use totem-video-thumbnailer :-) ) +* memes/badger/index.html (including comments) +* memes/badger.jpg +* memes/mushroom/index.html +* memes/mushroom.gif +* memes/snake/index.html +* memes/snake.mov + +ispage("memes/badger") (etc.) must be true, to make the above rendering +happen, so albumimage needs to be a "page" extension. + +To not confuse other plugins, album should probably have a filter() hook +that turns .albumimage files into HTML? That'd probably be a reasonable +way to get them rendered anyway. + +> I guess that is needed to avoid preprocess, scan, etc trying to process +> the image, as well as eg, smiley trying to munge it in sanitize. +> --[[Joey]] + +>> As long as nothing has a filter() hook that assumes it's already +>> text... filters are run in arbitrary order. We seem to be OK so far +>> though. +>> +>> If this is the route I take, I propose to have the result of filter() +>> be the contents of the sidecar metadata file (empty string if none), +>> with the `\[[!albumimage]]` directive (which no longer requires +>> arguments) prepended if not already present. This would mean that +>> meta directives in the metadata file would work as normal, and it +>> would be possible to insert text both before and after the viewer +>> if desired. The result of filter() would also be a sensible starting +>> point for editing, and the result of editing could be diverted into +>> the metadata file. -s + +do=edit&page=memes/badger needs to not put the JPG in a text box: somehow +divert or override the normal edit CGI by telling it that .albumimage +files are not editable in the usual way? + +> Something I missed here is that editpage also needs to be told that +> creating new files of type albumjpg, albumgif etc. is not allowed +> either! -s + +Every image needs to depend on, and link to, the next and previous images, +which is a bit tricky. In previous thinking about this I'd been applying +the overly strict constraint that the ordered sequence of pages in each +album must be known at scan time. However, that's not *necessarily* needed: +the album and each photo could collect an unordered superset of dependencies +at scan time, and at rebuild time that could be refined to be the exact set, +in order. + +> Why do you need to collect this info at scan time? You can determine it +> at build time via `pagespec_match_list`, surely .. 
maybe with some +> memoization to avoid each image in an album building the same list. +> I sense that I may be missing a subtelty though. --[[Joey]] + +>> I think I was misunderstanding how early you have to call `add_depends` +>> as mentioned above. -s + +Perhaps restricting to "the images in an album A must match A/*" +would be useful; then the unordered superset could just be "A/*". Your +"albums via tags" idea would be nice too though, particularly for feature +parity with e.g. Facebook: "photos of Joey" -> "tags/joey and albumimage()" +maybe? + +If images are allowed to be considered to be part of more than one album, +then a pretty and usable UI becomes harder - "next/previous" expands into +"next photo in holidays/2009/germany / next photo in tagged/smcv / ..." +and it could get quite hard to navigate. Perhaps next/previous links could +be displayed only for the closest ancestor (in URL space) that is an +album, or something? + +> Ugh, yeah, that is a problem. Perhaps wanting to support that was just +> too ambitious. --[[Joey]] + +>> I propose to restrict to having images be subpages of albums, as +>> described above. -s + +Requiring renaming is awkward for non-technical Windows/Mac users, with both +platforms' defaults being to hide extensions; however, this could be +circumvented by adding some sort of hook in attachment to turn things into +a .albumimage at upload time, and declaring that using git/svn/... without +extensions visible is a "don't do that then" situation :-) + +> Or extend `pagetype` so it can do the necessary matching without +> renaming. Maybe by allowing a subdirectory to be specified along +> with an extension. (Or allow specifying a full pagespec, +> but I hesitate to seriously suggest that.) --[[Joey]] + +>> I think that might be a terrifying idea for another day. If we can +>> mutate the extension during the `attach` upload, that'd be enough; +>> I don't think people who are skilled enough to use git/svn/..., +>> but not skilled enough to tell Explorer to show file extensions, +>> represent a major use case. -s + +Ideally attachment could also be configured to upload into a specified +underlay, so that photos don't have to be in your source-code control +(you might want that, but I don't!). + +> Replying to myself: perhaps best done as an orthogonal extension +> to attach? -s + +> Yet another non-obvious thing this design would need to do is to find +> some way to have each change to memes/badger._albummeta show up as a +> change to memes/badger in `recentchanges`. -s + +Things that would be nice, and are probably possible: + +* make the "Edit page" link on viewers divert to album-specific CGI instead + of just failing or not appearing (probably possible via pagetemplate) + +* some way to deep-link to memes/badger.jpg with a wikilink, without knowing a + priori that it's secretly a JPEG (probably harder than it looks - you'd + have to make a directive for it and it's probably not worth it) diff --git a/doc/plugins/contrib/cvs.mdwn b/doc/plugins/contrib/cvs.mdwn deleted file mode 100644 index 1ff71d274..000000000 --- a/doc/plugins/contrib/cvs.mdwn +++ /dev/null @@ -1,31 +0,0 @@ -[[!template id=plugin name=cvs core=0 author="[[schmonz]]"]] - -This plugin allows ikiwiki to use [[!wikipedia desc="CVS" Concurrent Versions System]] as an [[rcs]]. - -* Diffs are against [[3.14159|news/version_3.14159]]. `cvs.pm` started life as a copy of `svn.pm`. -* `IkiWiki.pm:wiki_file_prune_regexps` avoids copying CVS metadata into `$DESTDIR`. 
-* [[ikiwiki-makerepo]]: - * creates a repository, - * imports `$SRCDIR` into top-level module `ikiwiki` (vendor tag IKIWIKI, release tag PRE_CVS), - * creates a small post-commit wrapper to prevent `cvs add <directory>` from being seen by ikiwiki's [[post-commit]] hook, - * configures the wrapper itself as a post-commit hook in `CVSROOT/loginfo`. -* [`cvsps`](http://www.cobite.com/cvsps/) is required (`rcs_recentchanges()` and `rcs_diff()` need it to work). -* CVS multi-directory commits happen separately; the post-commit hook sees only the first directory's changes in time for [[recentchanges|plugins/recentchanges]]. The next run of `ikiwiki --setup` will correctly re-render such a recentchanges entry. It might be possible to solve this problem with scripts like `commit_prep` and `log_accum` from CVS contrib. -* Due to the name of CVS's metadata directories, it's impossible to create `.../CVS/foo.mdwn`. On case-insensitive filesystems it's also impossible to create `.../cvs/foo.mdwn`. -* No testing or special-casing has been done with [[attachments|plugins/attachment]], but they'll probably need `cvs add -kb`. - -Having a `$HOME/.cvsrc` isn't necessary. Sure does make using CVS more livable, though. Here's a good general-purpose one: - - cvs -q - checkout -P - update -dP - diff -u - rdiff -u - -Not knowing how the tests get set up, I blindly attempted to add subversion-like tests to `t/file_pruned.t`. They fail. But the plugin definitely works. :-) - -### Code -* [`cvs.pm`](http://www.netbsd.org/~schmonz/ikiwiki-cvs/cvs.pm) -* [`cvs-IkiWiki.pm.diff`](http://www.netbsd.org/~schmonz/ikiwiki-cvs/cvs-IkiWiki.pm.diff) -* [`cvs-ikiwiki-makerepo.diff`](http://www.netbsd.org/~schmonz/ikiwiki-cvs/cvs-ikiwiki-makerepo.diff) -* [`cvs-t-file_pruned.t.diff`](http://www.netbsd.org/~schmonz/ikiwiki-cvs/cvs-t-file_pruned.t.diff) diff --git a/doc/plugins/contrib/field.mdwn b/doc/plugins/contrib/field.mdwn new file mode 100644 index 000000000..dce2d891c --- /dev/null +++ b/doc/plugins/contrib/field.mdwn @@ -0,0 +1,196 @@ +[[!template id=plugin name=field author="[[rubykat]]"]] +[[!tag type/meta]] +[[!toc]] +## NAME + +IkiWiki::Plugin::field - front-end for per-page record fields. + +## SYNOPSIS + + # activate the plugin + add_plugins => [qw{goodstuff field ....}], + + # simple registration + field_register => [qw{meta}], + + # simple registration with priority + field_register => { + meta => 'last' + foo => 'DD' + }, + + # allow the config to be queried as a field + field_allow_config => 1, + + # flag certain fields as "tags" + field_tags => { + BookAuthor => '/books/authors', + BookGenre => '/books/genres', + MovieGenre => '/movies/genres', + } + +## DESCRIPTION + +This plugin is meant to be used in conjunction with other plugins +in order to provide a uniform interface to access per-page structured +data, where each page is treated like a record, and the structured data +are fields in that record. This can include the meta-data for that page, +such as the page title. + +Plugins can register a function which will return the value of a "field" for +a given page. This can be used in a few ways: + +* In page templates; all registered fields will be passed to the page template in the "pagetemplate" processing. +* In PageSpecs; the "field" function can be used to match the value of a field in a page. +* In SortSpecs; the "field" function can be used for sorting pages by the value of a field in a page. 
+* By other plugins, using the field_get_value function, to get the value of a field for a page, and do with it what they will. + +## CONFIGURATION OPTIONS + +The following options can be set in the ikiwiki setup file. + +**field_allow_config** + + field_allow_config => 1, + +Allow the $config hash to be queried like any other field; the +keys of the config hash are the field names. + +**field_register** + + field_register => [qw{meta}], + + field_register => { + meta => 'last' + foo => 'DD' + }, + +A hash of plugin-IDs to register. The keys of the hash are the names of the +plugins, and the values of the hash give the order of lookup of the field +values. The order can be 'first', 'last', 'middle', or an explicit order +sequence between 'AA' and 'ZZ'. If the simpler type of registration is used, +then the order will be 'middle'. + +This assumes that the plugins in question store data in the %pagestatus hash +using the ID of that plugin, and thus the field values are looked for there. + +This is the simplest form of registration, but the advantage is that it +doesn't require the plugin to be modified in order for it to be +registered with the "field" plugin. + +**field_tags** + + field_tags => { + BookAuthor => '/books/authors', + BookGenre => '/books/genres', + MovieGenre => '/movies/genres', + } + +A hash of fields and their associated pages. This provides a faceted +tagging system. + +The way this works is that a given field-name will be associated with a given +page, and the values of that field will be linked to sub-pages of that page. + +For example: + + BookGenre: SF + +will link to "/books/genres/SF", with a link-type of "bookgenre". + +## PageSpec + +The `field` plugin provides a few PageSpec functions to match values +of fields for pages. + +* field + * **field(*name* *glob*)** + * field(bar Foo\*) will match if the "bar" field starts with "Foo". +* destfield + * **destfield(*name* *glob*)** + * as for "field" but matches against the destination page (i.e when the source page is being included in another page). +* field_item + * **field_item(*name* *glob*)** + * field_item(bar Foo) will match if one of the values of the "bar" field is "Foo". +* destfield_item + * **destfield_item(*name* *glob*)** + * as for "field_item" but matches against the destination page. +* field_tagged + * **field_tagged(*name* *glob*)** + * like `tagged`, but this uses the tag-bases and link-types defined in the `field_tags` configuration option. +* destfield_tagged + * **destfield_tagged(*name* *glob*)** + * as for "field_tagged" but matches against the destination page. + +## SortSpec + +The "field" SortSpec function can be used to sort a page depending on the value of a field for that page. This is used for directives that take sort parameters, such as **inline** or **report**. + +field(*name*) + +For example: + +sort="field(bar)" will sort by the value og the "bar" field. + +## FUNCTIONS + +### field_register + +field_register(id=>$id); + +Register a plugin as having field data. The above form is the simplest, where +the field value is looked up in the %pagestatus hash under the plugin-id. + +Additional Options: + +**call=>&myfunc** + +A reference to a function to call rather than just looking up the value in the +%pagestatus hash. It takes two arguments: the name of the field, and the name +of the page. 
It is expected to return (a) an array of the values of that field +if "wantarray" is true, or (b) a concatenation of the values of that field +if "wantarray" is not true, or (c) undef if there is no field by that name. + + sub myfunc ($$) { + my $field = shift; + my $page = shift; + + ... + + return (wantarray ? @values : $value); + } + +**first=>1** + +Set this to be called first in the sequence of calls looking for values. Since +the first found value is the one which is returned, ordering is significant. +This is equivalent to "order=>'first'". + +**last=>1** + +Set this to be called last in the sequence of calls looking for values. Since +the first found value is the one which is returned, ordering is significant. +This is equivalent to "order=>'last'". + +**order=>$order** + +Set the explicit ordering in the sequence of calls looking for values. Since +the first found value is the one which is returned, ordering is significant. + +The values allowed for this are "first", "last", "middle", or a two-character +ordering-sequence between 'AA' and 'ZZ'. + +### field_get_value($field, $page) + + my @values = field_get_value($field, $page); + + my $value = field_get_value($field, $page); + +Returns the values of the field for that page, or undef if none is found. +Note that it will return an array of values if you ask for an array, +and a scalar value if you ask for a scalar. + +## DOWNLOAD + +* browse at GitHub: <http://github.com/rubykat/ikiplugins/blob/master/IkiWiki/Plugin/field.pm> +* git repo at git://github.com/rubykat/ikiplugins.git diff --git a/doc/plugins/contrib/field/discussion.mdwn b/doc/plugins/contrib/field/discussion.mdwn new file mode 100644 index 000000000..191f8b27d --- /dev/null +++ b/doc/plugins/contrib/field/discussion.mdwn @@ -0,0 +1,332 @@ +Having tried out `field`, some comments (from [[smcv]]): + +The general concept looks great. + +The `pagetemplate` hook seems quite namespace-polluting: on a site containing +a list of books, I'd like to have an `author` field, but that would collide +with IkiWiki's use of `<TMPL_VAR AUTHOR>` for the author of the *page* +(i.e. me). Perhaps it'd be better if the pagetemplate hook was only active for +`<TMPL_VAR FIELD_AUTHOR>` or something? (For those who want the current +behaviour, an auxiliary plugin would be easy.) + +> No, please. The idea is to be *able* to override field names if one wishes to, and choose, for yourself, non-colliding field names if one wishes not to. I don't wish to lose the power of being able to, say, define a page title with YAML format if I want to, or to write a site-specific plugin which calculates a page title, or other nifty things. +>It's not like one is going to lose the fields defined by the meta plugin; if "author" is defined by \[[!meta author=...]] then that's what will be found by "field" (provided the "meta" plugin is registered; that's what the "field_register" option is for). +>--[[KathrynAndersen]] + +>> Hmm. I suppose if you put the title (or whatever) in the YAML, then +>> "almost" all the places in IkiWiki that respect titles will do the +>> right thing due to the pagetemplate hook, with the exception being +>> anything that has special side-effects inside `meta` (like `date`), +>> or anything that looks in `$pagestate{foo}{meta}` directly +>> (like `map`). Is your plan that `meta` should register itself by +>> default, and `map` and friends should be adapted to +>> work based on `getfield()` instead of `$pagestate{foo}{meta}`, then? + +>>> Based on `field_get_value()`, yes. 
That would be my ideal. Do you think I should implement that as an ikiwiki branch? --[[KathrynAndersen]] + +>>>> This doesn't solve cases where certain fields are treated specially; for +>>>> instance, putting a `\[[!meta permalink]]` on a page is not the same as +>>>> putting it in `ymlfront` (in the latter case you won't get your +>>>> `<link>` header), and putting `\[[!meta date]]` is not the same as putting +>>>> `date` in `ymlfront` (in the latter case, `%pagectime` won't be changed). +>>>> +>>>> One way to resolve that would be to have `ymlfront`, or similar, be a +>>>> front-end for `meta` rather than for `field`, and call +>>>> `IkiWiki::Plugin::meta::preprocess` (or a refactored-out function that's +>>>> similar). +>>>> +>>>> There are also some cross-site scripting issues (see below)... --[[smcv]] + +>> (On the site I mentioned, I'm using an unmodified version of `field`, +>> and currently working around the collision by tagging books' pages +>> with `bookauthor` instead of `author` in the YAML.) --s + +>> Revisiting this after more thought, the problem here is similar to the +>> possibility that a wiki user adds a `meta` shortcut +>> to [[shortcuts]], or conversely, that a plugin adds a `cpan` directive +>> that conflicts with the `cpan` shortcut that pages already use. (In the +>> case of shortcuts, this is resolved by having plugin-defined directives +>> always win.) For plugin-defined meta keywords this is the plugin +>> author's/wiki admin's problem - just don't enable conflicting plugins! - +>> but it gets scary when you start introducing things like `ymlfront`, which +>> allow arbitrary, wiki-user-defined fields, even ones that subvert +>> other plugins' assumptions. +>> +>> The `pagetemplate` hook is particularly alarming because page templates are +>> evaluated in many contexts, not all of which are subject to the +>> htmlscrubber or escaping; because the output from `field` isn't filtered, +>> prefixed or delimited, when combined with an arbitrary-key-setting plugin +>> like `ymlfront` it can interfere with other plugins' expectations +>> and potentially cause cross-site scripting exploits. For instance, `inline` +>> has a `pagetemplate` hook which defines the `FEEDLINKS` template variable +>> to be a blob of HTML to put in the `<head>` of the page. As a result, this +>> YAML would be bad: +>> +>> --- +>> FEEDLINKS: <script>alert('code injection detected')</script> +>> --- +>> +>> (It might require a different case combination due to implementation +>> details, I'm not sure.) +>> +>> It's difficult for `field` to do anything about this, because it doesn't +>> know whether a field is meant to be plain text, HTML, a URL, or something +>> else. +>> +>> If `field`'s `pagetemplate` hook did something more limiting - like +>> only emitting template variables starting with `field_`, or from some +>> finite set, or something - then this would cease to be a problem, I think? +>> +>> `ftemplate` and `getfield` don't have this problem, as far as I can see, +>> because their output is in contexts where the user could equally well have +>> written raw HTML directly; the user can cause themselves confusion, but +>> can't cause harmful output. --[[smcv]] + +From a coding style point of view, the `$CamelCase` variable names aren't +IkiWiki style, and the `match_foo` functions look as though they could benefit +from being thin wrappers around a common `&IkiWiki::Plugin::field::match` +function (see `meta` for a similar approach). 
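+
+For instance, here's a rough, untested sketch of the sort of refactoring I
+mean (how list-valued fields should be handled, whether matching ought to be
+case-insensitive, and whether the destination page really arrives as
+`$params{destpage}`, are all guesses on my part):
+
+    package IkiWiki::Plugin::field;
+
+    # Common matcher: succeeds if any value of $field on $page matches $glob.
+    sub match ($$$) {
+        my $field=shift;
+        my $page=shift;
+        my $glob=shift;
+
+        my @values=grep { defined $_ } field_get_value($field, $page);
+        if (! @values) {
+            return IkiWiki::FailReason->new("$page has no $field field");
+        }
+        my $re=IkiWiki::glob2re($glob);
+        foreach my $value (@values) {
+            if ($value =~ /^$re$/i) {
+                return IkiWiki::SuccessReason->new("$field of $page matches $glob");
+            }
+        }
+        return IkiWiki::FailReason->new("$field of $page does not match $glob");
+    }
+
+    package IkiWiki::PageSpec;
+
+    # Thin wrappers, in the same style as meta's match_title etc.
+    sub match_field ($$;@) {
+        my $page=shift;
+        my ($field, $glob)=split(' ', shift, 2);
+        return IkiWiki::Plugin::field::match($field, $page, $glob);
+    }
+
+    sub match_destfield ($$;@) {
+        my $page=shift;
+        my $wanted=shift;
+        my %params=@_;
+        # guess: the page being rendered into is passed as destpage
+        return match_field($params{destpage}, $wanted, %params);
+    }
+
+The `_item` and `_tagged` variants could be wrappers over the same helper.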
+ +I think the documentation would probably be clearer in a less manpage-like +and more ikiwiki-like style? + +> I don't think ikiwiki *has* a "style" for docs, does it? So I followed the Perl Module style. And I'm rather baffled as to why having the docs laid out in clear sections... make them less clear. --[[KathrynAndersen]] + +>> I keep getting distracted by the big shouty headings :-) +>> I suppose what I was really getting at was that when this plugin +>> is merged, its docs will end up split between its plugin +>> page, [[plugins/write]] and [[ikiwiki/PageSpec]]; on some of the +>> contrib plugins I've added I've tried to separate the docs +>> according to how they'll hopefully be laid out after merge. --s + +If one of my branches from [[todo/allow_plugins_to_add_sorting_methods]] is +accepted, a `field()` cmp type would mean that [[plugins/contrib/report]] can +stop reimplementing sorting. Here's the implementation I'm using, with +your "sortspec" concept (a sort-hook would be very similar): if merged, +I think it should just be part of `field` rather than a separate plugin. + + # Copyright © 2010 Simon McVittie, released under GNU GPL >= 2 + package IkiWiki::Plugin::fieldsort; + use warnings; + use strict; + use IkiWiki 3.00; + use IkiWiki::Plugin::field; + + sub import { + hook(type => "getsetup", id => "fieldsort", call => \&getsetup); + } + + sub getsetup () { + return + plugin => { + safe => 1, + rebuild => undef, + }, + } + + package IkiWiki::SortSpec; + + sub cmp_field { + if (!length $_[0]) { + error("sort=field requires a parameter"); + } + + my $left = IkiWiki::Plugin::field::field_get_value($_[0], $a); + my $right = IkiWiki::Plugin::field::field_get_value($_[0], $b); + + $left = "" unless defined $left; + $right = "" unless defined $right; + return $left cmp $right; + } + + 1; + +---- + +Disclaimer: I've only looked at this plugin and ymlfront, not other related +stuff yet. (I quite like ymlfront, so I looked at this as its dependency. :) +I also don't want to annoy you with a lot of design discussion +if your main goal was to write a plugin that did exactly what you wanted. + +My first question is: Why we need another plugin storing metadata +about the page, when we already have the meta plugin? Much of the +complication around the field plugin has to do with it accessing info +belonging to the meta plugin, and generalizing that to be able to access +info stored by other plugins too. (But I don't see any other plugins that +currently store such info). Then too, it raises points of confusion like +smcv's discuission of field author vs meta author above. --[[Joey]] + +> The point is exactly in the generalization, to provide a uniform interface for accessing structured data, no matter what the source of it, whether that be the meta plugin or some other plugin. + +> There were a few reasons for this: + +>1. In converting my site over from PmWiki, I needed something that was equivalent to PmWiki's Page-Text-Variables (which is how PmWiki implements structured data). +>2. I also wanted an equivalent of PmWiki's Page-Variables, which, rather than being simple variables, are the return-value of a function. This gives one a lot of power, because one can do calculations, derive one thing from another. Heck, just being able to have a "basename" variable is useful. +>3. I noticed that in the discussion about structured data, it was mired down in disagreements about what form the structured data should take; I wanted to overcome that hurdle by decoupling the form from the content. 
+>4. I actually use this to solve (1), because, while I do use ymlfront, initially my pages were in PmWiki format (I wrote (another) unreleased plugin which parses PmWiki format) including PmWiki's Page-Text-Variables for structured data. So I needed something that could deal with multiple formats. + +> So, yes, it does cater to mostly my personal needs, but I think it is more generally useful, also. +> --[[KathrynAndersen]] + +>> Is it fair to say, then, that `field`'s purpose is to take other +>> plugins' arbitrary per-page data, and present it as a single +>> merged/flattened string => string map per page? From the plugins +>> here, things you then use that merged map for include: +>> +>> * sorting - stolen by [[todo/allow_plugins_to_add_sorting_methods]] +>> * substitution into pages with Perl-like syntax - `getfield` +>> * substitution into wiki-defined templates - the `pagetemplate` +>> hook +>> * substitution into user-defined templates - `ftemplate` +>> +>> As I mentioned above, the flattening can cause collisions (and in the +>> `pagetemplate` case, even security problems). +>> +>> I wonder whether conflating Page Text Variables with Page Variables +>> causes `field` to be more general than it needs to be? +>> To define a Page Variable (function-like field), you need to write +>> a plugin containing that Perl function; if we assume that `field` +>> or something resembling it gets merged into ikiwiki, then it's +>> reasonable to expect third-party plugins to integrate with whatever +>> scaffolding there is for these (either in an enabled-by-default +>> plugin that most people are expected to leave enabled, like `meta` +>> now, or in the core), and it doesn't seem onerous to expect each +>> plugin that wants to participate in this mechanism to have code to +>> do so. While it's still contrib, `field` could just have a special case +>> for the meta plugin, rather than the converse? +>> +>> If Page Text Variables are limited to being simple strings as you +>> suggest over in [[forum/an_alternative_approach_to_structured_data]], +>> then they're functionally similar to `meta` fields, so one way to +>> get their functionality would be to extend `meta` so that +>> +>> \[[!meta badger="mushroom"]] +>> +>> (for an unrecognised keyword `badger`) would store +>> `$pagestate{$page}{meta}{badger} = "mushroom"`? Getting this to +>> appear in templates might be problematic, because a naive +>> `pagetemplate` hook would have the same problem that `field` combined +>> with `ymlfront` currently does. +>> +>> One disadvantage that would appear if the function-like and +>> meta-like fields weren't in the same namespace would be that it +>> wouldn't be possible to switch a field from being meta-like to being +>> function-like without changing any wiki content that referenced it. +>> +>> Perhaps meta-like fields should just *be* `meta` (with the above +>> enhancement), as a trivial case of function-like fields? That would +>> turn `ymlfront` into an alternative syntax for `meta`, I think? +>> That, in turn, would hopefully solve the special-fields problem, +>> by just delegating it to meta. I've been glad of the ability to define +>> new ad-hoc fields with this plugin without having to write an extra plugin +>> to do so (listing books with a `bookauthor` and sorting them by +>> `"field(bookauthor) title"`), but that'd be just as easy if `meta` +>> accepted ad-hoc fields? +>> +>> --[[smcv]] + +>>> Your point above about cross-site scripting is a valid one, and something I +>>> hadn't thought of (oops). 
+ +>>> I still want to be able to populate pagetemplate templates with field, because I +>>> use it for a number of things, such as setting which CSS files to use for a +>>> given page, and, as I said, for titles. But apart from the titles, I +>>> realize I've been setting them in places other than the page data itself. +>>> (Another unreleased plugin, `concon`, uses Config::Context to be able to +>>> set variables on a per-site, per-directory and a per-page basis). + +>>> The first possible solution is what you suggested above: for field to only +>>> set values in pagetemplate which are prefixed with *field_*. I don't think +>>> this is quite satisfactory, since that would still mean that people could +>>> put un-scrubbed values into a pagetemplate, albeit they would be values +>>> named field_foo, etc. --[[KathrynAndersen]] + +>>>> They can already do similar; `PERMALINK` is pre-sanitized to +>>>> ensure that it's a "safe" URL, but if an extremely confused wiki admin was +>>>> to put `COPYRIGHT` in their RSS/Atom feed's `<link>`, a malicious user +>>>> could put an unsafe (e.g. Javascript) URL in there (`COPYRIGHT` *is* +>>>> HTML-scrubbed, but "javascript:alert('pwned!')" is just text as far as a +>>>> HTML sanitizer is concerned, so it passes straight through). The solution +>>>> is to not use variables in situations where that variable would be +>>>> inappropriate. Because `field` is so generic, the definition of what's +>>>> appropriate is difficult. --[[smcv]] + +>>> An alternative solution would be to classify field registration as "secure" +>>> and "insecure". Sources such as ymlfront would be insecure, sources such +>>> as concon (or the $config hash) would be secure, since they can't be edited +>>> as pages. Then, when doing pagetemplate substitution (but not ftemplate +>>> substitution) the insecure sources could be HTML-escaped. +>>> --[[KathrynAndersen]] + +>>>> Whether you trust the supplier of data seems orthogonal to whether its value +>>>> is (meant to be) interpreted as plain text, HTML, a URL or what? +>>>> +>>>> Even in cases where you trust the supplier, you need to escape things +>>>> suitably for the context, not for security but for correctness. The +>>>> definition of the value, and the context it's being used in, changes the +>>>> processing you need to do. An incomplete list: +>>>> +>>>> * HTML used as HTML needs to be html-scrubbed if and only if untrusted +>>>> * URLs used as URLs need to be put through `safeurl()` if and only if +>>>> untrusted +>>>> * HTML used as plain text needs tags removed regardless +>>>> * URLs used as plain text are safe +>>>> * URLs or plain text used in HTML need HTML-escaping (and URLs also need +>>>> `safeurl()` if untrusted) +>>>> * HTML or plain text used in URLs need URL-escaping (and the resulting +>>>> URL might need sanitizing too?) +>>>> +>>>> I can't immediately think of other data types we'd be interested in beyond +>>>> text, HTML and URL, but I'm sure there are plenty. + +>>>>> But isn't this a problem with anything that uses pagetemplates? Or is +>>>>> the point that, with plugins other than `field`, they all know, +>>>>> beforehand, the names of all the fields that they are dealing with, and +>>>>> thus the writer of the plugin knows which treatment each particular field +>>>>> needs? For example, that `meta` knows that `title` needs to be +>>>>> HTML-escaped, and that `baseurl` doesn't. In that case, yes, I see the problem. +>>>>> It's a tricky one. 
It isn't as if there's only ever going to be a fixed set of fields that need different treatment, either. Because the site admin is free to add whatever fields they like to the page template (if they aren't using the default one, that is. I'm not using the default one myself). +>>>>> Mind you, for trusted sources, since the person writing the page template and the person providing the variable are the same, they themselves would know whether the value will be treated as HTML, plain text, or a URL, and thus could do the needed escaping themselves when writing down the value. + +>>>>> Looking at the content of the default `page.tmpl` let's see what variables fall into which categories: + +>>>>> * **Used as URL:** BASEURL, EDITURL, PARENTLINKS->URL, RECENTCHANGESURL, HISTORYURL, GETSOURCEURL, PREFSURL, OTHERLANGUAGES->URL, ADDCOMMENTURL, BACKLINKS->URL, MORE_BACKLINKS->URL +>>>>> * **Used as part of a URL:** FAVICON, LOCAL_CSS +>>>>> * **Needs to be HTML-escaped:** TITLE +>>>>> * **Used as-is (as HTML):** FEEDLINKS, RELVCS, META, PERCENTTRANSLATED, SEARCHFORM, COMMENTSLINK, DISCUSSIONLINK, OTHERLANGUAGES->PERCENT, SIDEBAR, CONTENT, COMMENTS, TAGS->LINK, COPYRIGHT, LICENSE, MTIME, EXTRAFOOTER + +>>>>> This looks as if only TITLE needs HTML-escaping all the time, and that the URLS all end with "URL" in their name. Unfortunately the FAVICON and LOCAL_CSS which are part of URLS don't have "URL" in their name, though that's fair enough, since they aren't full URLs. + +>>>>> --K.A. + +>>>> One reasonable option would be to declare that `field` takes text-valued +>>>> fields, in which case either consumers need to escape +>>>> it with `<TMPL_VAR FIELD_FOO ESCAPE=HTML>`, and not interpret it as a URL +>>>> without first checking `safeurl`), or the pagetemplate hook needs to +>>>> pre-escape. + +>>>>> Since HTML::Template does have the ability to do ESCAPE=HTML/URL/JS, why not take advantage of that? Some things, like TITLE, probably should have ESCAPE=HTML all the time; that would solve the "to escape or not to escape" problem that `meta` has with titles. After all, when one *sorts* by title, one doesn't really want HTML-escaping in it; only when one uses it in a template. -- K.A. + +>>>> Another reasonable option would be to declare that `field` takes raw HTML, +>>>> in which case consumers need to only use it in contexts that will be +>>>> HTML-scrubbed (but it becomes unsuitable for using as text - problematic +>>>> for text-based things like sorting or URLs, and not ideal for searching). +>>>> +>>>> You could even let each consumer choose how it's going to use the field, +>>>> by having the `foo` field generate `TEXT_FOO` and `HTML_FOO` variables? +>>>> --[[smcv]] + +>>>>> Something similar is already done in `template` and `ftemplate` with the `raw_` prefix, which determines whether the variable should have `htmlize` run over it first before the value is applied to the template. Of course, that isn't scrubbing or escaping, because with those templates, the scrubbing is done afterwards as part of the normal processing. + +>>> Another problem, as you point out, is special-case fields, such as a number of +>>> those defined by `meta`, which have side-effects associated with them, more +>>> than just providing a value to pagetemplate. Perhaps `meta` should deal with +>>> the side-effects, but use `field` as an interface to get the values of those special fields. 
+ +>>> --[[KathrynAndersen]] + +----- + +I was just looking at HTML5 and wondered if the field plugin should generate the new Microdata tags (as well as the internal structures)? <http://slides.html5rocks.com/#slide19> -- [[Will]] + +> This could just as easily be done as a separate plugin. Feel free to do so. --[[KathrynAndersen]] diff --git a/doc/plugins/contrib/flattr.mdwn b/doc/plugins/contrib/flattr.mdwn new file mode 100644 index 000000000..e9b4bf857 --- /dev/null +++ b/doc/plugins/contrib/flattr.mdwn @@ -0,0 +1,48 @@ +[[!template id=plugin name=flattr author="[[jaywalk]]"]] + +[flattr.com](http://flattr.com/) is a flatrate micropayment service, which revolves around the idea of having flattr buttons everywhere that people visiting your site can use to "flattr" you. + +This plugin makes it easier to put flattr buttons in ikiwiki. It supports both the static kind as well as the counting dynamic javascript version. The dynamic version does not work if [[htmlscrubber|/plugins/htmlscrubber]] is active on the page. + +The dynamic button does not require creation of the page on flattr before being added to a page, the static one does. + +I wrote some notes on [jonatan.walck.se](http://jonatan.walck.se/software/ikiwiki/plugin/flattr/) and put the source here: [flattr.pm](http://jonatan.walck.se/software/ikiwiki/flattr.pm) + +This plugin is licensed under [CC0](http://creativecommons.org/publicdomain/zero/1.0/) (public domain). + +Note that there is now a [[plugins/flattr]] plugin bundled with ikiwiki. It +is less configurable, not supporting static buttons, but simpler to use. + +# Usage # + + # [[!flattr args]] where args are in the form of arg=value. + # Possible args: + # type - static or dynamic. Defaults to static. + + # vars in static mode: + # -------------------- + # Required: + # url - URL to flattr page, + # e.g. http://flattr.com/thing/1994/jaywalks-weblog + # Optional: + # style - Set to compact for compact button. + + # vars in dynamic mode: + # --------------------- + # Required: + # None. + # Optional: + # uid - Set the default in the plugin, override if needed. + # title - The title defaults to $wikiname/some/path (like on the top of + # the wiki). + # desc - A description of the content. Defaults to " ". + # cat - Category, this can be text, images, video, audio, software or + # rest. Defaults to text. + # lang - Language, list of available choises is on + # https://flattr.com/support/integrate/languages. Defaults to en_GB. + # tag - A list of comma separated tags. Empty per default. + # url - URL to thing to flattred, + # e.g. http://jonatan.walck.se/weblog + # style - Set it to compact to get the small button, big for any other + # value including empty. + diff --git a/doc/plugins/contrib/flattr/discussion.mdwn b/doc/plugins/contrib/flattr/discussion.mdwn new file mode 100644 index 000000000..586139e9c --- /dev/null +++ b/doc/plugins/contrib/flattr/discussion.mdwn @@ -0,0 +1,9 @@ +FWIW, it is possible for a plugin like this to add javascript to pages that +are protected by htmlscrubber. Just return a token in your preprocess hook, +and then have a format hook that replaces the token with the javascript. +--[[Joey]] + +> Thanks, That's good to know. I'll try to continue the development of this +> plugin later, for now I just needed it to work. :) It will most likely +> evolve as my page does too. 
+> --[[jaywalk]] diff --git a/doc/plugins/contrib/ftemplate.mdwn b/doc/plugins/contrib/ftemplate.mdwn new file mode 100644 index 000000000..d82867f94 --- /dev/null +++ b/doc/plugins/contrib/ftemplate.mdwn @@ -0,0 +1,25 @@ +[[!template id=plugin name=ftemplate author="[[rubykat]]"]] +[[!tag type/meta type/format]] + +This plugin provides the [[ikiwiki/directive/ftemplate]] directive. + +This is like the [[ikiwiki/directive/template]] directive, with the addition +that one does not have to provide all the values in the call to the template, +because ftemplate can query structured data ("fields") using the [[field]] +plugin. + +## Activate the plugin + + add_plugins => [qw{goodstuff ftemplate ....}], + +## PREREQUISITES + + IkiWiki + IkiWiki::Plugin::field + HTML::Template + Encode + +## DOWNLOAD + +* browse at GitHub: <http://github.com/rubykat/ikiplugins/blob/master/IkiWiki/Plugin/ftemplate.pm> +* git repo at git://github.com/rubykat/ikiplugins.git diff --git a/doc/plugins/contrib/ftemplate/discussion.mdwn b/doc/plugins/contrib/ftemplate/discussion.mdwn new file mode 100644 index 000000000..1e0bca5d8 --- /dev/null +++ b/doc/plugins/contrib/ftemplate/discussion.mdwn @@ -0,0 +1,33 @@ +I initially thought this wasn't actually necessary - the combination +of [[plugins/template]] with [[plugins/contrib/field]]'s `pagetemplate` +hook ought to provide the same functionality. However, `template` +doesn't run `pagetemplate` hooks; a more general version of this +plugin would be to have a variant of `template` that runs `pagetemplate` +hooks (probably easiest to just patch `template` to implement a +second directive, or have a special parameter `run_hooks="yes"`, +or something). + +> I got the impression that `pagetemplate` hooks are intended to be completely independent of `template` variables; page-template is for the actual `page.tmpl` template, while `template` is for other templates which are used inside the page content. So I don't understand why one would need a run_hooks option. --[[KathrynAndersen]] + +>> `Render`, `inline`, `comments` and `recentchanges` run `pagetemplate` +>> hooks, as does anything that uses `IkiWiki::misctemplate`. From that +>> quick survey, it seems as though `template` is the only thing that +>> uses `HTML::Template` but *doesn't* run `pagetemplate` hooks? +>> +>> It just seems strange to me that `field` needs to have its own +>> variant of `template` (this), its own variant of `inline` (`report`), +>> and so on - I'd tend to lean more towards having `field` +>> enhance the existing plugins. I'm not an ikiwiki committer, +>> mind... Joey, your opinion would be appreciated! --[[smcv]] + +>>> I did it that way basically because I needed the functionality ASAP, and I didn't want to step on anyone's toes, so I made them as separate plugins. If Joey wants to integrate the functionality into IkiWiki proper, I would be very happy, but I don't want to put pressure on him. --[[KathrynAndersen]] + +Another missing thing is that `ftemplate` looks in +the "system" templates directories, not just in the wiki, but that +seems orthogonal (and might be a good enhancement to `template` anyway). +--[[smcv]] + +> Yes, I added that because I wanted the option of not having to make all my templates work as wiki pages also. --[[KathrynAndersen]] + +>> Joey has added support for +>> [[todo/user-defined_templates_outside_the_wiki]] now. 
--s diff --git a/doc/plugins/contrib/ftemplate/ikiwiki/directive/ftemplate.mdwn b/doc/plugins/contrib/ftemplate/ikiwiki/directive/ftemplate.mdwn new file mode 100644 index 000000000..3009fc830 --- /dev/null +++ b/doc/plugins/contrib/ftemplate/ikiwiki/directive/ftemplate.mdwn @@ -0,0 +1,106 @@ +The `ftemplate` directive is supplied by the [[!iki plugins/contrib/ftemplate desc=ftemplate]] plugin. + +This is like the [[ikiwiki/directive/template]] directive, with the addition +that one does not have to provide all the values in the call to the template, +because ftemplate can query structured data ("fields") using the +[[plugins/contrib/field]] plugin. + +Templates are files that can be filled out and inserted into pages in +the wiki, by using the ftemplate directive. The directive has an id +parameter that identifies the template to use. + +Additional parameters can be used to fill out the template, in +addition to the "field" values. Passed-in values override the +"field" values. + +There are two places where template files can live. One is in the /templates +directory on the wiki. These templates are wiki pages, and can be edited from +the web like other wiki pages. + +The second place where template files can live is in the global +templates directory (the same place where the page.tmpl template lives). +This is a useful place to put template files if you want to prevent +them being edited from the web, and you don't want to have to make +them work as wiki pages. + +### EXAMPLES + +#### Example 1 + +PageA: + + \[[!meta title="I Am Page A"]] + \[[!meta description="A is for Apple."]] + \[[!meta author="Fred Nurk"]] + \[[!ftemplate id="mytemplate"]] + +Template "mytemplate": + + # <TMPL_VAR NAME="TITLE"> + by <TMPL_VAR NAME="AUTHOR"> + + **Summary:** <TMPL_VAR NAME="DESCRIPTION"> + +This will give: + + <h1>I Am Page A</h1> + <p>by Fred Nurk</p> + <p><strong>Summary:</strong> A is for Apple. + +#### Example 2: Overriding values + +PageB: + + \[[!meta title="I Am Page B"]] + \[[!meta description="B is for Banana."]] + \[[!meta author="Fred Nurk"]] + \[[!ftemplate id="mytemplate" title="Bananananananas"]] + +This will give: + + <h1>Bananananananas</h1> + <p>by Fred Nurk</p> + <p><strong>Summary:</strong> B is for Banana. + +#### Example 3: Loops + +(this example uses the [[plugins/contrib/ymlfront]] plugin) + +Page C: + + --- + BookAuthor: Georgette Heyer + BookTitle: Black Sheep + BookGenre: + - Historical + - Romance + --- + \[[ftemplate id="footemplate"]] + + I like this book. + +Template "footemplate": + + # <TMPL_VAR BOOKTITLE> + by <TMPL_VAR BOOKAUTHOR> + + <TMPL_IF BOOKGENRE>( + <TMPL_LOOP GENRE_LOOP><TMPL_VAR BOOKGENRE> + <TMPL_UNLESS __last__>, </TMPL_UNLESS> + </TMPL_LOOP> + )</TMPL_IF> + +This will give: + + <h1>Black Sheep</h1> + <p>by Georgette Heyer</p> + + <p>(Historical, Romance)</p> + + <p>I like this book.</p> + +### LIMITATIONS + +One cannot query the values of fields on pages other than the current +page. If you want to do that, check out the [[plugins/contrib/report]] +plugin. 
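+
+If you only need one or two values from another page, it may be enough to
+pass them in explicitly by combining this directive with the
+[[plugins/contrib/getfield]] markup (an untested sketch, reusing "PageA" and
+"mytemplate" from Example 1):
+
+    \[[!ftemplate id="mytemplate" title="{{$PageA#title}}"]]
+
+This should work because passed-in values override the "field" values, and
+the getfield markup is deliberately not directive markup, so it can be used
+inside other directives.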
diff --git a/doc/plugins/contrib/getfield.mdwn b/doc/plugins/contrib/getfield.mdwn new file mode 100644 index 000000000..0a92894f1 --- /dev/null +++ b/doc/plugins/contrib/getfield.mdwn @@ -0,0 +1,131 @@ +[[!template id=plugin name=getfield author="[[rubykat]]"]] +[[!tag type/meta type/format]] +[[!toc]] +## NAME + +IkiWiki::Plugin::getfield - query the values of fields + +## SYNOPSIS + + # activate the plugin + add_plugins => [qw{goodstuff getfield ....}], + +## DESCRIPTION + +This plugin provides a way of querying the meta-data (data fields) of a page +inside the page content (rather than inside a template) This provides a way to +use per-page structured data, where each page is treated like a record, and the +structured data are fields in that record. This can include the meta-data for +that page, such as the page title. + +This plugin is meant to be used in conjunction with the [[field]] plugin. + +### USAGE + +One can get the value of a field by using special markup in the page. +This does not use directive markup, in order to make it easier to +use the markup inside other directives. There are four forms: + +* {{$*fieldname*}} + + This queries the value of *fieldname* for the source page. + + For example: + + \[[!meta title="My Long and Complicated Title With Potential For Spelling Mistakes"]] + # {{$title}} + + When the page is processed, this will give you: + + <h1>My Long and Complicated Title With Potential For Spelling Mistakes</h1> + +* {{$*pagename*#*fieldname*}} + + This queries the value of *fieldname* for the page *pagename*. + + For example: + + On PageFoo: + + \[[!meta title="I Am Page Foo"]] + + Stuff about Foo. + + On PageBar: + + For more info, see \[[{{$PageFoo#title}}|PageFoo]]. + + When PageBar is displayed: + + <p>For more info, see <a href="PageFoo">I Am Page Foo</a>.</p> + +* {{+$*fieldname*+}} + + This queries the value of *fieldname* for the destination page; that is, + the value when this page is included inside another page. + + For example: + + On PageA: + + \[[!meta title="I Am Page A"]] + # {{+$title+}} + + Stuff about A. + + On PageB: + + \[[!meta title="I Am Page B"]] + \[[!inline pagespec="PageA"]] + + When PageA is displayed: + + <h1>I Am Page A</h1> + <p>Stuff about A.</p> + + When PageB is displayed: + + <h1>I Am Page B</h1> + <p>Stuff about A.</p> + +* {{+$*pagename*#*fieldname*+}} + + This queries the value of *fieldname* for the page *pagename*; the + only difference between this and {{$*pagename*#*fieldname*}} is + that the full name of *pagename* is calculated relative to the + destination page rather than the source page. + + I can't really think of a reason why this should be needed, but + this format has been added for completeness. + +### No Value Found + +If no value is found for the given field, then the field name is returned. + +For example: + +On PageFoo: + + \[[!meta title="Foo"]] + My title is {{$title}}. + + My description is {{$description}}. + +When PageFoo is displayed: + + <p>My title is Foo.</p> + + <p>My description is description.</p> + +This is because "description" hasn't been defined for that page. 
+ +### More Examples + +Listing all the sub-pages of the current page: + + \[[!map pages="{{$page}}/*"]] + +## DOWNLOAD + +* browse at GitHub: <http://github.com/rubykat/ikiplugins/blob/master/IkiWiki/Plugin/getfield.pm> +* git repo at git://github.com/rubykat/ikiplugins.git diff --git a/doc/plugins/contrib/getfield/discussion.mdwn b/doc/plugins/contrib/getfield/discussion.mdwn new file mode 100644 index 000000000..5f7fffead --- /dev/null +++ b/doc/plugins/contrib/getfield/discussion.mdwn @@ -0,0 +1,32 @@ +## Templating, and other uses + +Like you mentioned in [[ftemplate]] IIRC, it'll only work on the same page. If it can be made to work anywhere, or from a specific place in the wiki - configurable, possibly - you'll have something very similar to mediawiki's templates. I can already think of a few uses for this combined with [[template]] ;) . --[[SR|users/simonraven]] + +> Yes, I mentioned "only current page" in the "LIMITATIONS" section. + +> What do you think would be a good syntax for querying other pages? +> It needs to resolve to a single page, though I guess using "bestlink" to find the closest page would mean that one didn't have to spell out the whole page. + +>> I don't know the internals very well, I think that's how other plugins do it. *goes to check* Usually it's a `foreach` loop, and use a `pagestate{foo}` to check the page's status/state. There's also some stuff like 'pagespec_match_list($params{page}` ... they do slightly different thing depending on need. --[[SR|users/simonraven]] + +>>> No, I meant what markup I should use; the actual implementation probably wouldn't be too difficult. + +>>> The current markup is {{$*fieldname*}}; what you're wanting, perhaps it should be represented like {{$*pagename*:*fieldname*}}, or {{$*pagename*::*fieldname*}} or something else... +>>> -- [[KathrynAndersen]] + +>>>> Oh. Hmm. I like your idea actually, or alternately, in keeping more with other plugins, doing it like {{pagename/fieldname}}. The meaning of the separator is less clear with /, but avoids potential issues with filename clashes that have a colon in them. It also keeps a certain logic - at least to me. Either way, I think both are good choices. [[SR|users/simonraven]] + +>>>>> What about using {{pagename#fieldname}}? The meaning of the hash in URLs sort of fits with what is needed here (reference to a 'named' thing within the page) and it won't conflict with actual hash usages (unless we expect different named parts of pages to define different values for the same field ...) +>>>>> -- [[Oblomov]] +>>>>>> That's a good one too. --[[simonraven]] +>>>>>>> Done! I used {{$*pagename*#*fieldname*}} for the format. -- [[users/KathrynAndersen]] + + +> I'm also working on a "report" plugin, which will basically apply a template like [[ftemplate]] does, but to a list of pages given from a pagespec, rather than the current page. + +> -- [[users/KathrynAndersen]] + +>> Ooh, sounds nice :) . --[[SR|users/simonraven]] + +>>> I've now released the [[plugins/contrib/report]] plugin. I've been using it on my site; the holdup on releasing was because I hadn't yet written the docs for it. I hope you find it useful. 
+>>> -- [[users/KathrynAndersen]]
diff --git a/doc/plugins/contrib/groupfile.mdwn b/doc/plugins/contrib/groupfile.mdwn
new file mode 100644
index 000000000..e5c0ded42
--- /dev/null
+++ b/doc/plugins/contrib/groupfile.mdwn
@@ -0,0 +1,105 @@
+[[!template id=plugin name=groupfile core=0 author="[[Jogo]]"]]
+
+This plugin adds a `group(groupname)` function to [[ikiwiki/PageSpec]], which is true
+only if the current user is a member of the group named `groupname`.
+
+Group memberships are read from a file. The syntax of this file is very close to
+the usual Unix `/etc/passwd` file: the group's name, followed by a colon, followed by
+a comma-separated list of user names. For example:
+
+    dev:toto,foo
+    i18n:zorba
+
+-----
+
+    #!/usr/bin/perl
+    # GroupFile plugin.
+    # by Joseph Boudou <jogo at matabio dot net>
+
+    package IkiWiki::Plugin::groupfile;
+
+    use warnings;
+    use strict;
+    use IkiWiki 3.00;
+
+    sub import {
+        hook(type => 'getsetup', id => 'groups', call => \&get_setup);
+    }
+
+    sub get_setup () {
+        return (
+            plugin => {
+                safe => 0,
+                rebuild => 0,
+            },
+            group_file => {
+                type => 'string',
+                example => '/etc/ikiwiki/group',
+                description => 'group file location',
+                safe => 0,
+                rebuild => 0,
+            },
+        );
+    }
+
+    my $users_of = 0;
+
+    sub get_groups () {
+        if (not $users_of) {
+
+            if (not defined $config{group_file}) {
+                return 'group_file option not set';
+            }
+
+            open my $file, '<', $config{group_file}
+                or return 'Unable to open group_file';
+
+            $users_of = {};
+            READ:
+            while (<$file>) {
+                next READ if (/^\s*$/);
+
+                if (/^(\w+):([\w,]+)/) {
+                    %{ $users_of->{$1} } = map { $_ => 1 } split /,/, $2;
+                }
+                else {
+                    $users_of = "Error at group_file:$.";
+                    last READ;
+                }
+            }
+
+            close $file;
+        }
+
+        return $users_of;
+    }
+
+    package IkiWiki::PageSpec;
+
+    sub match_group ($$;@) {
+        shift;
+        my $group = shift;
+        my %params = @_;
+
+        if (not exists $params{user}) {
+            return IkiWiki::ErrorReason->new('no user specified');
+        }
+        if (not defined $params{user}) {
+            return IkiWiki::FailReason->new('not logged in');
+        }
+
+        my $users_of = IkiWiki::Plugin::groupfile::get_groups();
+        if (not ref $users_of) {
+            return IkiWiki::ErrorReason->new($users_of);
+        }
+
+        if (exists $users_of->{$group}{ $params{user} }) {
+            return IkiWiki::SuccessReason->new("user is member of $group");
+        }
+        else {
+            return IkiWiki::FailReason->new(
+                "user $params{user} isn't member of $group");
+        }
+    }
+
+    1
diff --git a/doc/plugins/contrib/highlightcode.mdwn b/doc/plugins/contrib/highlightcode.mdwn
index 8abb76583..f1df204bb 100644
--- a/doc/plugins/contrib/highlightcode.mdwn
+++ b/doc/plugins/contrib/highlightcode.mdwn
@@ -1,6 +1,8 @@
 [[!template id=plugin name=highlightcode author="[[sabr]]"]]
 [[!tag type/format]]
+(An alternative to this plugin, [[plugins/highlight]], is now provided with IkiWiki. --[[smcv]])
+
 A small plugin to allow Ikiwiki to display source files complete with syntax highlighting. Files with recognized extensions (i.e. my-file.cpp) are be rendered just like any other Ikiwiki page. You can even edit your source files with Ikiwiki's editor. It uses the Syntax::Highlight::Engine::Kate Perl module to do the highlighting.
diff --git a/doc/plugins/contrib/ikiwiki/directive/ymlfront.mdwn b/doc/plugins/contrib/ikiwiki/directive/ymlfront.mdwn
new file mode 100644
index 000000000..bb4a58fc6
--- /dev/null
+++ b/doc/plugins/contrib/ikiwiki/directive/ymlfront.mdwn
@@ -0,0 +1,17 @@
+The `ymlfront` directive is supplied by the [[!iki plugins/contrib/ymlfront desc=ymlfront]] plugin.
+ +This directive allows the user to define arbitrary meta-data in YAML format. + + \[[!ymlfront data=""" + foo: fooness + bar: The Royal Pigeon + baz: 2 + """]] + +There is one argument to this directive. + +* **data:** + The YAML-format data. This should be enclosed inside triple-quotes to preserve the data correctly. + +If more than one ymlfront directive is given per page, the result is undefined. +Likewise, it is inadvisable to try to mix the "---" ymlfront format with the directive form of the data. diff --git a/doc/plugins/contrib/linguas.mdwn b/doc/plugins/contrib/linguas.mdwn index bf502606e..84ece042e 100644 --- a/doc/plugins/contrib/linguas.mdwn +++ b/doc/plugins/contrib/linguas.mdwn @@ -10,7 +10,7 @@ Download: [linguas.pm](http://ettin.org/pub/ikiwiki/linguas.pm) (2006-08-21). Note that even though it is still available for download, this plugin is no longer actively maintained. If you are interested in multilingual wiki pages, you -can also take a look at other approaches such as [[todo/l10n]], [[plugins/contrib/po]], +can also take a look at other approaches such as [[todo/l10n]], [[plugins/po]], or Lars Wirzenius's [Static website, with translations, using IkiWiki](http://liw.iki.fi/liw/log/2007-05.html#20070528b). diff --git a/doc/plugins/contrib/mailbox/discussion.mdwn b/doc/plugins/contrib/mailbox/discussion.mdwn index 00fb0c05f..9520fdd70 100644 --- a/doc/plugins/contrib/mailbox/discussion.mdwn +++ b/doc/plugins/contrib/mailbox/discussion.mdwn @@ -3,3 +3,6 @@ For some reason, `git fetch` from http://pivot.cs.unb.ca/git/ikimailbox.git/ didn't work very smoothly for me: it hung, and I had to restart it 3 times before the download was complete. I'm writing this just to let you know that there might be some problems with such connections to your http-server. --Ivan Z. +> I can't replicate this (two months later!) +> I can suggest trying the git:// url for download if you can. +> Also, if you really want to get my attention, send me email [[DavidBremner]] diff --git a/doc/plugins/contrib/mediawiki.mdwn b/doc/plugins/contrib/mediawiki.mdwn index 7bf1ba0df..13c2d04b2 100644 --- a/doc/plugins/contrib/mediawiki.mdwn +++ b/doc/plugins/contrib/mediawiki.mdwn @@ -1,7 +1,10 @@ [[!template id=plugin name=mediawiki author="[[sabr]]"]] [[!tag type/format]] -[The Mediawiki plugin](http://u32.net/Mediawiki_Plugin/) allows ikiwiki to -process pages written using MediaWiki markup. +The Mediawiki plugin allows ikiwiki to process pages written using MediaWiki +markup. -Available at <http://alcopop.org/~jon/mediawiki.pm> +Available at <http://github.com/jmtd/mediawiki.pm>. + +This plugin originally lived at <http://u32.net/Mediawiki_Plugin/>, but that +website has disappeared. diff --git a/doc/plugins/contrib/navbar/discussion.mdwn b/doc/plugins/contrib/navbar/discussion.mdwn new file mode 100644 index 000000000..0bbec743c --- /dev/null +++ b/doc/plugins/contrib/navbar/discussion.mdwn @@ -0,0 +1,2 @@ +Where can I download this plugin ? +-- [[jogo]] diff --git a/doc/plugins/contrib/pod.mdwn b/doc/plugins/contrib/pod.mdwn new file mode 100644 index 000000000..97a9c648a --- /dev/null +++ b/doc/plugins/contrib/pod.mdwn @@ -0,0 +1,38 @@ +[[!template id=plugin name=pod author="[[rubykat]]"]] +[[!tag type/format]] +## NAME + +IkiWiki::Plugin::pod - process pages written in POD format. + +## SYNOPSIS + +In the ikiwiki setup file, enable this plugin by adding it to the +list of active plugins. 
+ + add_plugins => [qw{goodstuff pod ....}], + +## DESCRIPTION + +IkiWiki::Plugin::pod is an IkiWiki plugin enabling ikiwiki to +process pages written in POD ([Plain Old Documentation](http://en.wikipedia.org/wiki/Plain_Old_Documentation)) format. +This will treat files with a `.pod` or `.pm` extension as files +which contain POD markup. + +## OPTIONS + +The following options can be set in the ikiwiki setup file. + +* **pod_index:** If true, this will generate an index (table of contents) for the page. +* **pod_toplink:** The label to be used for links back to the top of the page. If this is empty, then no top-links will be generated. + +## PREREQUISITES + + IkiWiki + Pod::Xhtml + IO::String + +## DOWNLOAD + +* browse at GitHub: <http://github.com/rubykat/ikiplugins/blob/master/IkiWiki/Plugin/pod.pm> +* git repo at git://github.com/rubykat/ikiplugins.git + diff --git a/doc/plugins/contrib/pod/discussion.mdwn b/doc/plugins/contrib/pod/discussion.mdwn new file mode 100644 index 000000000..9187b1350 --- /dev/null +++ b/doc/plugins/contrib/pod/discussion.mdwn @@ -0,0 +1,14 @@ +My one concern about this plugin is the `=for` markup in POD. + +> Some format names that formatters currently are known to +> accept include "roff", "man", "latex", "tex", "text", and "html". + +I don't know which of these [[!cpan Pod::Xhtml]] supports. If it currently +supports, or later support latex, that could be problimatic since that +could maybe be used to include files or run code. --[[Joey]] + +> I don't know, either; the documentation for [[!cpan Pod:Xhtml]] is silent on this subject. --[[KathrynAndersen]] + +>> I'm afraid the only approach is to audit the existing code in the perl +>> module(s), and then hope nothing is added to them later that opens a +>> security hole. --[[Joey]] diff --git a/doc/plugins/contrib/postal.mdwn b/doc/plugins/contrib/postal.mdwn index b2f875393..c522f8bcb 100644 --- a/doc/plugins/contrib/postal.mdwn +++ b/doc/plugins/contrib/postal.mdwn @@ -1,5 +1,5 @@ [[!template id=plugin name=postal author="[[DavidBremner]]"]] -[[!tag type/useful]] +[[!tag type/special-purpose]] The `postal` plugin allows users to send mail to a special address to comment on a page. It uses the [[mailbox]] diff --git a/doc/plugins/contrib/postal/discussion.mdwn b/doc/plugins/contrib/postal/discussion.mdwn new file mode 100644 index 000000000..4eaacc044 --- /dev/null +++ b/doc/plugins/contrib/postal/discussion.mdwn @@ -0,0 +1,24 @@ +It seems like the filter 'postal-accept.pl' I wrote doesn't refresh thoroughly enough. When a comment is added it calls + + IkiWiki::add_depends($page,$comments_page); + +And then after adding the actual comment, it ends with + + IkiWiki::refresh(); + IkiWiki::saveindex(); + +Sure enough, the page being commented on is refreshed, but not any inline pages (e.g. tags pages, blog top level) that contain it. +Is there a way to recursively refresh? Or should it work that way by default. I guess it is some part of the api that I don't understand, +since I think not many people grub about in the internals of ikiwiki this way. +It would be nice to figure this out, doing a full rebuild every time I get a blog comment is not that fun. + +[[DavidBremner]] + +> Ikiwiki currently doesn't have support for transitive dependencies. +> This is discussed deep inside [[todo/tracking_bugs_with_dependencies]] +> and in [[todo/inlines_inheriting_links]]. +> +> FYI, the [[plugins/comments]] plugin avoids this problem by only showing the +> comments on the page, and not on pages that inline it. 
--[[Joey]] +>> Ok, thanks for the speedy response. I guess I should do the same thing. +>> [[DavidBremner]] diff --git a/doc/plugins/contrib/report.mdwn b/doc/plugins/contrib/report.mdwn new file mode 100644 index 000000000..0bd5392c6 --- /dev/null +++ b/doc/plugins/contrib/report.mdwn @@ -0,0 +1,26 @@ +[[!template id=plugin name=report author="[[rubykat]]"]] +[[!tag type/meta type/format]] +IkiWiki::Plugin::report - Produce templated reports from page field data. + +This plugin provides the [[ikiwiki/directive/report]] directive. This enables +one to report on the structured data ("field" values) of multiple pages; the +output is formatted via a template. This depends on the +[[plugins/contrib/field]] plugin. + + +## Activate the plugin + + # activate the plugin + add_plugins => [qw{goodstuff report ....}], + +## PREREQUISITES + + IkiWiki + IkiWiki::Plugin::field + HTML::Template + Encode + +## DOWNLOAD + +* browse at GitHub: <http://github.com/rubykat/ikiplugins/blob/master/IkiWiki/Plugin/report.pm> +* git repo at git://github.com/rubykat/ikiplugins.git diff --git a/doc/plugins/contrib/report/discussion.mdwn b/doc/plugins/contrib/report/discussion.mdwn new file mode 100644 index 000000000..e23a4ced4 --- /dev/null +++ b/doc/plugins/contrib/report/discussion.mdwn @@ -0,0 +1,75 @@ +Wow, this plugin does a lot... it seems to be `inline` (but without the feeds +or the ability to not have `archive="yes"`), plus part of +[[plugins/contrib/trail]], plus some sorting, plus an ingenious workaround +for template evaluation being relatively stateless. + +A large part of this plugin would just fall off if one of the versions of +"[[todo/allow_plugins_to_add_sorting_methods]]" was merged, which was a +large part of the idea of that feature request :-) To make use of that +you'd have to use `pagespec_match_list` in the trail case too, but that's +easy enough - just add `list => [@the_trail_pages]` to the arguments. + +Another large part would fall off if this plugin required, and internally +invoked, `inline` (like my `comments` plugin does) - `inline` runs +`pagetemplate` hooks, and in particular, it'll run the `field` hook. +Alternatively, this plugin could invoke `pagetemplate` hooks itself, +removing the special case for `field`. + +Perhaps the `headers` thing could migrate into inline somehow? That might +lead to making inline too big, though. + +> I think inline is *already* too big, honestly. --[[KathrynAndersen]] + +>> A fair point; perhaps my complaint should be that *inline* does +>> too many orthogonal things. I suppose the headers feature wouldn't +>> really make sense in an inline that didn't have `archive="yes"`, +>> so it'd make sense to recommend this plugin as a replacement +>> for inlining with archive=yes (for which I now realise "inline" +>> is the wrong verb anyway :-) ) --s + +>>> I think *inline* would be a bit less unwieldy if there was some way of factoring out the feed stuff into a separate plugin, but I don't know if that's possible. --K.A. + +Is the intention that the `trail` part is a performance hack, or a way +to select pages? How does it relate to [[todo/wikitrails]] or +[[plugins/contrib/trail]]? --[[smcv]] + +> The `trail` part is *both* a performance hack, and a way to select pages. I have over 5000 pages on my site, I need all the performance hacks I can get. +> For the performance hack, it is a way of reducing the need to iterate through every single page in the wiki in order to find matching pages. 
+> For the way-to-select-pages, yes, it is intended to be similar to [[todo/wikitrails]] and [[plugins/contrib/trail]] (and will be more similar with the new release which will be happening soon; it will add prev_* and next_* variables). +> The idea is that, rather than having to add special "trail" links on PageA to indicate that a page is part of the trail, +> it takes advantage of the `%links` hash, which already contains, for each page, an array of the links from that page to other pages. No need for special markup, just use what's there; a trail is defined as "all the pages linked to from page X", and since it's an array, it has an order already. +> But to avoid that being too limiting, one can use a `pages=...` pagespec to filter that list to a subset; only the pages one is interested in. +> And one can also sort it, if one so desires. +> --[[KathrynAndersen]] + +>> That's an interesting approach to trails; I'd missed the fact that +>> links are already ordered. +>> +>> This does have the same problems as tags, though: see +>> [[bugs/tagged()_matching_wikilinks]] and +>> [[todo/matching_different_kinds_of_links]]. I suppose the question +>> now is whether new code should be consistent with `tag` (and +>> potentially be fixed at the same time as tag itself), or try to +>> avoid those problems? +>> +>> The combination of `trail` with another pagespec in this plugin +>> does provide a neat way for it to work around having unwanted +>> pages in the report, by limiting by a suitable tag or subdirectory +>> or something. --s + +>>> Either that, or somehow combine tagging with fields, such that one could declare a tag, and it would create both a link and a field with a given value. (I've been working on something like that, but it still has bugs). +>>> That way, the test for whether something is tagged would be something like "link(tag/foo) and field(tag foo)". +>>> --K.A. + +>>>> I can see that this'd work well for 1:1 relationships like next +>>>> and previous, but I don't think that'd work for pages with more than +>>>> one tag - as far as I can see, `field`'s data model is that each +>>>> page has no more than one value for each field? +>>>> [[todo/Matching_different_kinds_of_links]] has some thoughts about +>>>> how it could be implemented, though. --s + +>>>>> You have a point there. I'm not sure what would be better: to add the concept of arrays/sets to `field`, or to think of tags as a special case. Problem is, I find tags as they currently exist to be too limiting. I prefer something that can be used for Faceted Tagging <http://en.wikipedia.org/wiki/Faceted_classification>; that is, things like Author:Fred Nurk, Genre:Historical, Rating:Good, and so on. Of course, that doesn't mean that each tag is limited to only one value, either; just to take the above examples, something might have more than one author, or have multiple genres (such as Historical + Romance). + +>>>>> It might be that adding arrays to the `field` plugin is a good way to go: after all, even though field=value is the most common, with the flexibility of things like YAML, one could define all sorts of things. What I'm not so sure about is how to return the values when queried, since some things would be expecting scalars all the time. Ah, perhaps I could use wantarray? +>>>>> Is there a way of checking a HTML::Template template to see if it expecting an array for a particular value? 
+>>>>> --[[KathrynAndersen]] diff --git a/doc/plugins/contrib/report/ikiwiki/directive/report.mdwn b/doc/plugins/contrib/report/ikiwiki/directive/report.mdwn new file mode 100644 index 000000000..8f8e6b4e8 --- /dev/null +++ b/doc/plugins/contrib/report/ikiwiki/directive/report.mdwn @@ -0,0 +1,149 @@ +[[!toc]] +The `report` directive is supplied by the [[!iki plugins/contrib/report desc=report]] plugin. + +This enables one to report on the structured data ("field" values) of +multiple pages; the output is formatted via a template. This depends +on the [[plugins/contrib/field]] plugin. + +The pages to report on are selected by a PageSpec given by the "pages" +parameter. The template is given by the "template" parameter. +The template expects the data from a single page; it is applied +to each matching page separately, one after the other. + +Additional parameters can be used to fill out the template, in +addition to the "field" values. Passed-in values override the +"field" values. + +There are two places where template files can live. One is in the +/templates directory on the wiki. These templates are wiki pages, and +can be edited from the web like other wiki pages. + +The second place where template files can live is in the global +templates directory (the same place where the page.tmpl template lives). +This is a useful place to put template files if you want to prevent +them being edited from the web, and you don't want to have to make +them work as wiki pages. + +## OPTIONS + +**template**: The template to use for the report. + +**pages**: A PageSpec to determine the pages to report on. + +**trail**: A page or pages to use as a "trail" page. + +When a trail page is used, the matching pages are limited to (a subset +of) the pages which that page links to; the "pages" pagespec in this +case, rather than selecting pages from the entire wiki, will select +pages from within the set of pages given by the trail page. + +Additional space-separated trail pages can be given in this option. +For example: + + trail="animals/cats animals/dogs" + +This will take the links from both the "animals/cats" page and the +"animals/dogs" page as the set of pages to apply the PageSpec to. + +**sort**: A SortSpec to determine how the matching pages should be sorted. + +**here_only**: Report on the current page only. + +This is useful in combination with "prev_" and "next_" variables to +make a navigation trail. +If the current page doesn't match the pagespec, then no pages will +be reported on. + +### Headers + +An additional option is the "headers" option. This is a space-separated +list of field names which are to be used as headers in the report. This +is a way of getting around one of the limitations of HTML::Template, that +is, not being able to do tests such as +"if this-header is not equal to previous-header". + +Instead, that logic is performed inside the plugin. The template is +given parameters "HEADER1", "HEADER2" and so on, for each header. +If the value of a header field is the same as the previous value, +then HEADER**N** is set to be empty, but if the value of the header +field is new, then HEADER**N** is given that value. + +#### Example + +Suppose you're writing a blog in which you record "moods", and you +want to display your blog posts by mood. 
+ + \[[!report template="mood_summary" + pages="blog/*" + sort="Mood Date title" + headers="Mood"]] + +The "mood_summary" template might be like this: + + <TMPL_IF NAME="HEADER1"> + ## <TMPL_VAR NAME="HEADER1"> + </TMPL_IF> + ### <TMPL_VAR NAME="TITLE"> + (<TMPL_VAR NAME="DATE">) \[[<TMPL_VAR NAME="PAGE">]] + <TMPL_VAR NAME="DESCRIPTION"> + +### Advanced Options + +The following options are used to improve efficiency when dealing +with large numbers of pages; most people probably won't need them. + +**doscan**: + +Whether this report should be called in "scan" mode; if it is, then +the pages which match the pagespec are added to the list of links from +this page. This can be used by *another* report by setting this +page to be a "trail" page in *that* report. +It is not possible to use "trail" and "doscan" at the same time. +By default, "doscan" is false. + +## TEMPLATE PARAMETERS + +The templates are in HTML::Template format, just as [[plugins/template]] and +[[ftemplate]] are. The parameters passed in to the template are as follows: + +### Fields + +The structured data from the current matching page. This includes +"title" and "description" if they are defined. + +### Common values + +Values known for all pages: "page", "destpage". Also "basename" (the +base name of the page). + +### Passed-in values + +Any additional parameters to the report directive are passed to the +template; a parameter will override the matching "field" value. +For example, if you have a "Mood" field, and you pass Mood="bad" to +the report, then that will be the Mood which is given for the whole +report. + +Generally this is useful if one wishes to make a more generic +template and hide or show portions of it depending on what +values are passed in the report directive call. + +For example, one could have a "hide_mood" parameter which would hide +the "Mood" section of your template when it is true, which one could +use when the Mood is one of the headers. + +### Prev_ And Next_ Items + +Any of the above variables can be prefixed with "prev_" or "next_" +and that will give the previous or next value of that variable; that is, +the value from the previous or next page that this report is reporting on. +This is mainly useful for a "here_only" report. + +### Headers + +See the section on Headers. + +### First and Last + +If this is the first page-record in the report, then "first" is true. +If this is the last page-record in the report, then "last" is true. diff --git a/doc/plugins/contrib/tracking.mdwn b/doc/plugins/contrib/tracking.mdwn new file mode 100644 index 000000000..06d4120cd --- /dev/null +++ b/doc/plugins/contrib/tracking.mdwn @@ -0,0 +1,30 @@ +[[!template id=plugin name=tracking author="[[BerndZeimetz]]"]] +[[!toc]] +[[!tag plugins]] [[!tag patch]] [[!tag wishlist]] + +## NAME + +IkiWiki::Plugin::tracking - enable google/piwik visitor tracking + +## SYNOPSIS + + # activate the plugin + add_plugins => [qw{goodstuff tracking ....}], + + # to use Piwik: + piwik_id => '1', + piwik_https_url => "https://ssl.example.com/piwik/", + piwik_http_url => "http://www.example.com/piwik/", + + # to use Google Analytics: + google_analytics_id => "UA-xxxxxx-x" + +## DESCRIPTION + +This plugin includes the necessary tracking codes for Piwik and/or Google Analytics on all pages. Tracking codes will only be included if the necessary config options are set. The plugin could be enhanced to support goals/profiles and similar things, but I do not plan to do so. 
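+
+As a rough illustration of how a plugin like this can be wired up (this is
+only a sketch; the actual code in the branch below may be organised
+differently, and the `google_analytics_id` template parameter name is an
+assumption), a `format` hook can append the rendered tracking template to
+each page just before the closing `</body>` tag. Only the Google Analytics
+half is sketched here; Piwik would work the same way with `piwik.tmpl`:
+
+    package IkiWiki::Plugin::tracking;
+
+    use warnings;
+    use strict;
+    use IkiWiki 3.00;
+
+    sub import {
+        hook(type => "format", id => "tracking", call => \&format);
+    }
+
+    sub format (@) {
+        my %params=@_;
+        my $content=$params{content};
+
+        # Only include tracking code when an id has been configured.
+        if (defined $config{google_analytics_id}) {
+            my $template=template("google_analytics.tmpl");
+            $template->param(google_analytics_id => $config{google_analytics_id});
+            my $code=$template->output;
+            # Insert the snippet just before the closing body tag.
+            $content=~s!</body>!$code</body>!i;
+        }
+
+        return $content;
+    }
+
+    1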
+ +## DOWNLOAD + +* single files: [tracking.pm](http://git.recluse.de/?p=users/bzed/ikiwiki.git;a=blob;f=IkiWiki/Plugin/tracking.pm;hb=refs/heads/tracking) [piwik.tmpl](http://git.recluse.de/?p=users/bzed/ikiwiki.git;a=blob;f=templates/piwik.tmpl;hb=refs/heads/tracking) [google_analytics.tmpl](http://git.recluse.de/?p=users/bzed/ikiwiki.git;a=blob;f=templates/google_analytics.tmpl;hb=refs/heads/tracking) +* browse repository: <http://git.recluse.de/?p=users/bzed/ikiwiki.git;a=shortlog;h=refs/heads/tracking> +* git repo: `git://git.recluse.de/users/bzed/ikiwiki.git` or <http://git.recluse.de/repos/users/bzed/ikiwiki.git> (Use the tracking branch) diff --git a/doc/plugins/contrib/unixauth.mdwn b/doc/plugins/contrib/unixauth.mdwn index 6108ebfae..c97312b59 100644 --- a/doc/plugins/contrib/unixauth.mdwn +++ b/doc/plugins/contrib/unixauth.mdwn @@ -1,6 +1,8 @@ [[!template id=plugin name=unixauth core=0 author="[[schmonz]]"]] [[!tag type/auth]] +[[!template id=gitbranch branch=unixauth author="[[schmonz]]"]] + This plugin authenticates users against the Unix user database. It presents a similar UI to [[plugins/passwordauth]], but simpler, as there's no need to be able to register or change one's password. To authenticate, either [checkpassword](http://cr.yp.to/checkpwd.html) or [pwauth](http://www.unixpapa.com/pwauth/) must be installed and configured. `checkpassword` is strongly preferred. If your web server runs as an unprivileged user -- as it darn well should! -- then `checkpassword` needs to be setuid root. (Or your ikiwiki CGI wrapper, I guess, but don't do that.) Other checkpassword implementations are available, notably [checkpassword-pam](http://checkpasswd-pam.sourceforge.net/). @@ -17,205 +19,3 @@ __Security__: [As with passwordauth](/security/#index14h2), be wary of sending u `unixauth` needs the `HTTPS` environment variable, available in ikiwiki 2.67 or later (fixed in #[502047](http://bugs.debian.org/502047)), without which it fails closed. The plugin has not been tested with newer versions of ikiwiki. [[schmonz]] hopes to have time to polish this plugin soon. - -[[!toggle id="code" text="unixauth.pm"]] - -[[!toggleable id="code" text=""" - - #!/usr/bin/perl - # Ikiwiki unixauth authentication. - package IkiWiki::Plugin::unixauth; - - use warnings; - use strict; - use IkiWiki 2.00; - - sub import { - hook(type => "getsetup", id => "unixauth", call => \&getsetup); - hook(type => "formbuilder_setup", id => "unixauth", - call => \&formbuilder_setup); - hook(type => "formbuilder", id => "unixauth", - call => \&formbuilder); - hook(type => "sessioncgi", id => "unixauth", call => \&sessioncgi); - } - - sub getsetup () { - return - unixauth_type => { - type => "string", - example => "checkpassword", - description => "type of authenticator; can be 'checkpassword' or 'pwauth'", - safe => 0, - rebuild => 1, - }, - unixauth_command => { - type => "string", - example => "/path/to/checkpassword", - description => "full path and any arguments", - safe => 0, - rebuild => 1, - }, - unixauth_requiressl => { - type => "boolean", - example => "1", - description => "require SSL? strongly recommended", - safe => 0, - rebuild => 1, - }, - plugin => { - description => "Unix user authentication", - safe => 0, - rebuild => 1, - }, - } - - # Checks if a string matches a user's password, and returns true or false. - sub checkpassword ($$;$) { - my $user=shift; - my $password=shift; - my $field=shift || "password"; - - # It's very important that the user not be allowed to log in with - # an empty password! 
- if (! length $password) { - return 0; - } - - my $ret=0; - if (! exists $config{unixauth_type}) { - # admin needs to carefully think over his configuration - return 0; - } - elsif ($config{unixauth_type} eq "checkpassword") { - open UNIXAUTH, "|$config{unixauth_command} true 3<&0" or die("Could not run $config{unixauth_type}"); - print UNIXAUTH "$user\0$password\0Y123456\0"; - close UNIXAUTH; - $ret=!($?>>8); - } - elsif ($config{unixauth_type} eq "pwauth") { - open UNIXAUTH, "|$config{unixauth_command}" or die("Could not run $config{unixauth_type}"); - print UNIXAUTH "$user\n$password\n"; - close UNIXAUTH; - $ret=!($?>>8); - } - else { - # no such authentication type - return 0; - } - - if ($ret) { - my $userinfo=IkiWiki::userinfo_retrieve(); - if (! length $user || ! defined $userinfo || - ! exists $userinfo->{$user} || ! ref $userinfo->{$user}) { - IkiWiki::userinfo_setall($user, { - 'email' => '', - 'regdate' => time, - }); - } - } - - return $ret; - } - - sub formbuilder_setup (@) { - my %params=@_; - - my $form=$params{form}; - my $session=$params{session}; - my $cgi=$params{cgi}; - - # if not under SSL, die before even showing a login form, - # unless the admin explicitly says it's fine - if (! exists $config{unixauth_requiressl}) { - $config{unixauth_requiressl} = 1; - } - if ($config{unixauth_requiressl}) { - if ((! $config{sslcookie}) || (! exists $ENV{'HTTPS'})) { - die("SSL required to login. Contact your administrator.<br>"); - } - } - - if ($form->title eq "signin") { - $form->field(name => "name", required => 0); - $form->field(name => "password", type => "password", required => 0); - - if ($form->submitted) { - my $submittype=$form->submitted; - # Set required fields based on how form was submitted. - my %required=( - "Login" => [qw(name password)], - ); - foreach my $opt (@{$required{$submittype}}) { - $form->field(name => $opt, required => 1); - } - - # Validate password against name for Login. - if ($submittype eq "Login") { - $form->field( - name => "password", - validate => sub { - checkpassword($form->field("name"), shift); - }, - ); - } - - # XXX is this reachable? looks like no - elsif ($submittype eq "Login") { - $form->field( - name => "name", - validate => sub { - my $name=shift; - length $name && - IkiWiki::userinfo_get($name, "regdate"); - }, - ); - } - } - else { - # First time settings. 
- $form->field(name => "name"); - if ($session->param("name")) { - $form->field(name => "name", value => $session->param("name")); - } - } - } - elsif ($form->title eq "preferences") { - $form->field(name => "name", disabled => 1, - value => $session->param("name"), force => 1, - fieldset => "login"); - $form->field(name => "password", disabled => 1, type => "password", - fieldset => "login"), - } - } - - sub formbuilder (@) { - my %params=@_; - - my $form=$params{form}; - my $session=$params{session}; - my $cgi=$params{cgi}; - my $buttons=$params{buttons}; - - if ($form->title eq "signin") { - if ($form->submitted && $form->validate) { - if ($form->submitted eq 'Login') { - $session->param("name", $form->field("name")); - IkiWiki::cgi_postsignin($cgi, $session); - } - } - } - elsif ($form->title eq "preferences") { - if ($form->submitted eq "Save Preferences" && $form->validate) { - my $user_name=$form->field('name'); - } - } - } - - sub sessioncgi ($$) { - my $q=shift; - my $session=shift; - } - - 1 - -"""]] diff --git a/doc/plugins/contrib/unixrelpagespec.mdwn b/doc/plugins/contrib/unixrelpagespec.mdwn new file mode 100644 index 000000000..a35f76c30 --- /dev/null +++ b/doc/plugins/contrib/unixrelpagespec.mdwn @@ -0,0 +1,42 @@ +[[!template id=plugin name=unixrelpagespec core=0 author="[[Jogo]]"]] + +I don't understand why `./*` correspond to siblings and not subpages. +This is probably only meaningfull with [[plugins/autoindex]] turned on. + +Here is a small plugin wich follow usual Unix convention : + +- `./*` expand to subpages +- `../*` expand to siblings + +--- + #!/usr/bin/perl + # UnixRelPageSpec plugin. + # by Joseph Boudou <jogo at matabio dot net> + + package IkiWiki::Plugin::unixrelpagespec; + + use warnings; + use strict; + use IkiWiki 3.00; + + sub import { + inject( + name => 'IkiWiki::PageSpec::derel', + call => \&unix_derel + ); + } + + sub unix_derel ($$) { + my $path = shift; + my $from = shift; + + if ($path =~ m!^\.{1,2}/!) { + $from =~ s#/?[^/]+$## if (defined $from and $path =~ m/^\.{2}/); + $path =~ s#^\.{1,2}/##; + $path = "$from/$path" if length $from; + } + + return $path; + } + + 1; diff --git a/doc/plugins/contrib/xslt.mdwn b/doc/plugins/contrib/xslt.mdwn new file mode 100644 index 000000000..80c956c58 --- /dev/null +++ b/doc/plugins/contrib/xslt.mdwn @@ -0,0 +1,39 @@ +[[!template id=plugin name=xslt author="[[rubykat]]"]] +[[!tag type/chrome]] +## NAME + +IkiWiki::Plugin::xslt - ikiwiki directive to process an XML file with XSLT + +## SYNOPSIS + +\[[!xslt file="data1.xml" stylesheet="style1.xsl"]] + +## DESCRIPTION + +IkiWiki::Plugin::xslt is an IkiWiki plugin implementing a directive +to process an input XML data file with XSLT, and output the result in +the page where the directive was called. + +It is expected that the XSLT stylesheet will output valid HTML markup. + +## OPTIONS + +There are two arguments to this directive. + +* **file:** + The file which contains XML data to be processed. This file *must* have a `.xml` extension (`filename.xml`). This file is searched for using the usual IkiWiki mechanism, thus finding the file first in the same directory as the page, then in the directory above, and so on. + +* **stylesheet:** + The file which contains XSLT stylesheet to apply to the XML data. This file *must* have a `.xsl` extension (`filename.xsl`). This file is searched for using the usual IkiWiki mechanism, thus finding the file first in the same directory as the page, then in the directory above, and so on. 
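+
+## EXAMPLE
+
+The file contents below are hypothetical, but show what the directive does,
+using the same file names as in the SYNOPSIS. Given a data file `data1.xml`:
+
+    <projects>
+      <project name="ikiwiki" language="Perl"/>
+      <project name="hello" language="C"/>
+    </projects>
+
+and a stylesheet `style1.xsl` that turns it into an HTML list:
+
+    <xsl:stylesheet version="1.0"
+        xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
+      <xsl:output method="html"/>
+      <!-- Render each project element as a list item. -->
+      <xsl:template match="/projects">
+        <ul>
+          <xsl:for-each select="project">
+            <li><xsl:value-of select="@name"/> (<xsl:value-of select="@language"/>)</li>
+          </xsl:for-each>
+        </ul>
+      </xsl:template>
+    </xsl:stylesheet>
+
+then `\[[!xslt file="data1.xml" stylesheet="style1.xsl"]]` will insert the
+resulting `<ul>` list of projects into the page where the directive is used.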
+ +## PREREQUISITES + + IkiWiki + XML::LibXML + XML::LibXSLT + +## DOWNLOAD + +* browse at GitHub: <http://github.com/rubykat/ikiplugins/blob/master/IkiWiki/Plugin/xslt.pm> +* git repo at git://github.com/rubykat/ikiplugins.git + diff --git a/doc/plugins/contrib/xslt/discussion.mdwn b/doc/plugins/contrib/xslt/discussion.mdwn new file mode 100644 index 000000000..72cce083c --- /dev/null +++ b/doc/plugins/contrib/xslt/discussion.mdwn @@ -0,0 +1,49 @@ +## security + +I'm curious what the security implications of having this plugin on a +publically writable wiki are. + +First, it looks like the way it looks up the stylesheet file will happily +use a regular .mdwn wiki page as the stylsheet. Which means any user can +create a stylesheet and have it be used, without needing permission to +upload arbitrary files. That probably needs to be fixed; one way would be +to mandate that the `srcfile` has a `.xsl` extension. + +Secondly, if an attacker is able to upload a stylesheet file somehow, could +this be used to attack the server where it is built? I know that xslt is +really a full programming language, so I assume at least DOS attacks are +possible. Can it also read other arbitrary files, run other programs, etc? +--[[Joey]] + +> For the first point, agreed. It should probably check that the data file has a `.xml` extension also. Have now fixed. + +> For the second point, I think the main concern would be resource usage. XSLT is a pretty limited language; it can read other XML files, but it can't run other programs so far as I know. + +> -- [[KathrynAndersen]] + +>> XSLT is, indeed, a Turing-complete programming language. + However, [XML::LibXSLT][] provides a set of functions to help + to minimize the damage that may be caused by running a random + program. + +>> In particular, `max_depth ()` allows for the maximum + recursion depth to be set, while + `read_file ()`, `write_file ()`, `create_dir ()`, + `read_net ()` and `write_net ()` + are the callbacks that allow any of the possible file + operations to be denied. + +>> To be honest, I'd prefer for the `read_file ()` callback to + only grant access to the files below the Ikiwiki source + directory, and for all the `write_`… and + …`_net` callbacks to deny the access unconditionally. + +>> One more wishlist item: allow the set of locations to take + `.xsl` files from to be preconfigured, so that, e. g., + one could allow (preasumably trusted) system stylesheets, + while disallowing any stylesheets that are placed on the Wiki + itself. + +>> — Ivan Shmakov, 2010-03-28Z. + +[XML::LibXSLT]: http://search.cpan.org/~PAJAS/XML-LibXSLT/LibXSLT.pm diff --git a/doc/plugins/contrib/ymlfront.mdwn b/doc/plugins/contrib/ymlfront.mdwn new file mode 100644 index 000000000..a2c649044 --- /dev/null +++ b/doc/plugins/contrib/ymlfront.mdwn @@ -0,0 +1,106 @@ +[[!template id=plugin name=ymlfront author="[[rubykat]]"]] +[[!tag type/meta]] +[[!toc]] +## NAME + +IkiWiki::Plugin::ymlfront - add YAML-format data to a page + +## SYNOPSIS + + # activate the plugin + add_plugins => [qw{goodstuff ymlfront ....}], + +## DESCRIPTION + +This plugin provides a way of adding arbitrary meta-data (data fields) to any +page by prefixing the page with a YAML-format document. This also provides +the [[ikiwiki/directive/ymlfront]] directive, which enables one to put +YAML-formatted data inside a standard IkiWiki [[ikiwiki/directive]]. + +This is a way to create per-page structured data, where each page is +treated like a record, and the structured data are fields in that record. 
This +can include the meta-data for that page, such as the page title. + +This plugin is meant to be used in conjunction with the [[field]] plugin. + +## DETAILS + +If one is not using the ymlfront directive, the YAML-format data in a page +must be placed at the start of the page and delimited by lines containing +precisely three dashes. The "normal" content of the page then follows. + +For example: + + --- + title: Foo does not work + Urgency: High + Status: Assigned + AssignedTo: Fred Nurk + Version: 1.2.3 + --- + When running on the Sprongle system, the Foo function returns incorrect data. + +What will normally be displayed is everything following the second line of dashes. +That will be htmlized using the page-type of the page-file. + +### Accessing the Data + +There are a few ways to access the given YAML data. + +* [[getfield]] plugin + + The **getfield** plugin can display the data as individual variable values. + + For example: + + --- + title: Foo does not work + Urgency: High + Status: Assigned + AssignedTo: Fred Nurk + Version: 1.2.3 + --- + # {{$title}} + + **Urgency:** {{$Urgency}}\\ + **Status:** {{$Status}}\\ + **Assigned To:** {{$AssignedTo}}\\ + **Version:** {{$Version}} + + When running on the Sprongle system, the Foo function returns incorrect data. + +* [[ftemplate]] plugin + + The **ftemplate** plugin is like the [[plugins/template]] plugin, but it is also aware of [[field]] values. + + For example: + + --- + title: Foo does not work + Urgency: High + Status: Assigned + AssignedTo: Fred Nurk + Version: 1.2.3 + --- + \[[!ftemplate id="bug_display_template"]] + + When running on the Sprongle system, the Foo function returns incorrect data. + +* [[report]] plugin + + The **report** plugin is like the [[ftemplate]] plugin, but it reports on multiple pages, rather than just the current page. + +* write your own plugin + + In conjunction with the [[field]] plugin, you can write your own plugin to access the data. + +## PREREQUISITES + + IkiWiki + IkiWiki::Plugin::field + YAML::Any + +## DOWNLOAD + +* browse at GitHub: <http://github.com/rubykat/ikiplugins/blob/master/IkiWiki/Plugin/ymlfront.pm> +* git repo at git://github.com/rubykat/ikiplugins.git diff --git a/doc/plugins/contrib/ymlfront/discussion.mdwn b/doc/plugins/contrib/ymlfront/discussion.mdwn new file mode 100644 index 000000000..b1fd65fff --- /dev/null +++ b/doc/plugins/contrib/ymlfront/discussion.mdwn @@ -0,0 +1,6 @@ +Now that I have implemented a \[[!ymlfront ...]] directive, I would like to remove support for the old "---" delimited format, because + +* it is fragile (easily breakable) +* it is non-standard + +Any objections? diff --git a/doc/plugins/creole/discussion.mdwn b/doc/plugins/creole/discussion.mdwn index 38ee2bd78..7f47c2c97 100644 --- a/doc/plugins/creole/discussion.mdwn +++ b/doc/plugins/creole/discussion.mdwn @@ -12,4 +12,11 @@ I've installed Text::WikiCreole 0.05 and enabled the plugin, but I get an error >>> forgot, done now --[[Joey]] +--- +## External Links + I'm moving over a really stinkingly old UseMod and creole seems the nearest match. I've worked out that Bare /Subpage links need to become \[\[Subpage\]\], and Top/Sub links need to be \[\[Top/Sub\]\] (or \[\[Top/Sub|Top/Sub\]\], to display in exactly the same way), but I'm stuck on generic hyperlinks. The creole cheat sheet says I should be able to do \[\[http://url.path/foo|LinkText\]\], but that comes out as a link to create the "linktext" page, and Markdown-style \[Link Text\](http://url.path/foo) just gets rendered as is. 
Any suggestions? --[[schmonz]] + +> Was this problem ever solved? -- Thiana + +>> Not by me. If I were looking at the problem now, with fresh eyes, I'd probably bite the bullet and just convert everything to Markdown. --[[schmonz]] diff --git a/doc/plugins/cutpaste.mdwn b/doc/plugins/cutpaste.mdwn index f74f8a269..ea3665c44 100644 --- a/doc/plugins/cutpaste.mdwn +++ b/doc/plugins/cutpaste.mdwn @@ -1,5 +1,5 @@ [[!template id=plugin name=cutpaste author="[[Enrico]]"]] -[[!tag type/chrome]] +[[!tag type/widget]] This plugin provides the [[ikiwiki/directive/cut]], [[ikiwiki/directive/copy]] and [[ikiwiki/directive/paste]] diff --git a/doc/plugins/date.mdwn b/doc/plugins/date.mdwn new file mode 100644 index 000000000..2a33f014c --- /dev/null +++ b/doc/plugins/date.mdwn @@ -0,0 +1,6 @@ +[[!template id=plugin name=date author="[[Joey]]"]] +[[!tag type/widget]] + +This plugin provides the [[ikiwiki/directive/date]] +[[ikiwiki/directive]], which provides a way to display an arbitrary date +in a page. diff --git a/doc/plugins/ddate.mdwn b/doc/plugins/ddate.mdwn index 741606a6e..17bb16cff 100644 --- a/doc/plugins/ddate.mdwn +++ b/doc/plugins/ddate.mdwn @@ -1,6 +1,7 @@ [[!template id=plugin name=ddate author="[[Joey]]"]] [[!tag type/fun]] [[!tag type/date]] +[[!tag type/chrome]] Enables use of Discordian dates. `--timeformat` can be used to change the date format; see `ddate(1)`. diff --git a/doc/plugins/discussion.mdwn b/doc/plugins/discussion.mdwn index 854307a98..d47fa4718 100644 --- a/doc/plugins/discussion.mdwn +++ b/doc/plugins/discussion.mdwn @@ -34,3 +34,9 @@ Any objections to listing plugins alphabetically rather than by creation date? >> "recently changed" list with the 10 most recently changed plugins >> at the top. That would allow what you suggested, but still allow >> the main list to be alphabetical. -- [[Will]] + +### `themes.pm` instead of `themes.mdwn` + +Could someone please change the filename. I cannot fix this using the Web interface. Somebody step in please. --[[PaulePanter]] + +> Oops, not the first time I've made that mistake! --[[Joey]] diff --git a/doc/plugins/editpage.mdwn b/doc/plugins/editpage.mdwn index b830e51aa..346ee7c78 100644 --- a/doc/plugins/editpage.mdwn +++ b/doc/plugins/editpage.mdwn @@ -1,4 +1,5 @@ [[!template id=plugin name=editpage core=1 author="[[Joey]]"]] +[[!tag type/web]] This plugin allows editing wiki pages in the web interface. It's enabled by default if [[cgi]] is enabled; disable it if you want cgi for other things diff --git a/doc/plugins/edittemplate.mdwn b/doc/plugins/edittemplate.mdwn index 85dfdfc2d..c19ecd858 100644 --- a/doc/plugins/edittemplate.mdwn +++ b/doc/plugins/edittemplate.mdwn @@ -2,5 +2,5 @@ [[!tag type/web]] This plugin provides the [[ikiwiki/directive/edittemplate]] [[ikiwiki/directive]]. -This directive allows registering template pages, that provide default -content for new pages created using the web frontend. +This directive allows registering [[template|templates]] pages, that +provide default content for new pages created using the web frontend. diff --git a/doc/plugins/filecheck.mdwn b/doc/plugins/filecheck.mdwn index f4563d58e..b038bc433 100644 --- a/doc/plugins/filecheck.mdwn +++ b/doc/plugins/filecheck.mdwn @@ -1,5 +1,5 @@ [[!template id=plugin name=filecheck core=0 author="[[Joey]]"]] -[[!tag type/useful]] +[[!tag type/special-purpose]] This plugin enhances the regular [[ikiwiki/PageSpec]] syntax with some additional tests, for things like file size, mime type, and virus @@ -7,7 +7,8 @@ status. 
These tests are mostly useful for the [[attachment]] plugin, and are documented [[here|ikiwiki/pagespec/attachment]]. This plugin will use the [[!cpan File::MimeInfo::Magic]] perl module, if -available, for mimetype checking. +available, for mimetype checking. It falls back to using the `file` command +if necessary for hard to detect files. The `virusfree` [[PageSpec|ikiwiki/pagespec/attachment]] requires that ikiwiki be configured with a virus scanner program via the `virus_checker` diff --git a/doc/plugins/filecheck/discussion.mdwn b/doc/plugins/filecheck/discussion.mdwn index f91950b7d..f3f3c4ffd 100644 --- a/doc/plugins/filecheck/discussion.mdwn +++ b/doc/plugins/filecheck/discussion.mdwn @@ -15,3 +15,71 @@ if ::magic() returns undef? --[[DavidBremner]] >> for ::default >>> Applied + +--- + +At first I need to thank you for ikiwiki - it is what I was always looking +for - coming from a whole bunch of wiki engines, this is the most +intelligent and least bloated one. + +My question is about the [[plugins/attachment]] plugin in conjunction with +[[plugins/filecheck]]: I am using soundmanger2 js-library for having +attached media files of all sorts played inline a page. + +To achieve this soundmanager2 asks for an id inside a ul-tag surrounding +the a-tag. I was wondering if the Insert Link button could be provided with +a more elegant solution than to have this code snippet to be filled in by +hand every time you use it to insert links for attached media files. And in +fact there apparently is a way in attachment.pm. + +While I can see that it is not needed for everyone inserting links to +attached media files to have ul- and li-tags surrounding the link itself as +well as being supplied with an id fill in, for me it would be the most +straight forward solution. Pitty is I don't have the time to wrap my head +around perl to write a patch myself. Is there any way to have this made an +option which can be called via templates? + +For sure I would like to donate for such a patch as well as I will do it +for ikiwiki anyway, because it is such a fine application. + +If you are not familiar with soundmanager2: It is a very straight forward +solution to inline mediafiles, using the usual flash as well as html5 +solutions (used by soundcloud.com, freesound.org and the like). Worth a +look anyway [schillmania.com](http://www.schillmania.com/) + +Boris + +> The behavior of "Insert Links" is currently hardcoded to support images +> and has a fallback for other files. What you want is a +> [[todo/generic_insert_links]] that can insert a template directive. +> Then you could make a template that generates the html needed for +> soundmanager2. I've written down a design at +> [[todo/generic_insert_links]]; I am currently very busy and not sure +> when I will get around to writing it, but with it on the todo list +> I shouldn't forget. --[[Joey]] +> +> You could make a [[ikiwiki/directive/template]] for soundmanager2 +> now, and manually insert the template directive for now +> when you want to embed a sound file. 
Something like this: + + \[[!template id=embed_mp3 file=your.mp3]] + +> Then in templates/embed_mp3.mdwn, something vaguely like this: + + <ul id="foo"> + <a href="<TMPL_VAR FILE>">mp3</a> + </ul> + +>> Thanks a lot - looking forward to [[todo/generic_insert_links]] - I am using the [[ikiwiki/directive/template]] variant also adding a name vaiable, it looks like this and is working fine: + + <ul class="playlist"> + <li> + <a href="<TMPL_VAR FILE>"><TMPL_VAR NAME></a> + </li> + </ul> + +>> Calling it: + + \[[!template id=embedmedia.tmpl file=../Tinas_Gonna_Have_A_Baby.mp3 name="Tina's Gonna Have A Baby" ]] + +>> BTW your Flattr button doesn't seem to work properly - or it is Flattr itself that doesn't- clicking it won't let ikiwiki show up on my Dashboard. diff --git a/doc/plugins/flattr.mdwn b/doc/plugins/flattr.mdwn new file mode 100644 index 000000000..5da279518 --- /dev/null +++ b/doc/plugins/flattr.mdwn @@ -0,0 +1,9 @@ +[[!template id=plugin name=flattr author="[[Joey]]"]] +[[!tag type/web]] + +[Flattr](http://flattr.com/) is a social micropayment platform. +This plugin allows easily adding Flattr buttons to pages, +using the [[ikiwiki/directive/flattr]] directive. + +This plugin has a configuration setting. `flattr_userid` can be set +to either your numeric flatter userid, or your flattr username. diff --git a/doc/plugins/format.mdwn b/doc/plugins/format.mdwn index 91e707fcf..b41d365aa 100644 --- a/doc/plugins/format.mdwn +++ b/doc/plugins/format.mdwn @@ -1,5 +1,5 @@ [[!template id=plugin name=format core=0 author="[[Joey]]"]] -[[!tag type/format]] +[[!tag type/widget]] This plugin allows mixing different page formats together, by embedding text formatted one way inside a page formatted another way. This is done diff --git a/doc/plugins/fortune.mdwn b/doc/plugins/fortune.mdwn index 9966f456d..3cb125ac1 100644 --- a/doc/plugins/fortune.mdwn +++ b/doc/plugins/fortune.mdwn @@ -1,5 +1,6 @@ [[!template id=plugin name=fortune author="[[Joey]]"]] [[!tag type/fun]] +[[!tag type/widget]] This plugin implements the [[ikiwiki/directive/fortune]] [[ikiwiki/directive]]. This directive uses the `fortune` program to insert a fortune into the page. diff --git a/doc/plugins/getsource.mdwn b/doc/plugins/getsource.mdwn new file mode 100644 index 000000000..d5404a628 --- /dev/null +++ b/doc/plugins/getsource.mdwn @@ -0,0 +1,14 @@ +[[!template id=plugin name=getsource author="[[Will_Uther|Will]]"]] +[[!tag type/web]] + +This plugin adds a "Source" link to the top of each page that uses +the CGI to display the page's source. + +Configuration for this plugin in the setup file: + +* `getsource_mimetype => "text/plain; charset=utf-8"` + + Sets the MIME type used when page source is requested. The default is + usually appropriate, but you could set this to `application/octet-stream` + to encourage browsers to download the source to a file rather than showing + it in the browser. diff --git a/doc/plugins/getsource/discussion.mdwn b/doc/plugins/getsource/discussion.mdwn new file mode 100644 index 000000000..3e985948b --- /dev/null +++ b/doc/plugins/getsource/discussion.mdwn @@ -0,0 +1,3 @@ +It would be very cool if this plugin was enabled by default. One of the best ways to learn how to do various advanced things is to be able to "view source" on other wiki's which do things you like. -- [[AdamShand]] + +This plugin requires the cgi plugin. If you run a static site, you may check the [[repolist]] plugin. 
-- [[weakish]] diff --git a/doc/plugins/goodstuff/discussion.mdwn b/doc/plugins/goodstuff/discussion.mdwn new file mode 100644 index 000000000..4ccea4ad4 --- /dev/null +++ b/doc/plugins/goodstuff/discussion.mdwn @@ -0,0 +1,8 @@ +### What is the syntax for enabling plugins in the setup file? + +Here is an example snippet from a working setup file: + + <pre> + # plugins to add to the default configuration + add_plugins => ['goodstuff'], +</pre> diff --git a/doc/plugins/google.mdwn b/doc/plugins/google.mdwn index 7c61e637b..349c278ee 100644 --- a/doc/plugins/google.mdwn +++ b/doc/plugins/google.mdwn @@ -5,8 +5,7 @@ This plugin adds a search form to the wiki, using google's site search. Google is asked to search for pages in the domain specified in the wiki's `url` configuration parameter. Results will depend on whether google has -indexed the site, and how recently. Also, if the same domain has other -content, outside the wiki's content, it will be searched as well. +indexed the site, and how recently. The [[search]] plugin offers full text search of only the wiki, but requires that a search engine be installed on your site. diff --git a/doc/plugins/google/discussion.mdwn b/doc/plugins/google/discussion.mdwn index babc919d2..e664f5723 100644 --- a/doc/plugins/google/discussion.mdwn +++ b/doc/plugins/google/discussion.mdwn @@ -4,3 +4,22 @@ This is not very good since the default ikiwiki templates produce XHTML instead of HTML. > Fixed, thanks for the patch! --[[Joey]] + +It works to pass the whole wiki baseurl to Google, not just the +domain, and appears to be legal. I've got a wiki that'd benefit +(it's a few directories down from the root). Can the plugin be +tweaked to do this? --[[schmonz]] + +> Done. --[[Joey]] + +The main page said: + +> Also, if the same domain has other content, outside the wiki's +> content, it will be searched as well. + +Is it still true now? (Or this statement is out of date?) --[weakish] + +[weakish]: http://weakish.pigro.net + +> I checked, and it's never been true; google is given the url to the top +> of the wiki and only searches things in there. --[[Joey]] diff --git a/doc/plugins/goto.mdwn b/doc/plugins/goto.mdwn index 9c401c5d2..8e1de7a10 100644 --- a/doc/plugins/goto.mdwn +++ b/doc/plugins/goto.mdwn @@ -1,5 +1,5 @@ [[!template id=plugin name=goto author="[[Simon_McVittie|smcv]]"]] -[[!tag type/useful]] +[[!tag type/web]] This plugin adds a `do=goto` mode for the IkiWiki CGI script. It's mainly for internal use by the [[404]], [[comments]] and [[recentchanges]] diff --git a/doc/plugins/graphviz.mdwn b/doc/plugins/graphviz.mdwn index b89f16b59..d57d7dc94 100644 --- a/doc/plugins/graphviz.mdwn +++ b/doc/plugins/graphviz.mdwn @@ -1,5 +1,5 @@ [[!template id=plugin name=graphviz author="[[JoshTriplett]]"]] -[[!tag type/chrome type/format]] +[[!tag type/widget]] This plugin provides the [[ikiwiki/directive/graph]] [[ikiwiki/directive]]. This directive allows embedding [graphviz](http://www.graphviz.org/) graphs in a @@ -22,4 +22,4 @@ Some example graphs: [[!graph src="a -- b -- c -- a;" prog="circo" type="graph"]] """]] -This plugin uses the [[!cpan Digest::SHA1]] perl module. +This plugin uses the [[!cpan Digest::SHA]] perl module. diff --git a/doc/plugins/haiku.mdwn b/doc/plugins/haiku.mdwn index 74eac1c29..448733d95 100644 --- a/doc/plugins/haiku.mdwn +++ b/doc/plugins/haiku.mdwn @@ -1,5 +1,6 @@ [[!template id=plugin name=haiku author="[[Joey]]"]] [[!tag type/fun]] +[[!tag type/widget]] This plugin provides a [[ikiwiki/directive/haiku]] [[ikiwiki/directive]]. 
The directive allows inserting a randomly generated haiku into a wiki page. diff --git a/doc/plugins/highlight.mdwn b/doc/plugins/highlight.mdwn index 44ced80f7..5f04fda52 100644 --- a/doc/plugins/highlight.mdwn +++ b/doc/plugins/highlight.mdwn @@ -8,8 +8,10 @@ languages and file formats. ## prerequisites You will need to install the perl bindings to the -[highlight library](http://www.andre-simon.de/), which in Debian -are in the [[!debpkg libhighlight-perl]] package. +[highlight library](http://www.andre-simon.de/). In Debian +they are in the [[!debpkg libhighlight-perl]] package. If +your distribution does not have them, look in `examples/swig` +in highlight's source. ## embedding highlighted code diff --git a/doc/plugins/highlight/discussion.mdwn b/doc/plugins/highlight/discussion.mdwn index 7d3cabea9..556b04145 100644 --- a/doc/plugins/highlight/discussion.mdwn +++ b/doc/plugins/highlight/discussion.mdwn @@ -10,3 +10,12 @@ if you want to implement it I won't complain :-). [[DavidBremner]] > of, but if there are multiple options, giving each its own nane would > word better for websetup than would putting all the options in a > sub-hash. --[[Joey]] + + +Has anyone got this running with CentOS/RHEL ? +Having trouble working out where to get the perl bindings for highlight. --[Mick](http://www.lunix.com.au) + +> The perl bindings are hidden in `examples/swig` in highlight's source. +> --[[Joey]] + +Thanks for prompt reply.All working. I will post on my site tonight and link here what I did on CentOS to make this work. --[Mick](http://www.lunix.com.au) diff --git a/doc/plugins/htmlscrubber/discussion.mdwn b/doc/plugins/htmlscrubber/discussion.mdwn new file mode 100644 index 000000000..5e8b637b7 --- /dev/null +++ b/doc/plugins/htmlscrubber/discussion.mdwn @@ -0,0 +1,18 @@ +**Ok, I have yet to post a big dummy wiki-noobie question around here, so here goes:** + +Yes, I want to play around with *gulp* Google Ads on an ikiwiki blog, namely, in the *sidebar*. + +No, I do not want to turn htmlscrubber off, but apart from that I have not been able to allow <script> elements as required by Google. + +Thoughts? + +--- + +***Fixed!*** + +Did some more reading, did some searching on the wiki, and found, under *embed*, these + + htmlscrubber_skip => '!*/Discussion', + locked_pages => '!*/Discussion', + +Thanks! diff --git a/doc/plugins/httpauth.mdwn b/doc/plugins/httpauth.mdwn index 11ed223e7..0eda5554f 100644 --- a/doc/plugins/httpauth.mdwn +++ b/doc/plugins/httpauth.mdwn @@ -2,8 +2,34 @@ [[!tag type/auth]] This plugin allows HTTP basic authentication to be used to log into the -wiki. To use the plugin, your web server should be set up to perform HTTP -basic authentiation for at least the directory containing `ikiwiki.cgi`. -The authenticated user will be automatically signed into the wiki. +wiki. -This plugin is included in ikiwiki, but is not enabled by default. +## fully authenticated wiki + +One way to use the plugin is to configure your web server to require +HTTP basic authentication for any access to the directory containing the +wiki (and `ikiwiki.cgi`). The authenticated user will be automatically +signed into the wiki. This method is suitable only for private wikis. 
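+
+For example, with Apache this can be done with a stanza along these lines
+(the directory and password file paths are illustrative and will differ
+per site):
+
+    <Directory /var/www/wiki>
+        AuthType Basic
+        AuthName "wiki"
+        AuthUserFile /etc/apache2/htpasswd-wiki
+        Require valid-user
+    </Directory>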
+
+## separate cgiauthurl
+
+To use httpauth for a wiki where the content is public, and where
+the `ikiwiki.cgi` needs to be usable without authentication (for searching,
+or logging in using other methods, and so on), you can configure a separate
+url that is used for authentication, via the `cgiauthurl` option in the setup
+file. This url will then be redirected to when a user chooses to log in using
+httpauth.
+
+A typical setup is to make an `auth` subdirectory, and symlink `ikiwiki.cgi`
+into it. Then configure the web server to require authentication only for
+access to the `auth` subdirectory. Then `cgiauthurl` is pointed at this
+symlink.
+
+## using only httpauth for some pages
+
+If you want to only use httpauth for editing some pages, while allowing
+other authentication methods to be used for other pages, you can
+configure `httpauth_pagespec` in the setup file. This makes Edit
+links on pages that match the [[ikiwiki/PageSpec]] automatically use
+the `cgiauthurl`, and prevents matching pages from being edited by
+users authenticating via other methods.
diff --git a/doc/plugins/img.mdwn b/doc/plugins/img.mdwn
index 114438765..a6cd90f28 100644
--- a/doc/plugins/img.mdwn
+++ b/doc/plugins/img.mdwn
@@ -1,5 +1,5 @@
 [[!template id=plugin name=img author="Christian Mock"]]
-[[!tag type/chrome]]
+[[!tag type/widget]]
 
 This plugin provides the [[ikiwiki/directive/img]] [[ikiwiki/directive]].
 While ikiwiki supports inlining full-size images by making a
diff --git a/doc/plugins/inline.mdwn b/doc/plugins/inline.mdwn
index 6c3282576..3eb849fdb 100644
--- a/doc/plugins/inline.mdwn
+++ b/doc/plugins/inline.mdwn
@@ -1,4 +1,5 @@
 [[!template id=plugin name=inline core=1 author="[[Joey]]"]]
+[[!tag type/widget]]
 
 This plugin provides the [[ikiwiki/directive/inline]]
 [[ikiwiki/directive]], which allows including one wiki page
diff --git a/doc/plugins/link.mdwn b/doc/plugins/link.mdwn
index 6adbf3eae..7dfa50de4 100644
--- a/doc/plugins/link.mdwn
+++ b/doc/plugins/link.mdwn
@@ -1,4 +1,5 @@
 [[!template id=plugin name=link core=1 author="[[Joey]]"]]
 [[!tag type/link]]
 
-This plugin implements standard [[WikiLinks|ikiwiki/wikilink]].
+This plugin implements standard [[WikiLinks|ikiwiki/wikilink]] and links to
+external pages.
diff --git a/doc/plugins/linkmap.mdwn b/doc/plugins/linkmap.mdwn
index 89cb9d8ae..7e51cd935 100644
--- a/doc/plugins/linkmap.mdwn
+++ b/doc/plugins/linkmap.mdwn
@@ -1,5 +1,6 @@
 [[!template id=plugin name=linkmap author="[[Joey]]"]]
 [[!tag type/meta]]
+[[!tag type/widget]]
 [[!tag type/slow]]
 
 This plugin provides the [[ikiwiki/directive/linkmap]] [[ikiwiki/directive]].
diff --git a/doc/plugins/listdirectives.mdwn b/doc/plugins/listdirectives.mdwn
index 2d9bce01d..df854de52 100644
--- a/doc/plugins/listdirectives.mdwn
+++ b/doc/plugins/listdirectives.mdwn
@@ -1,5 +1,6 @@
 [[!template id=plugin name=listdirectives author="Will"]]
 [[!tag type/meta]]
+[[!tag type/widget]]
 
 This plugin provides the [[ikiwiki/directive/listdirectives]]
 [[ikiwiki/directive]], which inserts a list of currently available
diff --git a/doc/plugins/localstyle.mdwn b/doc/plugins/localstyle.mdwn
new file mode 100644
index 000000000..70a909d68
--- /dev/null
+++ b/doc/plugins/localstyle.mdwn
@@ -0,0 +1,12 @@
+[[!template id=plugin name=localstyle author="[[Joey]]"]]
+[[!tag type/chrome]]
+
+This plugin allows styling different sections of a wiki using different
+versions of the local.css [[CSS]] file.
Normally this file is read from the +top level of the wiki, but with this plugin enabled, standard +[[ikiwiki/subpage/LinkingRules]] are used to find the closest local.css +file to each page. + +So, for example, to use different styling for page `foo`, as well as all +of its [[SubPages|ikiwiki/subpage]], such as `foo/bar`, create a +`foo/local.css`. diff --git a/doc/plugins/lockedit.mdwn b/doc/plugins/lockedit.mdwn index c8f64ea47..681163203 100644 --- a/doc/plugins/lockedit.mdwn +++ b/doc/plugins/lockedit.mdwn @@ -12,14 +12,9 @@ to lock. For example, you could choose to lock all pages created before 2006, or all pages that are linked to from the page named "locked". More usually though, you'll just list some names of pages to lock. -One handy thing to do if you're using ikiwiki for your blog is to lock -"* and !*/Discussion". This prevents others from adding to or modifying -posts in your blog, while still letting them comment via the Discussion -pages. - -Alternatively, if you're using the [[comments]] plugin, you can lock -"!postcomment(*)" to allow users to comment on pages, but not edit anything -else. +If you want to lock down a blog so only you can post to it, you can just +lock "*", and enable the [[opendiscussion]] plugin, so readers can still post +[[comments]]. Wiki administrators can always edit locked pages. The [[ikiwiki/PageSpec]] can specify that some pages are not locked for some users. For example, diff --git a/doc/plugins/lockedit/discussion.mdwn b/doc/plugins/lockedit/discussion.mdwn new file mode 100644 index 000000000..867fc6a51 --- /dev/null +++ b/doc/plugins/lockedit/discussion.mdwn @@ -0,0 +1,18 @@ +This plugin not only locks pages but ensures too a user is logged in. This +seems to me redundant with signedit. I propose [removing the if block that +calls needsignin ]. + +> That was added because the most typical reason for being unable to edit a +> page is that you are not logged in. And without the jump to logging the +> user in, there is no way for the user to log in, without navigating away +> from the page they were trying to edit. --[[Joey]] + +>> Ok, but the problem is that when you don't want any signin form you end up +>> with a lone login button. That might happend if you lock pages only on IP +>> adresses, if you use another cookie from another webapp... + +>> That happends to me and I had to reimplement lockedit in my private auth +>> plugin. + +>> Perhaps you could return undef on that case and let another plugin do the +>> needsignin call ? -- [[Jogo]] diff --git a/doc/plugins/map.mdwn b/doc/plugins/map.mdwn index 8f5a9f15e..b164d5ca8 100644 --- a/doc/plugins/map.mdwn +++ b/doc/plugins/map.mdwn @@ -1,5 +1,5 @@ [[!template id=plugin name=map author="Alessandro Dotti Contra"]] -[[!tag type/meta]] +[[!tag type/meta type/widget]] This plugin provides the [[ikiwiki/directive/map]] [[ikiwiki/directive]], which generates a hierarchical page map for the wiki. diff --git a/doc/plugins/map/discussion.mdwn b/doc/plugins/map/discussion.mdwn index 2f7b140d6..54c921b0f 100644 --- a/doc/plugins/map/discussion.mdwn +++ b/doc/plugins/map/discussion.mdwn @@ -1,7 +1,7 @@ I'm wanting a [[map]] (with indentation levels) showing page _titles_ instead of page 'names'. As far as I can see, this is not an option with existing plugins - I can get a list of pages using [[inline]] and -appropriate [[wikitemplates]], but that has no indentation and therefore +appropriate [[templates]], but that has no indentation and therefore doesn't show structure well. 
The quick way is to modify the map plugin to have a 'titles' option. The diff --git a/doc/plugins/mdwn.mdwn b/doc/plugins/mdwn.mdwn index 6ad1fb229..ce1b6097a 100644 --- a/doc/plugins/mdwn.mdwn +++ b/doc/plugins/mdwn.mdwn @@ -12,9 +12,9 @@ this plugin. The [original version of markdown](http://daringfireball.net/projects/markdown/) can be used, or the [[!cpan Text::Markdown]] perl module. -[[!cpan Text::Markdown]] also includes a markdown variant called -[multimarkdown](http://fletcherpenney.net/MultiMarkdown/), which supports -tables, footnotes, and other new features. Multimarkdown is not enabled by -default, but can be turned on via the `multimarkdown` option in the setup -file. Note that multimarkdown's metadata and wikilinks features are -disabled when it's used with ikiwiki. +[[!cpan Text::MultiMarkdown]] can be used in order to use tables, footnotes, +and other new features from the markdown variant called +[multimarkdown](http://fletcherpenney.net/MultiMarkdown/). Multimarkdown is +not enabled by default, but can be turned on via the `multimarkdown` option +in the setup file. Note that multimarkdown's metadata and wikilinks +features are disabled when it's used with ikiwiki. diff --git a/doc/plugins/mirrorlist.mdwn b/doc/plugins/mirrorlist.mdwn index b371e8eb7..aedc1f4a0 100644 --- a/doc/plugins/mirrorlist.mdwn +++ b/doc/plugins/mirrorlist.mdwn @@ -1,5 +1,5 @@ [[!template id=plugin name=mirror author="[[Joey]]"]] -[[!tag type/special-purpose]] +[[!tag type/web]] This plugin allows adding links a list of mirrors to each page in the wiki. For each mirror, a name and an url should be specified. Pages are diff --git a/doc/plugins/moderatedcomments.mdwn b/doc/plugins/moderatedcomments.mdwn new file mode 100644 index 000000000..f9466e833 --- /dev/null +++ b/doc/plugins/moderatedcomments.mdwn @@ -0,0 +1,12 @@ +[[!template id=plugin name=moderatedcomments author="[[Joey]]"]] +[[!tag type/auth]] + +This plugin causes [[comments]] to be held for manual moderation. +Admins can access the comment moderation queue via their preferences page. + +By default, all comments made by anyone who is not an admin will be held +for moderation. The `moderate_pagespec` setting can be used to specify a +[[ikiwiki/PageSpec]] to match comments and users who should be moderated. +For example, to avoid moderating comments from logged-in users, set +`moderate_pagespec` to "`!user(*)`". Or to moderate everyone except for +admins, set it to "`!admin(*)`". diff --git a/doc/plugins/more.mdwn b/doc/plugins/more.mdwn index e9a971289..a0664e843 100644 --- a/doc/plugins/more.mdwn +++ b/doc/plugins/more.mdwn @@ -1,5 +1,5 @@ [[!template id=plugin name=more author="Ben"]] -[[!tag type/format]] +[[!tag type/widget]] This plugin provides the [[ikiwiki/directive/more]] [[ikiwiki/directive]], which is a way to have a "more" link on a post in a blog, that leads to the diff --git a/doc/plugins/opendiscussion.mdwn b/doc/plugins/opendiscussion.mdwn index b2ba68bf7..3b5ab4858 100644 --- a/doc/plugins/opendiscussion.mdwn +++ b/doc/plugins/opendiscussion.mdwn @@ -1,5 +1,6 @@ [[!template id=plugin name=opendiscussion author="[[Joey]]"]] [[!tag type/auth]] -This plugin allows editing of Discussion pages by anonymous users who have -not logged into the wiki. +This plugin allows editing of Discussion pages, and posting of comments, +even when the [[lockedit]] plugin has been configured to otherwise prevent +editing. 
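Putting the lockedit and opendiscussion notes above together, here is a minimal setup-file sketch for a blog that only its admin can edit, while readers can still comment (the plugin list is illustrative):

    locked_pages => '*',
    add_plugins => ['comments', 'opendiscussion'],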
diff --git a/doc/plugins/openid.mdwn b/doc/plugins/openid.mdwn
index 91fc7cddc..f3b3abfbb 100644
--- a/doc/plugins/openid.mdwn
+++ b/doc/plugins/openid.mdwn
@@ -11,17 +11,22 @@ The [[!cpan LWPx::ParanoidAgent]] perl module is used if available, for added
 security. Finally, the [[!cpan Crypt::SSLeay]] perl module is needed to
 support users entering "https" OpenID urls.
 
-This plugin has a configuration option. You can set `--openidsignup`
-to the url of a third-party site where users can sign up for an OpenID. If
-it's set, the signin page will link to that site.
-
-This plugin supports the
-[myopenid.com affiliate program](http://myopenid.com/affiliate_welcome),
-which can be used to help users sign up for an OpenID and log into your
-site in a single, unified process. When you create the affiliate, specify a
-login url like `http://example.com/ikiwiki.cgi?do=continue`. Once the
-affiliate is created, set `openidsignup` to point to the affiliate's signup
-url.
-
 This plugin is enabled by default, but can be turned off if you want to only
 use some other form of authentication, such as [[passwordauth]].
+
+## options
+
+These options do not normally need to be set, but can be useful in
+certain setups.
+
+* `openid_realm` can be used to control the scope of the openid request.
+  It defaults to the `cgiurl` (or `openid_cgiurl` if set); only allowing
+  ikiwiki's [[CGI]] to authenticate. If you have multiple ikiwiki instances,
+  or other things using openid on the same site, you may choose to put them
+  all in the same realm to improve the user's openid experience. It is a
+  url pattern, so it can be set to e.g. "http://*.example.com/"
+
+* `openid_cgiurl` can be used to cause a `cgiurl` different from the usual one
+  to be used when doing openid authentication. The `openid_cgiurl` must
+  point to an ikiwiki [[CGI]], and it will need to match the `openid_realm`
+  to work.
diff --git a/doc/plugins/openid/discussion.mdwn b/doc/plugins/openid/discussion.mdwn
index 39e947b82..a88da8b9d 100644
--- a/doc/plugins/openid/discussion.mdwn
+++ b/doc/plugins/openid/discussion.mdwn
@@ -19,3 +19,8 @@ It looks like OpenID 2.0 (the only supported by Yahoo) is not supported in ikiwi
 -- Ivan Z.
 
 They have more on OpenID 2.0 in [their FAQ](http://developer.yahoo.com/openid/faq.html). --Ivan Z.
+
+----
+I'm trying to add a way to query the data saved by the OpenID plugin from outside of ikiwiki, to see what identity the user has been authenticated as, if any. I'm thinking of designating some directories as internal pages and check the identity against a list in a mod_perl access hook. I would also write a CGI script that would return a JSON formatted reply to tell if the user is authenticated for those pages and query it with AJAX and only render links to the internal pages if the user would have access to them. That's just a couple of ideas I'm working on first, but I can imagine that there's any number of other tricks that people could implement with that sort of a thing.
+
+Also, this isn't really specific to OpenID but to all auth plugins, but I'm going to use only OpenID for authentication so that's what I'm targeting right now. I suppose that would be worth its own TODO item.
--[[kaol]] diff --git a/doc/plugins/orphans.mdwn b/doc/plugins/orphans.mdwn index ea7c4df13..09ad0a51d 100644 --- a/doc/plugins/orphans.mdwn +++ b/doc/plugins/orphans.mdwn @@ -1,5 +1,6 @@ [[!template id=plugin name=orphans author="[[Joey]]"]] [[!tag type/meta]] +[[!tag type/widget]] This plugin provides the [[ikiwiki/directive/orphans]] [[ikiwiki/directive]], which generates a list of possibly orphaned pages -- @@ -10,5 +11,6 @@ Here's a list of orphaned pages on this wiki: [[!orphans pages="* and !news/* and !todo/* and !bugs/* and !users/* and !recentchanges and !examples/* and !tips/* and !sandbox/* and !templates/* and +!forum/* and !*.js and !wikiicons/* and !plugins/*"]] """]] diff --git a/doc/plugins/pagecount.mdwn b/doc/plugins/pagecount.mdwn index a56027e60..71872fae8 100644 --- a/doc/plugins/pagecount.mdwn +++ b/doc/plugins/pagecount.mdwn @@ -1,5 +1,6 @@ [[!template id=plugin name=pagecount author="[[Joey]]"]] [[!tag type/meta]] +[[!tag type/widget]] This plugin provides the [[ikiwiki/directive/pagecount]] [[ikiwiki/directive]], which displays the number of pages diff --git a/doc/plugins/pagestats.mdwn b/doc/plugins/pagestats.mdwn index c3eba6363..347e39a89 100644 --- a/doc/plugins/pagestats.mdwn +++ b/doc/plugins/pagestats.mdwn @@ -1,5 +1,5 @@ [[!template id=plugin name=pagestats author="Enrico Zini"]] -[[!tag type/meta type/tags]] +[[!tag type/meta type/tags type/widget]] This plugin provides the [[ikiwiki/directive/pagestats]] [[ikiwiki/directive]], which can generate stats about how pages link to diff --git a/doc/plugins/pagetemplate.mdwn b/doc/plugins/pagetemplate.mdwn index afd5eb500..8254e14c5 100644 --- a/doc/plugins/pagetemplate.mdwn +++ b/doc/plugins/pagetemplate.mdwn @@ -2,9 +2,5 @@ [[!tag type/chrome]] This plugin provides the [[ikiwiki/directive/pagetemplate]] -[[ikiwiki/directive]], which allows a page to be created using a different -[[template|wikitemplates]]. - -This plugin can only use templates that are already installed in -`/usr/share/ikiwiki/templates` (or wherever ikiwiki is configured to look for -them). You can choose to use any .tmpl files in that directory. +[[ikiwiki/directive]], which allows a page to be displayed +using a different [[template|templates]] than the default. diff --git a/doc/plugins/parentlinks.mdwn b/doc/plugins/parentlinks.mdwn index ef262a30c..c2d364bef 100644 --- a/doc/plugins/parentlinks.mdwn +++ b/doc/plugins/parentlinks.mdwn @@ -1,5 +1,5 @@ [[!template id=plugin name=parentlinks core=1 author="[[intrigeri]]"]] -[[!tag type/link]] +[[!tag type/link type/chrome]] This plugin generates the links to a page's parents that typically appear at the top of a wiki page. diff --git a/doc/plugins/po.mdwn b/doc/plugins/po.mdwn index 0c8110563..91273ba98 100644 --- a/doc/plugins/po.mdwn +++ b/doc/plugins/po.mdwn @@ -54,10 +54,10 @@ Supported languages `po_slave_languages` is used to set the list of supported "slave" languages, such as: - po_slave_languages => { 'fr' => 'Français', - 'es' => 'Español', - 'de' => 'Deutsch', - } + po_slave_languages => [ 'fr|Français', + 'es|Español', + 'de|Deutsch', + ] Decide which pages are translatable ----------------------------------- @@ -114,7 +114,8 @@ Apache Using Apache `mod_negotiation` makes it really easy to have Apache serve any page in the client's preferred language, if available. -This is the default Debian Apache configuration. + +Add 'Options MultiViews' to the wiki directory's configuration in Apache. When `usedirs` is enabled, one has to set `DirectoryIndex index` for the wiki context. 
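A sketch of the Apache configuration implied above, using only the directives named in the text; the directory path is hypothetical and should match wherever the wiki is actually served from:

    <Directory /var/www/wiki>
        Options MultiViews
        DirectoryIndex index
    </Directory>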
@@ -123,14 +124,17 @@ Setting `DefaultLanguage LL` (replace `LL` with your default MIME language code) for the wiki context can help to ensure `bla/page/index.en.html` is served as `Content-Language: LL`. +For details, see [Apache's documentation](http://httpd.apache.org/docs/2.2/content-negotiation.html). + lighttpd -------- -lighttpd unfortunately does not support content negotiation. +Recent versions of lighttpd should be able to use +`$HTTP["language"]` to configure the translated pages to be served. -**FIXME**: does `mod_magnet` provide the functionality needed to - emulate this? +See [Lighttpd Issue](http://redmine.lighttpd.net/issues/show/1119) +TODO: Example Usage ===== @@ -194,7 +198,7 @@ enabled, "slave" pages therefore link to the "master" page's discussion page. Likewise, "slave" pages are not supposed to have sub-pages; -[[WikiLinks|wikilink]] that appear on a "slave" page therefore link to +[[WikiLinks|ikiwiki/wikilink]] that appear on a "slave" page therefore link to the master page's sub-pages. Translating @@ -209,16 +213,16 @@ preferred `$EDITOR`, without needing to be online. Markup languages support ------------------------ -[[Markdown|mdwn]] is well supported. Some other markup languages supported -by ikiwiki mostly work, but some pieces of syntax are not rendered -correctly on the slave pages: +[[Markdown|mdwn]] and [[html]] are well supported. Some other markup +languages supported by ikiwiki mostly work, but some pieces of syntax +are not rendered correctly on the slave pages: * [[reStructuredText|rst]]: anonymous hyperlinks and internal cross-references * [[wikitext]]: conversion of newlines to paragraphs * [[creole]]: verbatim text is wrapped, tables are broken -* [[html]] and LaTeX: not supported yet; the dedicated po4a modules - could be used to support them, but they would need a security audit +* LaTeX: not supported yet; the dedicated po4a module + could be used to support it, but it would need a security audit * other markup languages have not been tested. Security @@ -231,66 +235,86 @@ When using po4a older than 0.35, it is recommended to uninstall `Text::WrapI18N` (Debian package `libtext-wrapi18n-perl`), in order to avoid a potential denial of service. +BUGS +==== + +[[!inline pages="bugs/po:* and !bugs/done and !link(bugs/done) and !bugs/*/*" +feeds=no actions=no archive=yes show=0]] + TODO ==== -Better links ------------- - -Once the fix to -[[bugs/pagetitle_function_does_not_respect_meta_titles]] from -[[intrigeri]]'s `meta` branch is merged into ikiwiki upstream, the -generated links' text will be optionally based on the page titles set -with the [[meta|plugins/meta]] plugin, and will thus be translatable. -It will also allow displaying the translation status in links to slave -pages. Both were implemented, and reverted in commit -ea753782b222bf4ba2fb4683b6363afdd9055b64, which should be reverted -once [[intrigeri]]'s `meta` branch is merged. - -An integration branch, called `meta-po`, merges [[intrigeri]]'s `po` -and `meta` branches, and thus has thise additional features. - -Self links ----------- - -If a page contains a WikiLink to itself, ikiwiki does not normally -turn that into a hyperlink. However, if a translated page contains a -WikiLink to itself, a hyperlink is inserted, at least with the default -`po_link_to` the link points to the English version of the page. Is there a -good reason for that to be done? 
--[[Joey]] - -Language display order ----------------------- - -Jonas pointed out that one might want to control the order that links to -other languages are listed, for various reasons. Currently, there is no -order, as `po_slave_languages` is a hash. It would need to be converted -to an array to support this. (If twere done, twere best done quickly.) ---[[Joey]] +[[!inline pages="todo/po:* and !todo/done and !link(todo/done) and !todo/*/*" +feeds=no actions=no archive=yes show=0]] -Duplicate %links ? ------------------- +broken links to translatable basewiki pages that lack po files +-------------------------------------------------------------- -I notice code in the scan hook that seems to assume -that %links will accumulate duplicate links for a page. -That used to be so, but the bug was fixed. Does this mean -that po might be replacing the only link on a page, in error? ---[[Joey]] +If a page is not translated yet, the "translated" version of it +displays wikilinks to other, existing (but not yet translated?) +pages as edit links, as if those pages do not exist. -Bug when editing underlay file ------------------------------- +That's really confusing, especially as clicking such a link +brings up an edit form to create a new, english page. -While I've gotten translated underlays working, there is a bug -if the wiki is currently using a page from the underlay, and the master -language version is edited. This causes the edited page to be saved -to srcdir.. and all the translations get set to 0% and their -po files have the translated strings "emptied". What's really going on -is that these are entirely new po files not based on the old ones -in the basewiki, and thus lacking translations. --[[Joey]] +This is with po_link_to=current or negotiated. With default, it doesn't +happen.. -Documentation -------------- +Also, this may only happen if the page being linked to is coming from an +underlay, and the underlays lack translation to a given language. +--[[Joey]] -Maybe write separate documentation depending on the people it targets: -translators, wiki administrators, hackers. This plugin may be complex -enough to deserve this. +> Any simple testcase to reproduce it, please? I've never seen this +> happen yet. --[[intrigeri]] + +>> Sure, go here <http://l10n.ikiwiki.info/smiley/smileys/index.sv.html> +>> (Currently 0% translateed) and see the 'WikiLink' link at the bottom, +>> which goes to <http://l10n.ikiwiki.info/ikiwiki.cgi?page=ikiwiki/wikilink&from=smiley/smileys&do=create> +>> Compare with eg, the 100% translated Dansk version, where +>> the WikiLink link links to the English WikiLink page. --[[Joey]] + +>>> Seems not related to the page/string translation status: the 0% +>>> translated Spanish version has the correct link, just like the +>>> Dansk version => I'm changing the bug title accordingly. +>>> +>>> I tested forcing the sv html page to be rebuilt by translating a +>>> string in it, it did not fix the bug. I did the same for the +>>> Spanish page, it did not introduce the bug. So this is really +>>> weird. +>>> +>>> The smiley underlay seems to be the only place where the wrong +>>> thing happens: the basewiki underlay has similar examples +>>> that do not exhibit this bug. An underlay linking to another might +>>> be necessary to reproduce it. Going to dig deeper. 
--[[intrigeri]] + +>>>> After a few hours lost in the Perl debugger, I think I have found +>>>> the root cause of the problem: in l10n wiki's configured +>>>> `underlaydir`, the basewiki is present in every slave language +>>>> that is enabled for this wiki *but* Swedish. With such a +>>>> configuration, the `ikiwiki/wikilink` page indeed does not exist +>>>> in Swedish language: no `ikiwiki/wikilink.sv.po` can be found +>>>> where ikiwiki is looking. Have a look to +>>>> <http://l10n.ikiwiki.info/ikiwiki/>, the basewiki is not +>>>> available in Swedish language on this wiki. So this is not a po +>>>> bug, but a configuration or directories layout issue. This is +>>>> solved by adding the Swedish basewiki to the underlay dir, which +>>>> is I guess not a possibility in the l10n wiki context. I guess +>>>> this could be solved by adding `SRCDIR/basewiki` as an underlay +>>>> to your l10n wiki configuration, possibly using the +>>>> `add_underlays` configuration directive. --[[intrigeri]] + +>>>>> There is no complete Swedish underlay translation yet, so it is not +>>>>> shipped in ikiwiki. I don't think it's a misconfiguration to use +>>>>> a language that doesn't have translated underlays. --[[Joey]] + +>>>>>> Ok. The problem is triggered when using a language that doesn't +>>>>>> have translated underlays, *and* defining +>>>>>> `po_translatable_pages` in a way that renders the base wiki +>>>>>> pages translatable in po's view of things, which in turns makes +>>>>>> the po plugin act as if the translation pages did exist, +>>>>>> although they do not in this case. I still need to have a deep +>>>>>> look at the underlays-related code you added to `po.pm` a while +>>>>>> ago. Stay tuned. --[[intrigeri]] + +>>>>>>> Fixed in my po branch, along with other related small bugs that +>>>>>>> happen in the very same situation only. --[[intrigeri]] diff --git a/doc/plugins/po/discussion.mdwn b/doc/plugins/po/discussion.mdwn index 1c3f0e752..50998e822 100644 --- a/doc/plugins/po/discussion.mdwn +++ b/doc/plugins/po/discussion.mdwn @@ -150,6 +150,23 @@ The following analysis was done with his help. variables; according to [[Joey]], this is "Freaky code, but seems ok due to use of `quotementa`". +##### Locale::Po4a::Xhtml + +* does not run any external program +* does not build regexp's from untrusted variables + +=> Seems safe as far as the `includessi` option is disabled; the po +plugin explicitly disables it. + +Relies on Locale::Po4a::Xml` to do most of the work. + +##### Locale::Po4a::Xml + +* does not run any external program +* the `includeexternal` option makes it able to read external files; + the po plugin explicitly disables it +* untrusted variables are escaped when used to build regexp's + ##### Text::WrapI18N `Text::WrapI18N` can cause DoS @@ -513,7 +530,7 @@ finish it at some point in the first quarter of 2009. --[[intrigeri]] >>>> >>>>> Done. --[[intrigeri]] >>> -> * I'm very fearful of the `add_depends` in `postscan`. +> * I'm very fearful of the `add_depends` in `indexhtml`. > Does this make every page depend on every page that links > to it? Won't this absurdly bloat the dependency pagespecs > and slow everything down? And since nicepagetitle is given @@ -627,28 +644,6 @@ daring a timid "please pull"... or rather, please review again :) >>> need improvements to the deletion UI to de-confuse that. It's fine to >>> put that off until needed --[[Joey]] >> -> * Re the meta title escaping issue worked around by `change`. 
-> I suppose this does not only affect meta, but other things -> at scan time too. Also, handling it only on rebuild feels -> suspicious -- a refresh could involve changes to multiple -> pages and trigger the same problem, I think. Also, exposing -> this rebuild to the user seems really ugly, not confidence inducing. -> -> So I wonder if there's a better way. Such as making po, at scan time, -> re-run the scan hooks, passing them modified content (either converted -> from po to mdwn or with the escaped stuff cheaply de-escaped). (Of -> course the scan hook would need to avoid calling itself!) -> -> (This doesn't need to block the merge, but I hope it can be addressed -> eventually..) -> -> --[[Joey]] ->> ->> I'll think about it soon. ->> ->> --[[intrigeri]] ->> ->>> Did you get a chance to? --[[Joey]] * As discussed at [[todo/l10n]] the templates needs to be translatable too. They should be treated properly by po4a using the markdown option - at least with my @@ -699,3 +694,28 @@ and via CGI, have been tested. * general test with `indexpages` enabled: **not OK** * general test with `po_link_to=default` with `userdirs` enabled: **OK** * general test with `po_link_to=default` with `userdirs` disabled: **OK** + +Duplicate %links ? +------------------ + +I notice code in the scan hook that seems to assume +that %links will accumulate duplicate links for a page. +That used to be so, but the bug was fixed. Does this mean +that po might be replacing the only link on a page, in error? +--[[Joey]] + +> It would replace it. The only problematic case is when another +> plugin has its own reasons, in its `scan` hook, to add a page +> that is already there to `$links{$page}`. This other plugin's +> effect might then be changed by po's `scan` hook... which could +> be either good (better overall l10n) or bad (break the other +> plugin's goal). --[[intrigeri]] + +>> Right.. well, the cases where links are added is very small. +>> Grepping for `add_link`, it's just done by link, camelcase, meta, and +>> tag. All of these are supposed to work just link regular links +>> so I'd think that is ok. We could probably remove the currently scary +>> comment about only wanting to change the first link. --[[Joey]] + +>>> Commit 3c2bffe21b91684 in my po branch does this. --[[intrigeri]] +>>>> Cherry-picked --[[Joey]] diff --git a/doc/plugins/poll.mdwn b/doc/plugins/poll.mdwn index 510f67798..099cb399c 100644 --- a/doc/plugins/poll.mdwn +++ b/doc/plugins/poll.mdwn @@ -1,5 +1,5 @@ [[!template id=plugin name=poll author="[[Joey]]"]] -[[!tag type/web]] +[[!tag type/widget]] This plugin provides the [[ikiwiki/directive/poll]] [[ikiwiki/directive]], which allows inserting an online poll into a page. diff --git a/doc/plugins/polygen.mdwn b/doc/plugins/polygen.mdwn index 6045c1ec9..f9cea1f4d 100644 --- a/doc/plugins/polygen.mdwn +++ b/doc/plugins/polygen.mdwn @@ -1,5 +1,6 @@ [[!template id=plugin name=polygen author="Enrico Zini"]] [[!tag type/fun]] +[[!tag type/widget]] This plugin provides the [[ikiwiki/directive/polygen]] [[ikiwiki/directive]], which allows inserting text generated by polygen into a wiki page. diff --git a/doc/plugins/postsparkline.mdwn b/doc/plugins/postsparkline.mdwn index c81f91bdc..b0733e343 100644 --- a/doc/plugins/postsparkline.mdwn +++ b/doc/plugins/postsparkline.mdwn @@ -1,5 +1,5 @@ [[!template id=plugin name=postsparkline author="[[Joey]]"]] -[[!tag type/chrome]] +[[!tag type/widget]] This plugin provides the [[ikiwiki/directive/postsparkline]] [[ikiwiki/directive]]. 
It uses the [[sparkline]] plugin to create a sparkline of diff --git a/doc/plugins/prettydate.mdwn b/doc/plugins/prettydate.mdwn index 11ad4252f..149b7c29c 100644 --- a/doc/plugins/prettydate.mdwn +++ b/doc/plugins/prettydate.mdwn @@ -1,5 +1,6 @@ [[!template id=plugin name=prettydate author="[[Joey]]"]] [[!tag type/date]] +[[!tag type/chrome]] Enabling this plugin changes the dates displayed on pages in the wiki to a format that is nice and easy to read. Examples: "late Wednesday evening, diff --git a/doc/plugins/progress.mdwn b/doc/plugins/progress.mdwn index e1b560cc8..20736d18c 100644 --- a/doc/plugins/progress.mdwn +++ b/doc/plugins/progress.mdwn @@ -1,5 +1,5 @@ [[!template id=plugin name=progress author="[[Will]]"]] -[[!tag type/meta]] +[[!tag type/widget]] This plugin provides the [[ikiwiki/directive/progress]] [[ikiwiki/directive]], which generates a progress bar. diff --git a/doc/plugins/rawhtml/discussion.mdwn b/doc/plugins/rawhtml/discussion.mdwn index e63e4acb9..9ed8230ba 100644 --- a/doc/plugins/rawhtml/discussion.mdwn +++ b/doc/plugins/rawhtml/discussion.mdwn @@ -2,4 +2,6 @@ Is there anyway to allow this only on locked pages? I'd like to be able to do r > Not at the moment. Long-term, ikiwiki needs some general permission mechanisms that encompass this sort of issue. --[[JoshTriplett]] ->> Thanks. Bummer though, looking forward to when this is possible. :-) -- Adam.
\ No newline at end of file +>> Thanks. Bummer though, looking forward to when this is possible. :-) -- Adam. + +> Well, this plugin is different from the [[html]] plugin. It **copies** html files. So users cannot do raw HTML via cgi. Thus it is safe in most cases. -- weakish diff --git a/doc/plugins/recentchanges.mdwn b/doc/plugins/recentchanges.mdwn index 4ab2cd078..823f68502 100644 --- a/doc/plugins/recentchanges.mdwn +++ b/doc/plugins/recentchanges.mdwn @@ -1,4 +1,5 @@ [[!template id=plugin name=recentchanges core=1 author="[[Joey]]"]] +[[!tag type/meta]] This plugin examines the [[revision_control_system|rcs]] history and generates a page describing each recent change made to the wiki. These @@ -24,3 +25,6 @@ Here's an example of how to show only changes that Joey didn't make. \[[!inline pages="internal(recentchanges/change_*) and !author(joey) and !author(http://joey.kitenet.net*)" template=recentchanges show=0]] + +If you want to generate feeds for the RecentChanges page, you have to use +[[`rss`_or_`atom`_in_the_setup_file|/todo/minor adjustment to setup documentation for recentchanges feeds]]. diff --git a/doc/plugins/recentchangesdiff.mdwn b/doc/plugins/recentchangesdiff.mdwn index a7b113ade..57299f92d 100644 --- a/doc/plugins/recentchangesdiff.mdwn +++ b/doc/plugins/recentchangesdiff.mdwn @@ -1,4 +1,5 @@ [[!template id=plugin name=recentchangesdiff core=0 author="[[Joey]]"]] +[[!tag type/meta]] This plugin extends the [[recentchanges]] plugin, adding a diff for each change. The diffs are by default hidden from display on the recentchanges diff --git a/doc/plugins/relativedate.mdwn b/doc/plugins/relativedate.mdwn index 3ada0864b..d6e8eb08b 100644 --- a/doc/plugins/relativedate.mdwn +++ b/doc/plugins/relativedate.mdwn @@ -1,5 +1,6 @@ [[!template id=plugin name=relativedate author="[[Joey]]"]] [[!tag type/date]] +[[!tag type/chrome]] This plugin lets dates be displayed in relative form. Examples: "2 days ago", "1 month and 3 days ago", "30 minutes ago". Hovering over the date will @@ -8,9 +9,3 @@ cause a tooltip to pop up with the absolute date. This only works in browsers with javascript enabled; other browsers will show the absolute date instead. Also, this plugin can be used with other plugins like [[prettydate]] that change how the absolute date is displayed. - -If this plugin is enabled, you may also add relative dates to pages in the -wiki, by using html elements in the "relativedate" class. For example, this -will display as a relative date: - - <span class="relativedate">Tue Jan 20 12:00:00 EDT 2009</span> diff --git a/doc/plugins/rename.mdwn b/doc/plugins/rename.mdwn index ddaede8b0..abb361329 100644 --- a/doc/plugins/rename.mdwn +++ b/doc/plugins/rename.mdwn @@ -2,7 +2,8 @@ [[!tag type/web]] This plugin allows pages or other files to be renamed using the web -interface. +interface. Following Unix tradition, renaming also allows moving to a +different directory. Users can only rename things that they are allowed to edit or upload. diff --git a/doc/plugins/repolist.mdwn b/doc/plugins/repolist.mdwn index 9b3a7575e..efd9c9352 100644 --- a/doc/plugins/repolist.mdwn +++ b/doc/plugins/repolist.mdwn @@ -1,5 +1,5 @@ [[!template id=plugin name=repolist author="[[Joey]]"]] -[[!tag type/useful]] +[[!tag type/web]] This plugin allows you to configure ikiwiki with the location of [[rcs]] repositories for your wiki's source. 
This is done via the diff --git a/doc/plugins/rst/discussion.mdwn b/doc/plugins/rst/discussion.mdwn index 9909784d5..c84a6218e 100644 --- a/doc/plugins/rst/discussion.mdwn +++ b/doc/plugins/rst/discussion.mdwn @@ -41,3 +41,41 @@ such since they are inline elements in the text.. But images work fine in rst's syntax.. what about using rst syntax for wikilinks as well? Is it possible to inject something into the parser to turn unmached links ``WikiLink`_` into ikiwiki links? --ulrik + +------ + +Resolving WikiLinks in rst +========================== + +I wanted to look into if we can hook into rst and influence how links are resolved. +It turns out it is possible, and I have a working WIP for the rst plugin that does this. + +My work in progress for `/usr/lib/ikiwiki/plugins/rst` is here: +[[todo/Resolve native reStructuredText links to ikiwiki pages]] + +It basically matches normal rst links just like ikiwiki would match a wikilink +if it existed. +I can't read perl so I haven't found out so much. The plugin successfully registers backlinks using +`proxy.rpc('add_link', on_page, bestlink)` (since the destination page will be rebuilt to update), +but the backlinks don't show up. + +I converted one of my pages to rst: + +Before: <http://kaizer.se/wiki/kupfer-mdwn> +After: <http://kaizer.se/wiki/kupfer-rst> + +I need help on a couple of points + +* How to fix the backlinks with `add_link`? +* How to generate NonExistingLinks using the plugin API? +* Can we include this in ikiwiki's rst if it is not too hairy? + +--ulrik + + +---- + +> The main problem with more sophisticated RST support is that ikiwiki turns +preprocessor directives into raw HTML and reST hates inline HTML. + +Is it possible for ikiwiki to store preprocessor directives in memory, and replace them with place holders, then do the rst process. After the rst processing, process the preprocessor directives and replace place holders. --[[weakish]] diff --git a/doc/plugins/rsync.mdwn b/doc/plugins/rsync.mdwn new file mode 100644 index 000000000..e48886168 --- /dev/null +++ b/doc/plugins/rsync.mdwn @@ -0,0 +1,19 @@ +[[!template id=plugin name=rsync author="[[schmonz]]"]] +[[!tag type/special-purpose]] + +This plugin allows ikiwiki to push generated pages to another host +by running a command such as `rsync`. + +The command to run is specified by setting `rsync_command` in your setup +file. The command will be run in your destdir, so something like this +is a typical command: + + rsync_command => 'rsync -qa --delete . user@host:/path/to/docroot/', + +If using rsync over ssh, you will need to enable noninteractive ssh login +to the remote host. It's also a good idea to specify the exact command line +to be permitted in the remote host's `$HOME/.ssh/authorized_keys`. + +A typical ikiwiki configuration when using this plugin is to disable cgi +support, so ikiwiki builds a completely static site that can be served from +the remote host. diff --git a/doc/plugins/rsync/discussion.mdwn b/doc/plugins/rsync/discussion.mdwn new file mode 100644 index 000000000..ef0fa9967 --- /dev/null +++ b/doc/plugins/rsync/discussion.mdwn @@ -0,0 +1,79 @@ +## A use case + +Why I needed this plugin: I have two web servers available to me +for a project. Neither does everything I need, but together they +do. (This is a bit like the [Amazon S3 +scenario](http://kitenet.net/~joey/blog/entry/running_a_wiki_on_Amazon_S3/).) + +Server (1) is a university web server. 
It provides plentiful space +and bandwidth, easy authentication for people editing the wiki, and +a well-known stable URL. The wiki really wants to live here and +very easily could except that the server doesn't allow arbitrary +CGIs. + +Server (2) is provided by a generous alumnus's paid [[tips/DreamHost]] +account. Disk and particularly network usage need to be minimized +because over some threshold it costs him. CGI, etc. are available. + +My plan was to host the wiki on server (1) by taking advantage of +server (2) to store the repository, source checkout, and generated +pages, to host the repository browser, and to handle ikiwiki's CGI +operations. In order for this to work, web edits on (2) would need +to automatically push any changed pages to (1). + +As a proof of concept, I added an rsync post-commit hook after +ikiwiki's usual. It worked, just not for web edits, which is how +the wiki will be used. So I wrote this plugin to finish the job. +The wiki now lives on (1), and clicking "edit" just works. --[[schmonz]] + +> Just out of interest, why use `rsync` and not `git push`. i.e. a +> different setup to solve the same problem would be to run a +> normal ikiwiki setup on the universities server with its git +> repository available over ssh (same security setup your using +> for rsync should work for git over ssh). On the cgi-capable server, +> when it would rsync, make it git push. It would seem that git +> has enough information that it should be able to be more +> network efficient. It also means that corruption at one end +> wouldn't be propagated to the other end. -- [[Will]] + +>> Hey, that's a nice solution. (The site was in svn to begin with, +>> but it's in git now.) One advantage of my approach in this particular +>> case: server (1) doesn't have `git` installed, but does have `rsync`, +>> so (1)'s environment can remain completely untweaked other than the +>> SSH arrangement. I kind of like that all the sysadmin effort is +>> contained on one host. +>> +>> This plugin is definitely still useful for projects not able to use +>> a DVCS (of which I've got at least one other), and possibly for +>> other uses not yet imagined. ;-) --[[schmonz]] + +>>> I'm now using this plugin for an additional purpose. One of the aforementioned wikis (there are actually two) can only be read by trusted users, the list of which is kept in an `.htaccess` file. I added it to git as `htaccess.txt`, enabled the [[plugins/txt]] plugin, and in my `rsync_command` script, have it copied to the destdir as `.htaccess` before calling `rsync`. Now my users (who aren't tech-savvy, but are trustworthy) can edit the access list directly in the wiki. This idea might also be useful for wikis not using `rsync` at all. --[[schmonz]] + +---- + +Revew: --[[Joey]] + +* I think it should not throw an error if no command is set. Just don't do anything. +* If the rsync fails, it currently errors out, which will probably also leave + the wiki in a broken state, since ikiwiki will not get a chance to save + its state. This seems fragile; what if the laptop is offline, or the + server is down, etc. Maybe it should just warn if the rsync fails? +* Is a new hook really needed? The savestate hook runs at a similar time; + only issue with it is that it is run even when ikiwiki has not + rendered any updated pages. Bah, I think you do need the new hook, how + annoying.. + +> * Depends whether the plugin would be on by default. If yes, then yes. +> If the admin has to enable it, I'd think they'd want the error. 
+> * Changed the other errors to warnings. +> * The name might be wrong: there isn't anything rsync-specific about the +> plugin, that's just the command I personally need to run. --[[schmonz]] + +>> One problem with the error is that it prevents dumping a new setup file with +>> the plugin enabled, and then editing it to configure. ie: + + joey@gnu:~>ikiwiki -setup .ikiwiki/joeywiki.setup -plugin rsync -dumpsetup new.setup + Must specify rsync_command + +> rsync seems by far the most likely command, though someone might use something +> to push via ftp instead. I think calling it rsync is ok. --[[Joey]] diff --git a/doc/plugins/search.mdwn b/doc/plugins/search.mdwn index 92cc5945a..e95739cf3 100644 --- a/doc/plugins/search.mdwn +++ b/doc/plugins/search.mdwn @@ -4,7 +4,7 @@ This plugin adds full text search to ikiwiki, using the [xapian](http://xapian.org/) engine, its [omega](http://xapian.org/docs/omega/overview.html) frontend, and the -[[!cpan Search::Xapian]], [[!cpan Digest::SHA1]], and [[!cpan HTML::Scrubber]] +[[!cpan Search::Xapian]], [[!cpan Digest::SHA]], and [[!cpan HTML::Scrubber]] perl modules. The [[ikiwiki/searching]] page describes how to write search queries. diff --git a/doc/plugins/shortcut.mdwn b/doc/plugins/shortcut.mdwn index cca1f4bdd..1e8e85ed8 100644 --- a/doc/plugins/shortcut.mdwn +++ b/doc/plugins/shortcut.mdwn @@ -1,5 +1,5 @@ [[!template id=plugin name=shortcut author="[[Joey]]"]] -[[!tag type/format]] +[[!tag type/widget]] This plugin provides the [[ikiwiki/directive/shortcut]] [[ikiwiki/directive]]. It allows external links to commonly linked to sites to be made diff --git a/doc/plugins/shortcut/discussion.mdwn b/doc/plugins/shortcut/discussion.mdwn index 4e11ce08c..2e2b1b281 100644 --- a/doc/plugins/shortcut/discussion.mdwn +++ b/doc/plugins/shortcut/discussion.mdwn @@ -9,4 +9,10 @@ Maybe use the `default_pageext` is better than hardcode .mdwn? > done, it will use `default_pageext` now --[[Joey]] +--- +Instead of modifying the [[basewiki]]'s [[shortcuts]] file for local needs -- +thus copying it at some point and losing continuity with upstream enhancements -- +what about handling a `shortcuts-local.mdwn` or `shortcuts/local.mdwn` (if such +a file exists in the wiki), and additionally process that one. Possibily a +conditional `\[[!inline]]` could be used. --[[tschwinge]] diff --git a/doc/plugins/sidebar.mdwn b/doc/plugins/sidebar.mdwn index 36982eff3..012733456 100644 --- a/doc/plugins/sidebar.mdwn +++ b/doc/plugins/sidebar.mdwn @@ -1,21 +1,28 @@ [[!template id=plugin name=sidebar author="Tuomo Valkonen"]] [[!tag type/chrome]] -If this plugin is enabled, then a sidebar is added to pages in the wiki. -The content of the sidebar is simply the content of a page named -"sidebar" (ie, create a "sidebar.mdwn"). +This plugin allows adding a sidebar to pages in the wiki. + +By default, and unless the `global_sidebars` setting is turned off, +a sidebar is added to all pages in the wiki. The content of the sidebar +is simply the content of a page named "sidebar" (ie, create a "sidebar.mdwn"). Typically this will be a page in the root of the wiki, but it can also be a [[ikiwiki/SubPage]]. In fact, this page, [[plugins/sidebar|plugins/sidebar]], will be treated as a sidebar for the [[plugins]] page, and of all of its SubPages, if the plugin is enabled. -Note that to disable a sidebar for a [[ikiwiki/SubPage]] of a page that has -a sidebar, you can create a sidebar page that is completely empty. This -will turn off the sidebar altogether. 
+There is also a [[ikiwiki/directive/sidebar]] directive that can be used +to provide a custom sidebar content for a page. + +---- + +Warning: Any change to the sidebar page will cause a rebuild of the whole +wiki, since every page includes a copy that has to be updated. This can +especially be a problem if the sidebar includes an +[[ikiwiki/directive/inline]] directive, since any changes to pages inlined +into the sidebar will change the sidebar and cause a full wiki rebuild. -Warning: Any change to the sidebar will cause a rebuild of the whole wiki, -since every page includes a copy that has to be updated. This can -especially be a problem if the sidebar includes [[inline]] or [[map]] -directives, since any changes to pages inlined or mapped onto the sidebar -will change the sidebar and cause a full wiki rebuild. +Instead, if you include a [[ikiwiki/directive/map]] directive on the sidebar, +and it does not use the `show` parameter, only adding or removing pages +included in the map will cause a full rebuild. Modifying pages will not. diff --git a/doc/plugins/sidebar/discussion.mdwn b/doc/plugins/sidebar/discussion.mdwn new file mode 100644 index 000000000..eb441529c --- /dev/null +++ b/doc/plugins/sidebar/discussion.mdwn @@ -0,0 +1,5 @@ +> Warning: Any change to the sidebar will cause a rebuild of the whole wiki, since every page includes a copy that has to be updated. This can especially be a problem if the sidebar includes inline or map directives, since any changes to pages inlined or mapped onto the sidebar will change the sidebar and cause a full wiki rebuild. + +I tried exactly that, namely having an inline in my sidebar to include an rss feed from some other side. I think the complete wiki rebuild should be doable every few days when a new article appears in that feed. But contrary to that warning there is no complete wiki rebuild, only the sidebar page is rebuilt by the "ikiwiki --aggregate" from cron. Is that a bug or a feature? + +> It's a bug, discussed in [[bugs/transitive_dependencies]]. --[[Joey]] diff --git a/doc/plugins/sortnaturally.mdwn b/doc/plugins/sortnaturally.mdwn new file mode 100644 index 000000000..a16381946 --- /dev/null +++ b/doc/plugins/sortnaturally.mdwn @@ -0,0 +1,6 @@ +[[!template id=plugin name=sortnaturally core=1 author="[[chrysn]], [[smcv]]"]] +[[!tag type/meta]] + +This plugin provides the `title_natural` [[ikiwiki/pagespec/sorting]] +order, which uses [[!cpan Sort::Naturally]] to sort numbered pages in a +more natural order. diff --git a/doc/plugins/sparkline.mdwn b/doc/plugins/sparkline.mdwn index bcc5daec6..83e24a27d 100644 --- a/doc/plugins/sparkline.mdwn +++ b/doc/plugins/sparkline.mdwn @@ -1,5 +1,5 @@ [[!template id=plugin name=sparkline author="[[Joey]]"]] -[[!tag type/chrome]] +[[!tag type/widget]] This plugin provides the [[ikiwiki/directive/sparkline]] [[ikiwiki/directive]], which allows for easily embedding sparklines into @@ -16,7 +16,7 @@ to use the plugin, you will need: php can find it when `sparkline/Sparkline.php` is required. * The GD PHP module used by the Sparkline library. * A "php" program in the path, that can run standalone php programs. 
-* [[!cpan Digest::SHA1]] +* [[!cpan Digest::SHA]] On a Debian system, this can be accomplished by installing these packages: `libsparkline-php` `php5-gd` `php5-cli` `libdigest-sha1-perl` diff --git a/doc/plugins/table.mdwn b/doc/plugins/table.mdwn index 7b080acda..fe66f90a8 100644 --- a/doc/plugins/table.mdwn +++ b/doc/plugins/table.mdwn @@ -1,8 +1,8 @@ [[!template id=plugin name=table author="[[VictorMoral]]"]] -[[!tag type/format]] +[[!tag type/widget]] This plugin provides the [[ikiwiki/directive/table]] [[ikiwiki/directive]]. It can build HTML tables from data in CSV (comma-separated values) -or DSV (delimiter-separated values) format. +or DSV ([delimiter-separated values](http://en.wikipedia.org/wiki/Delimiter-separated_values)) format. It needs the perl module [[!cpan Text::CSV]] for the CSV data. diff --git a/doc/plugins/table/discussion.mdwn b/doc/plugins/table/discussion.mdwn index 16e5a5a56..86572c935 100644 --- a/doc/plugins/table/discussion.mdwn +++ b/doc/plugins/table/discussion.mdwn @@ -64,3 +64,10 @@ Do you like it? Can you implement the same in Ikiwiki? :) --[[Paweł|ptecza]] >> was written rather for simple usage. However cell alignment is very >> helpful feature, so I think the plugin should be able to do it. >> --[[Paweł|ptecza]] + +----- + +If you see `[[!table\ Error: ]]` you probably need to `sudo apt-get install libtext-csv-perl`. + +> Perhaps more helpfully, ikiwiki 3.1415926 fixes display of such errors to +> actualy include the error message. --[[Joey]] diff --git a/doc/plugins/tag.mdwn b/doc/plugins/tag.mdwn index 8ff70a069..8e1286e62 100644 --- a/doc/plugins/tag.mdwn +++ b/doc/plugins/tag.mdwn @@ -8,6 +8,13 @@ These directives allow tagging pages. It also provides the `tagged()` [[ikiwiki/PageSpec]], which can be used to match pages that are tagged with a specific tag. +The `tagbase` setting can be used to make tags default to being put in a +particular subdirectory. + +The `tag_autocreate` setting can be used to control whether new tag pages +are created as needed. It defaults to being done only if a `tagbase` is +set. + [[!if test="enabled(tag)" then=""" This wiki has the tag plugin enabled, so you'll see a note below that this page is tagged with the "tags" tag. diff --git a/doc/plugins/tag/discussion.mdwn b/doc/plugins/tag/discussion.mdwn index b6dab5358..dfd749252 100644 --- a/doc/plugins/tag/discussion.mdwn +++ b/doc/plugins/tag/discussion.mdwn @@ -16,7 +16,7 @@ Thanks. That works fine. @Ben: could you publish the code for that? ---David Riebenbauer <davrieb@htu.tugraz.at> +--[[David_Riebenbauer]] AOLMODE=true echo "I too would really like this feature, which would make cgi free life much better" --[[DavidBremner]] @@ -28,3 +28,4 @@ See [[todo/auto-create tag pages according to a template]] -- Jeremy Schultz <jeremy.schultz@uleth.ca> +`tag_autocreate` can now enable this. --[[Joey]] diff --git a/doc/plugins/template.mdwn b/doc/plugins/template.mdwn index 3485fe64c..8d17e2825 100644 --- a/doc/plugins/template.mdwn +++ b/doc/plugins/template.mdwn @@ -1,7 +1,7 @@ [[!template id=plugin name=template author="[[Joey]]"]] -[[!tag type/format]] +[[!tag type/widget]] This plugin provides the [[ikiwiki/directive/template]] [[ikiwiki/directive]]. With this plugin, you can set up templates, and cause them to be filled out -and inserted into pages in the wiki. It's documented and existing templates -are listed in the [[templates]] page. +and inserted into pages in the wiki. Existing templates are listed in the +[[templates]] page. 
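For the `tagbase` and `tag_autocreate` settings described in the tag plugin notes above, a minimal setup-file sketch (the values are illustrative):

    tagbase => 'tags',      # new tags default to living under tags/
    tag_autocreate => 1,    # auto-create missing tag pages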
diff --git a/doc/plugins/testpagespec.mdwn b/doc/plugins/testpagespec.mdwn
index dabcb0bec..8180d5d4b 100644
--- a/doc/plugins/testpagespec.mdwn
+++ b/doc/plugins/testpagespec.mdwn
@@ -1,5 +1,5 @@
 [[!template id=plugin name=testpagespec author="[[Joey]]"]]
-[[!tag type/useful]]
+[[!tag type/special-purpose]]
 
 This plugin provides a [[ikiwiki/directive/testpagespec]] [[ikiwiki/directive]].
 The directive allows testing a [[ikiwiki/PageSpec]] to see if it matches a
diff --git a/doc/plugins/teximg.mdwn b/doc/plugins/teximg.mdwn
index ae052837f..f3cade85f 100644
--- a/doc/plugins/teximg.mdwn
+++ b/doc/plugins/teximg.mdwn
@@ -1,5 +1,5 @@
 [[!template id=plugin name=teximg author="[[PatrickWinnertz]]"]]
-[[!tag type/chrome type/slow]]
+[[!tag type/widget type/slow]]
 
 This plugin provides a [[ikiwiki/directive/teximg]] [[ikiwiki/directive]],
 that renders LaTeX formulas into images.
diff --git a/doc/plugins/theme.mdwn b/doc/plugins/theme.mdwn
new file mode 100644
index 000000000..7149cc163
--- /dev/null
+++ b/doc/plugins/theme.mdwn
@@ -0,0 +1,11 @@
+[[!template id=plugin name=theme author="[[Joey]]"]]
+[[!tag type/web]]
+
+The theme plugin allows easily applying a theme to your wiki, by
+configuring the `theme` setting in the setup file with the name of a theme
+to use. The themes you can choose from are all subdirectories, typically
+inside `/usr/share/ikiwiki/themes/`.
+
+A theme provides, via the underlay, an enhanced version of the regular
+[[style.css]]. This leaves [[local.css]] free for you to further
+customise. Themes can also provide header and background images.
diff --git a/doc/plugins/theme/discussion.mdwn b/doc/plugins/theme/discussion.mdwn
new file mode 100644
index 000000000..9331713db
--- /dev/null
+++ b/doc/plugins/theme/discussion.mdwn
@@ -0,0 +1,20 @@
+### What license do themes need to have for distribution?
+
+Could someone specify what license the themes need to have to get
+distributed in ikiwiki or Debian? The currently included theme seems to be
+under the GPLv2. Does the [Creative Commons Attribution 3.0 Unported
+License](http://creativecommons.org/licenses/by/3.0/) also work? This way a
+lot of free CSS templates could be included, e.g. from
+[freecsstemplates.org](http://www.freecsstemplates.org/). --PaulePanter
+
+> Paule, I'd love it if you did that! The only hard requirement on themes
+> included in ikiwiki is that they need to be licensed with a [DFSG
+> compatible license](https://wiki.debian.org/DFSGLicenses). CC-BY-SA 3.0
+> is DFSG; CC-BY is apparently being accepted by Debian too.
+>
+> As a soft requirement, I may exercise some discretion about themes that
+> require obtrusive attribution links be included on every page of a
+> site using the theme. While probably DFSG, that adds a requirement
+> that ikiwiki itself does not require. --[[Joey]]
+
+### Once one has enabled the 'theme' plugin in the setup file, how does one use themes?
diff --git a/doc/plugins/toc.mdwn b/doc/plugins/toc.mdwn
index 2b7686681..a0ad3a5d0 100644
--- a/doc/plugins/toc.mdwn
+++ b/doc/plugins/toc.mdwn
@@ -1,5 +1,5 @@
 [[!template id=plugin name=toc author="[[Joey]]"]]
-[[!tag type/chrome]]
+[[!tag type/widget]]
 
 This plugin provides the [[ikiwiki/directive/toc]] [[ikiwiki/directive]],
 which adds a table of contents to a page.
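In answer to the theme usage question above: enabling the plugin and naming a theme in the setup file is all that is needed. A minimal sketch — the theme name here is only an example, and must match an installed subdirectory of `/usr/share/ikiwiki/themes/`:

    add_plugins => ['theme'],
    theme => 'actiontabs',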
diff --git a/doc/plugins/toggle.mdwn b/doc/plugins/toggle.mdwn index 69ac613e0..d1500eba0 100644 --- a/doc/plugins/toggle.mdwn +++ b/doc/plugins/toggle.mdwn @@ -1,5 +1,5 @@ [[!template id=plugin name=toggle author="[[Joey]]"]] -[[!tag type/chrome]] +[[!tag type/widget]] This plugin provides the [[ikiwiki/directive/toggle]] and [[ikiwiki/directive/toggleable]] [[directives|ikiwiki/directive]]. diff --git a/doc/plugins/txt.mdwn b/doc/plugins/txt.mdwn index 420898d09..a3087c9e0 100644 --- a/doc/plugins/txt.mdwn +++ b/doc/plugins/txt.mdwn @@ -12,3 +12,8 @@ The only exceptions are that [[WikiLinks|ikiwiki/WikiLink]] and [[directives|ikiwiki/directive]] are still expanded by ikiwiki, and that, if the [[!cpan URI::Find]] perl module is installed, URLs in the txt file are converted to hyperlinks. + +---- + +As a special case, a file `robots.txt` will be copied intact into the +`destdir`, as well as creating a wiki page named "robots". diff --git a/doc/plugins/type/chrome.mdwn b/doc/plugins/type/chrome.mdwn index d3f0eb3d3..a1c6d0728 100644 --- a/doc/plugins/type/chrome.mdwn +++ b/doc/plugins/type/chrome.mdwn @@ -1 +1 @@ -These plugins affect the look and feel of the wiki. +These plugins affect the look and feel of the overall wiki. diff --git a/doc/plugins/type/useful.mdwn b/doc/plugins/type/useful.mdwn deleted file mode 100644 index 92fcf5af1..000000000 --- a/doc/plugins/type/useful.mdwn +++ /dev/null @@ -1 +0,0 @@ -These plugins perform various miscellaneous useful functions. diff --git a/doc/plugins/type/widget.mdwn b/doc/plugins/type/widget.mdwn new file mode 100644 index 000000000..875829d0b --- /dev/null +++ b/doc/plugins/type/widget.mdwn @@ -0,0 +1,2 @@ +These plugins allow inserting various things into pages via a +[[ikiwiki/directive]]. diff --git a/doc/plugins/typography.mdwn b/doc/plugins/typography.mdwn index 030ef8052..9ff6c4ffd 100644 --- a/doc/plugins/typography.mdwn +++ b/doc/plugins/typography.mdwn @@ -1,5 +1,5 @@ [[!template id=plugin name=typography author="[[Roktas]]"]] -[[!tag type/format]] +[[!tag type/chrome]] This plugin, also known as [SmartyPants](http://daringfireball.net/projects/smartypants/), translates diff --git a/doc/plugins/underlay.mdwn b/doc/plugins/underlay.mdwn index 09d096a6e..0cf819472 100644 --- a/doc/plugins/underlay.mdwn +++ b/doc/plugins/underlay.mdwn @@ -1,14 +1,14 @@ [[!template id=plugin name=underlay author="[[Simon_McVittie|smcv]]"]] -[[!tag type/useful]] +[[!tag type/special-purpose]] -This plugin adds an `add_underlays` option to the `.setup` file. -Its value is a list of underlay directories whose content is added to the wiki. +This plugin adds an `add_underlays` option to the setup file. Its value is +a list of underlay directories whose content is added to the wiki. Multiple underlays are normally set up automatically by other plugins (for -instance, the images used by the [[plugins/smiley]] plugin), but they can also be -used as a way to pull in external files that you don't want in revision control, -like photos or software releases. +instance, the images used by the [[plugins/smiley]] plugin), but they can +also be used as a way to pull in external files that you don't want in +revision control, like photos or software releases. -Directories in `add_underlays` should usually be absolute. If relative, they're -interpreted as relative to the parent directory of the basewiki underlay, which -is probably not particularly useful in this context. +Directories in `add_underlays` should usually be absolute. 
If relative, +they're interpreted as relative to the parent directory of the basewiki +underlay, which is probably not particularly useful in this context. diff --git a/doc/plugins/version.mdwn b/doc/plugins/version.mdwn index 43027bdd7..326a2e7ce 100644 --- a/doc/plugins/version.mdwn +++ b/doc/plugins/version.mdwn @@ -1,5 +1,6 @@ [[!template id=plugin name=version author="[[Joey]]"]] [[!tag type/meta]] +[[!tag type/widget]] This plugin provides the [[ikiwiki/directive/version]] [[ikiwiki/directive]], which inserts the current version diff --git a/doc/plugins/websetup.mdwn b/doc/plugins/websetup.mdwn index f1756ba8f..a20a32489 100644 --- a/doc/plugins/websetup.mdwn +++ b/doc/plugins/websetup.mdwn @@ -2,7 +2,7 @@ [[!tag type/web]] This plugin allows wiki admins to configure the wiki using a web interface, -rather than editing the setup file directly. A "Wiki Setup" button is added +rather than editing the setup file directly. A "Setup" button is added to the admins' preferences page. Warning: This plugin rewrites your setup file. Any comments or unusual @@ -16,7 +16,8 @@ enabled and disabled using it too. Some settings are not considered safe enough to be manipulated over the web; these are still shown, by default, but cannot be modified. To hide them, set `websetup_show_unsafe` to false in the setup file. A few settings have too complex a data type to be -configured via the web. +configured via the web. To mark additional settings as unsafe, you can +list them in `websetup_unsafe`. Plugins that should not be enabled/disabled via the web interface can be listed in `websetup_force_plugins` in the setup file. diff --git a/doc/plugins/wmd.mdwn b/doc/plugins/wmd.mdwn index dc9a30703..96c6e2e6c 100644 --- a/doc/plugins/wmd.mdwn +++ b/doc/plugins/wmd.mdwn @@ -1,5 +1,5 @@ [[!template id=plugin name=wmd author="[[Will]]"]] -[[!tag type/chrome]] +[[!tag type/web]] [WMD](http://wmd-editor.com/) is a What You See Is What You Mean editor for [[mdwn]]. This plugin makes WMD be used for editing pages in the wiki. diff --git a/doc/plugins/write.mdwn b/doc/plugins/write.mdwn index 3976f9adf..10b4df835 100644 --- a/doc/plugins/write.mdwn +++ b/doc/plugins/write.mdwn @@ -3,8 +3,84 @@ written to extend ikiwiki in many ways. Despite the length of this page, it's not really hard. This page is a complete reference to everything a plugin might want to do. There is also a quick [[tutorial]]. +[[!template id="note" text=""" +Ikiwiki is a compiler + +One thing to keep in mind when writing a plugin is that ikiwiki is a wiki +*compiler*. So plugins influence pages when they are built, not when they +are loaded. A plugin that inserts the current time into a page, for +example, will insert the build time. + +Also, as a compiler, ikiwiki avoids rebuilding pages unless they have +changed, so a plugin that prints some random or changing thing on a page +will generate a static page that won't change until ikiwiki rebuilds the +page for some other reason, like the page being edited. + +The [[tutorial]] has some other examples of ways that ikiwiki being a +compiler may trip up the unwary. +"""]] + [[!toc levels=2]] +## Highlevel view of ikiwiki + +Ikiwiki mostly has two modes of operation. It can either be running +as a compiler, building or updating a wiki; or as a cgi program, providing +user interface for editing pages, etc. Almost everything ikiwiki does +is accomplished by calling various hooks provided by plugins. + +### compiler + +As a compiler, ikiwiki starts by calling the `refresh` hook. 
Then it checks +the wiki's source to find new or changed pages. The `needsbuild` hook is +then called to allow manipulation of the list of pages that need to be +built. + +Now that it knows what pages it needs to build, ikiwiki runs two +compile passes. First, it runs `scan` hooks, which collect metadata about +the pages. Then it runs a page rendering pipeline, by calling in turn these +hooks: `filter`, `preprocess`, `linkify`, `htmlize`, `indexhtml`, +`pagetemplate`, `sanitize`, `format`. + +After all necessary pages are built, it calls the `change` hook. Finally, +if a page is was deleted, the `delete` hook is called, and the files that +page had previously produced are removed. + +### cgi + +The flow between hooks when ikiwiki is run as a cgi is best illustrated by +an example. + +Alice browses to a page and clicks Edit. + +* Ikiwiki is run as a cgi. It assigns Alice a session cookie, and, + by calling the `auth` hooks, sees that she is not yet logged in. +* The `sessioncgi` hooks are then called, and one of them, + from the [[editpage]] plugin, notices that the cgi has been told "do=edit". +* The [[editpage]] plugin calls the `canedit` hook to check if this + page edit is allowed. The [[signinedit]] plugin has a hook that says not: + Alice is not signed in. +* The [[signinedit]] plugin then launches the signin process. A signin + page is built by calling the `formbuilder_setup` hook. + +Alice signs in with her openid. + +* The [[openid]] plugin's `formbuilder` hook sees that an openid was + entered in the signin form, and redirects to Alice's openid provider. +* Alice's openid provider calls back to ikiwiki. The [[openid]] plugin + has an `auth` hook that finishes the openid signin process. +* Signin complete, ikiwiki returns to what Alice was doing before; editing + a page. +* Now all the `canedit` hooks are happy. The [[editpage]] plugin calls + `formbuilder_setup` to display the page editing form. + +Alice saves her change to the page. + +* The [[editpage]] plugin's `formbuilder` hook sees that the Save button + was pressed, and calls the `checkcontent` and `editcontent` hooks. + Then it saves the page to disk, and branches into the compiler part + of ikiwiki to refresh the wiki. + ## Types of plugins Most ikiwiki [[plugins]] are written in perl, like ikiwiki. This gives the @@ -31,16 +107,20 @@ they're the same as far as how they hook into ikiwiki. This document will explain how to write both sorts of plugins, albeit with an emphasis on perl plugins. -## Considerations +## Plugin interface -One thing to keep in mind when writing a plugin is that ikiwiki is a wiki -*compiler*. So plugins influence pages when they are built, not when they -are loaded. A plugin that inserts the current time into a page, for -example, will insert the build time. Also, as a compiler, ikiwiki avoids -rebuilding pages unless they have changed, so a plugin that prints some -random or changing thing on a page will generate a static page that won't -change until ikiwiki rebuilds the page for some other reason, like the page -being edited. +To import the ikiwiki plugin interface: + + use IkiWiki '3.00'; + +This will import several variables and functions into your plugin's +namespace. These variables and functions are the ones most plugins need, +and a special effort will be made to avoid changing them in incompatible +ways, and to document any changes that have to be made in the future. + +Note that IkiWiki also provides other variables and functions that are not +exported by default. 
No guarantee is made about these in the future, so if +it's not exported, the wise choice is to not use it. ## Registering plugins @@ -68,20 +148,21 @@ In roughly the order they are called. This allows for plugins to perform their own processing of command-line options and so add options to the ikiwiki command line. It's called during -command line processing, with @ARGV full of any options that ikiwiki was +command line processing, with `@ARGV` full of any options that ikiwiki was not able to process on its own. The function should process any options it -can, removing them from @ARGV, and probably recording the configuration -settings in %config. It should take care not to abort if it sees +can, removing them from `@ARGV`, and probably recording the configuration +settings in `%config`. It should take care not to abort if it sees an option it cannot process, and should just skip over those options and -leave them in @ARGV. +leave them in `@ARGV`. ### checkconfig hook(type => "checkconfig", id => "foo", call => \&checkconfig); This is useful if the plugin needs to check for or modify ikiwiki's -configuration. It's called early in the startup process. The -function is passed no values. It's ok for the function to call +configuration. It's called early in the startup process. `%config` +is populated at this point, but other state has not yet been loaded. +The function is passed no values. It's ok for the function to call `error()` if something isn't configured right. ### refresh @@ -117,8 +198,8 @@ value is ignored. hook(type => "filter", id => "foo", call => \&filter); -Runs on the raw source of a page, before anything else touches it, and can -make arbitrary changes. The function is passed named parameters "page", +Runs on the full raw source of a page, before anything else touches it, and +can make arbitrary changes. The function is passed named parameters "page", "destpage", and "content". It should return the filtered content. ### preprocess @@ -201,11 +282,22 @@ like `Makefile` that have no extension. If `hook` is passed an optional "longname" parameter, this value is used when prompting a user to choose a page type on the edit page form. +### indexhtml + + hook(type => "indexhtml", id => "foo", call => \&indexhtml); + +This hook is called once the page has been converted to html (but before +the generated html is put in a template). The most common use is to +update search indexes. Added in ikiwiki 2.54. + +The function is passed named parameters "page", "destpage", and "content". +Its return value is ignored. + ### pagetemplate hook(type => "pagetemplate", id => "foo", call => \&pagetemplate); -[[Templates|wikitemplates]] are filled out for many different things in +[[Templates]] are filled out for many different things in ikiwiki, like generating a page, or part of a blog page, or an rss feed, or a cgi. This hook allows modifying the variables available on those templates. The function is passed named parameters. The "page" and @@ -221,11 +313,20 @@ a new custom parameter to the template. hook(type => "templatefile", id => "foo", call => \&templatefile); -This hook allows plugins to change the [[template|wikitemplates]] that is +This hook allows plugins to change the [[template|templates]] that is used for a page in the wiki. The hook is passed a "page" parameter, and -should return the name of the template file to use, or undef if it doesn't -want to change the default ("page.tmpl"). Template files are looked for in -/usr/share/ikiwiki/templates by default. 
+should return the name of the template file to use (relative to the +template directory), or undef if it doesn't want to change the default +("page.tmpl"). + +### pageactions + + hook(type => "pageactions", id => "foo", call => \&pageactions); + +This hook allows plugins to add arbitrary actions to the action bar on a +page (next to Edit, RecentChanges, etc). The hook is passed a "page" +parameter, and can return a list of html fragments to add to the action +bar. ### sanitize @@ -237,17 +338,6 @@ modify the body of a page after it has been fully converted to html. The function is passed named parameters: "page", "destpage", and "content", and should return the sanitized content. -### postscan - - hook(type => "postscan", id => "foo", call => \&postscan); - -This hook is called once the full page body is available (but before the -format hook). The most common use is to update search indexes. Added in -ikiwiki 2.54. - -The function is passed named parameters "page" and "content". Its return -value is ignored. - ### format hook(type => "format", id => "foo", call => \&format); @@ -455,7 +545,13 @@ The data returned is a list of `%config` options, followed by a hash describing the option. There can also be an item named "plugin", which describes the plugin as a whole. For example: - return + return + plugin => { + description => "description of this plugin", + safe => 1, + rebuild => 1, + section => "misc", + }, option_foo => { type => "boolean", description => "enable foo?", @@ -470,11 +566,6 @@ describes the plugin as a whole. For example: safe => 1, rebuild => 0, }, - plugin => { - description => "description of this plugin", - safe => 1, - rebuild => 1, - }, * `type` can be "boolean", "string", "integer", "pagespec", or "internal" (used for values that are not user-visible). The type is @@ -495,29 +586,38 @@ describes the plugin as a whole. For example: the plugin) will require a wiki rebuild, false if no rebuild is needed, and undef if a rebuild could be needed in some circumstances, but is not strictly required. +* `section` can optionally specify which section in the config file + the plugin fits in. The convention is to name the sections the + same as the tags used for [[plugins|plugin]] on this wiki. -## Plugin interface +### genwrapper -To import the ikiwiki plugin interface: + hook(type => "genwrapper", id => "foo", call => \&genwrapper); - use IkiWiki '3.00'; +This hook is used to inject C code (which it returns) into the `main` +function of the ikiwiki wrapper when it is being generated. -This will import several variables and functions into your plugin's -namespace. These variables and functions are the ones most plugins need, -and a special effort will be made to avoid changing them in incompatible -ways, and to document any changes that have to be made in the future. +The code runs before anything else -- in particular it runs before +the suid wrapper has sanitized its environment. -Note that IkiWiki also provides other variables and functions that are not -exported by default. No guarantee is made about these in the future, so if -it's not exported, the wise choice is to not use it. +### disable + + hook(type => "disable", id => "foo", call => \&disable); + +This hook is only run when a previously enabled plugin gets disabled +during ikiwiki setup. Plugins can use this to perform cleanups. 
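Pulling the hook machinery above together, a complete (if minimal) plugin is quite short. The following is only an illustrative sketch, not a shipped plugin: the name `myplugin` and the directive's output are invented for the example, and it simply registers `getsetup` and `preprocess` hooks as described above.

    #!/usr/bin/perl
    # Illustrative skeleton only; "myplugin" is an invented name.
    package IkiWiki::Plugin::myplugin;

    use warnings;
    use strict;
    use IkiWiki '3.00';

    sub import {
        hook(type => "getsetup", id => "myplugin", call => \&getsetup);
        hook(type => "preprocess", id => "myplugin", call => \&preprocess);
    }

    sub getsetup () {
        return plugin => {
            description => "skeleton example plugin",
            safe => 1,
            rebuild => undef,
        };
    }

    sub preprocess (@) {
        my %params=@_;
        # Remember that ikiwiki is a compiler: this output is fixed at build time.
        return "built for ".$params{page};
    }

    1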
+
+## Exported variables
-### %config
+Several variables are exported to your plugin when you `use IkiWiki;`
+
+### `%config`
A plugin can access the wiki's configuration via the `%config` hash. The best way to understand the contents of the hash is to look at your ikiwiki setup file, which sets the hash content to configure the wiki.
-### %pagestate
+### `%pagestate`
The `%pagestate` hash can be used by plugins to save state that they will need next time ikiwiki is run. The hash holds per-page state, so to set a value,
@@ -535,7 +635,7 @@ When pages are deleted, ikiwiki automatically deletes their pagestate too.
Note that page state does not persist across wiki rebuilds, only across wiki updates.
-### %wikistate
+### `%wikistate`
The `%wikistate` hash can be used by a plugin to store persistent state that is not bound to any one page. To set a value, use
@@ -544,23 +644,53 @@ serialize, `$key` is any string you like, and `$id` must be the same as the "id" parameter passed to `hook()` when registering the plugin, so that the state can be dropped if the plugin is no longer used.
-### Other variables
+
+### `%links`
+
+The `%links` hash can be used to look up the names of each page that
+a page links to. The name of the page is the key; the value is an array
+reference. Do not modify this hash directly; call `add_link()`.
+
+    $links{"foo"} = ["bar", "baz"];
+
+### `%typedlinks`
-If your plugin needs to access data about other pages in the wiki. It can
-use the following hashes, using a page name as the key:
+The `%typedlinks` hash records links of specific types. Do not modify this
+hash directly; call `add_link()`. The keys are page names, and the values
+are hash references. In each page's hash reference, the keys are link types
+defined by plugins, and the values are hash references with link targets
+as keys, and 1 as a dummy value, something like this:
-* `%links` lists the names of each page that a page links to, in an array
- reference.
-* `%destsources` contains the name of the source file used to create each
- destination file.
-* `%pagesources` contains the name of the source file for each page.
+    $typedlinks{"foo"} = {
+        tag => { short_word => 1, metasyntactic_variable => 1 },
+        next_page => { bar => 1 },
+    };
-Also, the `%IkiWiki::version` variable contains the version number for the
-ikiwiki program.
+Ordinary [[WikiLinks|ikiwiki/WikiLink]] appear in `%links`, but not in
+`%typedlinks`.
-### Library functions
+### `%pagesources`
-#### `hook(@)`
+The `%pagesources` hash can be used to look up the source filename
+of a page. So the key is the page name, and the value is the source
+filename. Do not modify this hash.
+
+    $pagesources{"foo"} = "foo.mdwn";
+
+### `%destsources`
+
+The `%destsources` hash records the name of the source file used to
+create each destination file. The key is the output filename (ie,
+"foo/index.html"), and the value is the source filename that it was built
+from (eg, "foo.mdwn"). Note that a single source file may create multiple
+destination files. Do not modify this hash directly; call `will_render()`.
+
+    $destsources{"foo/index.html"} = "foo.mdwn";
+
+## Library functions
+
+Several functions are exported to your plugin when you `use IkiWiki;`
+
+### `hook(@)`
Hook into ikiwiki's processing. See the discussion of hooks above.
@@ -569,12 +699,12 @@ named `no_override` is supported. If it's set to a true value, then this hook will not override any existing hook with the same id. This is useful if the id can be controlled by the user.
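Tying the exported variables above to a hook, here is a small illustrative sketch of a `scan` hook that stashes a computed value in `%pagestate`. The plugin id `myplugin` and the `mentions` key are invented for the example, and it assumes the hook receives `page` and `content` as named parameters:

    sub scan (@) {
        my %params=@_;
        # Count occurrences of a word at scan time and record the count in
        # per-page state, keyed by this plugin's id.
        my $count = () = $params{content} =~ /\bikiwiki\b/gi;
        $pagestate{$params{page}}{myplugin}{mentions}=$count;
    }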
-#### `debug($)`
+### `debug($)`
Logs a debugging message. These are suppressed unless verbose mode is turned on.
-#### `error($;$)`
+### `error($;$)`
Aborts with an error message. If the second parameter is passed, it is a function that is called after the error message is printed, to do any final
@@ -588,13 +718,32 @@ In other hooks, error() is a fatal error, so use with care. Try to avoid dying on bad input when building a page, as that will halt the entire wiki build and make the wiki unusable.
-#### `template($;@)`
+### `template($;@)`
+
+Creates and returns a [[!cpan HTML::Template]] object. (In a list context,
+returns the parameters needed to construct the object.)
-Creates and returns a [[!cpan HTML::Template]] object. The first parameter
-is the name of the file in the template directory. The optional remaining
+The first parameter is the name of the template file. The optional remaining
parameters are passed to `HTML::Template->new`.
-#### `htmlpage($)`
+Normally, the template file is first looked for in the templates/ subdirectory
+of the srcdir. Failing that, it is looked for in the templatedir.
+
+Wiki pages can be used as templates. This should be done only for templates
+which it is safe to let wiki users edit. Enable it by passing a filename
+with no ".tmpl" extension. Template pages are normally looked for in
+the templates/ directory. If the page name starts with "/", a page
+elsewhere in the wiki can be used.
+
+### `template_depends($$;@)`
+
+Use this instead of `template()` if the content of a template is being
+included into a page. This causes the page to depend on the template,
+so it will be updated if the template is modified.
+
+Like `template()`, except the second parameter is the page.
+
+### `htmlpage($)`
Passed a page name, returns the base name that will be used for the html page created from it. (Ie, it appends ".html".)
@@ -602,34 +751,79 @@ page created from it. (Ie, it appends ".html".)
Use this when constructing the filename of a html file. Use `urlto` when generating a link to a page.
-#### `add_depends($$)`
+### `pagespec_match_list($$;@)`
+
+Passed a page name, and [[ikiwiki/PageSpec]], returns a list of pages
+in the wiki that match the [[ikiwiki/PageSpec]].
+
+The page will automatically be made to depend on the specified
+[[ikiwiki/PageSpec]], so `add_depends` does not need to be called. This
+is often significantly more efficient than calling `add_depends` and
+`pagespec_match` in a loop. You should use this anytime a plugin
+needs to match a set of pages and do something based on that list.
+
+Unlike pagespec_match, this may throw an error if there is an error in
+the pagespec.
+
+Additional named parameters can be specified:
+
+* `deptype` optionally specifies the type of dependency to add. Use the
+  `deptype` function to generate a dependency type.
+* `filter` is a reference to a function that is called and passed a page,
+  and returns true if the page should be filtered out of the list.
+* `sort` specifies a sort order for the list. See
+  [[ikiwiki/PageSpec/sorting]] for the available sort methods. Note that
+  if a sort method is specified that depends on the
+  page content (such as 'meta(foo)'), the deptype needs to be set to
+  a content dependency.
+* `reverse` if true, sorts in reverse.
+* `num` if nonzero, specifies the maximum number of matching pages that
+  will be returned.
+* `list` makes it only match among the specified list of pages.
+  Default is to match among all pages in the wiki.
+ +Any other named parameters are passed on to `pagespec_match`, to further +limit the match. + +### `add_depends($$;$)` Makes the specified page depend on the specified [[ikiwiki/PageSpec]]. -#### `pagespec_match($$;@)` +By default, dependencies are full content dependencies, meaning that the +page will be updated whenever anything matching the PageSpec is modified. +This can be overridden by passing a `deptype` value as the third parameter. -Passed a page name, and [[ikiwiki/PageSpec]], returns true if the +### `pagespec_match($$;@)` + +Passed a page name, and [[ikiwiki/PageSpec]], returns a true value if the [[ikiwiki/PageSpec]] matches the page. +Note that the return value is overloaded. If stringified, it will be a +message indicating why the PageSpec succeeded, or failed, to match the +page. + Additional named parameters can be passed, to further limit the match. The most often used is "location", which specifies the location the PageSpec should match against. If not passed, relative PageSpecs will match relative to the top of the wiki. -#### `pagespec_match_list($$;@)` +### `deptype(@)` -Passed a reference to a list of page names, and [[ikiwiki/PageSpec]], -returns the set of pages that match the [[ikiwiki/PageSpec]]. +Use this function to generate ikiwiki's internal representation of a +dependency type from one or more of these keywords: -Additional named parameters can be passed, to further limit the match. -The most often used is "location", which specifies the location the -PageSpec should match against. If not passed, relative PageSpecs will match -relative to the top of the wiki. +* `content` is the default. Any change to the content + of a page triggers the dependency. +* `presence` is only triggered by a change to the presence + of a page. +* `links` is only triggered by a change to the links of a page. + This includes when a link is added, removed, or changes what + it points to due to other changes. It does not include the + addition or removal of a duplicate link. -Unlike pagespec_match, this may throw an error if there is an error in -the pagespec. +If multiple types are specified, they are combined. -#### `bestlink($$)` +### `bestlink($$)` Given a page and the text of a link on the page, determine which existing page that link best points to. Prefers pages under a @@ -637,7 +831,7 @@ subdirectory with the same name as the source page, failing that goes down the directory tree to the base looking for matching pages, as described in [[ikiwiki/SubPage/LinkingRules]]. -#### `htmllink($$$;@)` +### `htmllink($$$;@)` Many plugins need to generate html links and add them to a page. This is done by using the `htmllink` function. The usual way to call @@ -663,8 +857,9 @@ control some options. These are: * anchor - set to make the link include an anchor * rel - set to add a rel attribute to the link * class - set to add a css class to the link +* title - set to add a title attribute to the link -#### `readfile($;$)` +### `readfile($;$)` Given a filename, reads and returns the entire file. @@ -673,7 +868,7 @@ in binary mode. A failure to read the file will result in it dying with an error. -#### `writefile($$$;$$)` +### `writefile($$$;$$)` Given a filename, a directory to put it in, and the file's content, writes a file. @@ -701,7 +896,7 @@ generally the directory parameter is a trusted toplevel directory like the srcdir or destdir, and any subdirectories of this are included in the filename parameter. 
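As a small usage sketch combining this with `will_render` (described next): a plugin that emits an extra generated file alongside a page might do something like the following, where the `.ics` extension and the `$data` variable are invented purely for illustration:

    # Register the generated file so ikiwiki can track and later clean it up,
    # then write it under the configured destdir.
    will_render($page, $page.".ics");
    writefile($page.".ics", $config{destdir}, $data);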
-#### `will_render($$)` +### `will_render($$)` Given a page name and a destination file name (not including the base destination directory), register that the page will result in that file @@ -717,34 +912,34 @@ Ikiwiki uses this information to automatically clean up rendered files when the page that rendered them goes away or is changed to no longer render them. will_render also does a few important security checks. -#### `pagetype($)` +### `pagetype($)` Given the name of a source file, returns the type of page it is, if it's a type that ikiwiki knowns how to htmlize. Otherwise, returns undef. -#### `pagename($)` +### `pagename($)` Given the name of a source file, returns the name of the wiki page that corresponds to that file. -#### `pagetitle($)` +### `pagetitle($)` Give the name of a wiki page, returns a version suitable to be displayed as the page's title. This is accomplished by de-escaping escaped characters in the page name. "_" is replaced with a space, and '__NN__' is replaced by the UTF character with code NN. -#### `titlepage($)` +### `titlepage($)` This performs the inverse of `pagetitle`, ie, it converts a page title into a wiki page name. -#### `linkpage($)` +### `linkpage($)` This converts text that could have been entered by the user as a [[ikiwiki/WikiLink]] into a wiki page name. -#### `srcfile($;$)` +### `srcfile($;$)` Given the name of a source file in the wiki, searches for the file in the source directory and the underlay directories (most recently added @@ -754,7 +949,7 @@ Normally srcfile will fail with an error message if the source file cannot be found. The second parameter can be set to a true value to make it return undef instead. -#### `add_underlay($)` +### `add_underlay($)` Adds a directory to the set of underlay directories that ikiwiki will search for files. @@ -762,18 +957,25 @@ search for files. If the directory name is not absolute, ikiwiki will assume it is in the parent directory of the configured underlaydir. -#### `displaytime($;$)` +### `displaytime($;$$)` Given a time, formats it for display. The optional second parameter is a strftime format to use to format the time. -#### `gettext` +If the third parameter is true, this is the publication time of a page. +(Ie, set the html5 pubdate attribute.) + +### `gettext` This is the standard gettext function, although slightly optimised. -#### `urlto($$;$)` +### `ngettext` + +This is the standard ngettext function, although slightly optimised. + +### `urlto($$;$)` Construct a relative url to the first parameter from the page named by the second. The first parameter can be either a page name, or some other @@ -782,13 +984,13 @@ destination file, as registered by `will_render`. If the third parameter is passed and is true, an absolute url will be constructed instead of the default relative url. -#### `newpagefile($$)` +### `newpagefile($$)` This can be called when creating a new page, to determine what filename to save the page to. It's passed a page name, and its type, and returns the name of the file to create, relative to the srcdir. -#### `targetpage($$;$)` +### `targetpage($$;$)` Passed a page and an extension, returns the filename that page will be rendered to. @@ -797,11 +999,31 @@ Optionally, a third parameter can be passed, to specify the preferred filename of the page. For example, `targetpage("foo", "rss", "feed")` will yield something like `foo/feed.rss`. -#### `add_link($$)` +### `add_link($$;$)` This adds a link to `%links`, ensuring that duplicate links are not added. 
Pass it the page that contains the link, and the link text. +An optional third parameter sets the link type. If not specified, +it is an ordinary [[ikiwiki/WikiLink]]. + +### `add_autofile($$$)` + +Sometimes you may want to add a file to the `srcdir` as a result of content +of other pages. For example, [[plugins/tag]] pages can be automatically +created as needed. This function can be used to do that. + +The three parameters are the filename to create (relative to the `srcdir`), +the name of the plugin, and a callback function. The callback will be +called if it is appropriate to automatically add the file, and should then +take care of creating it, and doing anything else it needs to (such as +checking it into revision control). Note that the callback may not always +be called. For example, if an automatically added file is deleted by the +user, ikiwiki will avoid re-adding it again. + +This function needs to be called during the scan hook, or earlier in the +build process, in order to add the file early enough for it to be built. + ## Miscellaneous ### Internal use pages @@ -839,16 +1061,20 @@ token, that will be passed into `rcs_commit` when committing. For example, it might return the current revision ID of the file, and use that information later when merging changes. -#### `rcs_commit($$$;$$)` +#### `rcs_commit(@)` + +Passed named parameters: `file`, `message`, `token` (from `rcs_prepedit`), +and `session` (optional). -Passed a file, message, token (from `rcs_prepedit`), user, and ip address. Should try to commit the file. Returns `undef` on *success* and a version of the page with the rcs's conflict markers on failure. -#### `rcs_commit_staged($$$)` +#### `rcs_commit_staged(@)` + +Passed named parameters: `message`, and `session` (optional). -Passed a message, user, and ip address. Should commit all staged changes. -Returns undef on success, and an error message on failure. +Should commit all staged changes. Returns undef on success, and an +error message on failure. Changes can be staged by calls to `rcs_add`, `rcs_remove`, and `rcs_rename`. @@ -891,7 +1117,9 @@ The data structure returned for each change is: { rev => # the RCSs id for this commit - user => # name of user who made the change, + user => # user who made the change (may be an openid), + nickname => # short name for user (optional; not an openid), + committype => # either "web" or the name of the rcs, when => # time when the change was made, message => [ @@ -921,6 +1149,17 @@ it up in the history. It's ok if this is not implemented, and throws an error. +If the RCS cannot determine a ctime for the file, return 0. + +#### `rcs_getmtime($)` + +This is used to get the page modification time for a file from the RCS, by +looking it up in the history. + +It's ok if this is not implemented, and throws an error. + +If the RCS cannot determine a mtime for the file, return 0. + #### `rcs_receive()` This is called when ikiwiki is running as a pre-receive hook (or @@ -959,6 +1198,33 @@ an IkiWiki::FailReason object if the match fails. If the match cannot be attempted at all, for any page, it can instead return an IkiWiki::ErrorReason object explaining why. +When constructing these objects, you should also include information about +of any pages whose contents or other metadata influenced the result of the +match. Do this by passing a list of pages, followed by `deptype` values. 
+ +For example, "backlink(foo)" is influenced by the contents of page foo; +"link(foo)" and "title(bar)" are influenced by the contents of any page +they match; "created_before(foo)" is influenced by the metadata of foo; +while "glob(*)" is not influenced by the contents of any page. + +### Sorting plugins + +Similarly, it's possible to write plugins that add new functions as +[[ikiwiki/pagespec/sorting]] methods. To achieve this, add a function to +the IkiWiki::SortSpec package named `cmp_foo`, which will be used when sorting +by `foo` or `foo(...)` is requested. + +The names of pages to be compared are in the global variables `$a` and `$b` +in the IkiWiki::SortSpec package. The function should return the same thing +as Perl's `cmp` and `<=>` operators: negative if `$a` is less than `$b`, +positive if `$a` is greater, or zero if they are considered equal. It may +also raise an error using `error`, for instance if it needs a parameter but +one isn't provided. + +The function will also be passed one or more parameters. The first is +`undef` if invoked as `foo`, or the parameter `"bar"` if invoked as `foo(bar)`; +it may also be passed additional, named parameters. + ### Setup plugins The ikiwiki setup file is loaded using a pluggable mechanism. If you look diff --git a/doc/post-commit/discussion.mdwn b/doc/post-commit/discussion.mdwn index 6ae0d9bcb..fc0a27ee4 100644 --- a/doc/post-commit/discussion.mdwn +++ b/doc/post-commit/discussion.mdwn @@ -91,7 +91,7 @@ Can you offer an educated guess what's going wrong here? --[[Schmonz]] >>> Aaaaaand I was wrong about the second half of the conjecture being >>> wrong. The wrapper script wasn't correctly identifying directories; >>> with that fixed, everything works. I've created a ->>> [[plugins/contrib/cvs]] plugin page. Thanks for listening. :-) +>>> [[rcs/cvs]] page. Thanks for listening. :-) >>> --[[Schmonz]] >> Here is a comment I committed to my laptop from Madrid Airport before @@ -116,3 +116,8 @@ Can you offer an educated guess what's going wrong here? --[[Schmonz]] >> process, so you could just use a temporary list of things to add. >> --[[Joey]] +>>> Thanks for the comments. Attempting to set up a wiki on a different system with a different version of `cvs`, I've encountered a new locking problem within CVS: `cvs commit` takes a write lock, post-commit ikiwiki calls `rcs_update()`, `cvs update` wants a read lock and blocks. The easiest fix I can think of is to make `cvs commit` return and relinquish its lock -- so instead of my wrapper script `exec`ing ikiwiki's post-commit hook, I amp it off and exit 0. Seems to do the trick and, if I grok ikiwiki's behavior here, is not dangerous. (Beats me why my development `cvs` doesn't behave the same WRT locking.) + +>>> I was all set to take your third suggestion, but now that there's more than one CVS oddity fixed trivially in a wrapper script, I think I prefer doing it that way. + +>>> I'd be glad for the CVS plugin to be included in ikiwiki, if and when you deem it ready. Please let me know what needs to be done for that to happen. --[[Schmonz]] diff --git a/doc/quotes.mdwn b/doc/quotes.mdwn new file mode 100644 index 000000000..22f3a28d8 --- /dev/null +++ b/doc/quotes.mdwn @@ -0,0 +1,3 @@ +Collecting some happy quotes about ikiwiki here. + +[[!inline pages="quotes/* and !*/Discussion"]] diff --git a/doc/quotes/pizza.mdwn b/doc/quotes/pizza.mdwn new file mode 100644 index 000000000..34899edfb --- /dev/null +++ b/doc/quotes/pizza.mdwn @@ -0,0 +1,4 @@ +> Best. Wiki. Ever. 
Now a wiki that I can't "git clone" and "git push" to is +> like a pizza that I have to eat with a knife and fork. + +-- [Don Marti](http://lwn.net/Articles/360888/) diff --git a/doc/quotes/pizza/discussion.mdwn b/doc/quotes/pizza/discussion.mdwn new file mode 100644 index 000000000..ecf8c44a6 --- /dev/null +++ b/doc/quotes/pizza/discussion.mdwn @@ -0,0 +1 @@ +It would be cool to know where this was written (assuming it was somewhere public :-)) Googling around, I've found a few places Don has recommended ikiwiki, but not in such glowing terms. -- [[Jon]] diff --git a/doc/quotes/sold.mdwn b/doc/quotes/sold.mdwn new file mode 100644 index 000000000..4bd021f74 --- /dev/null +++ b/doc/quotes/sold.mdwn @@ -0,0 +1,3 @@ +I'm totally sold on ikiwiki now. + +-- Anna Hess diff --git a/doc/rcs.mdwn b/doc/rcs.mdwn index f66b85495..83c6910d0 100644 --- a/doc/rcs.mdwn +++ b/doc/rcs.mdwn @@ -6,11 +6,37 @@ histories. Ikiwiki started out supporting only [[Subversion|svn]], but the interface ikiwiki uses to a revision control system is sufficiently simple and generic that it can be adapted to work with many systems by writing a -[[plugin|plugins/write]]. [[Subversion|svn]] is still a recommended choice; -[[git]] is another well-tested option. +[[plugin|plugins/write]]. These days, most people use [[git]]. -These are all the supported revision control systems: -[[!inline pages="rcs/* and !*/Discussion and !rcs/details" archive=yes]] +While all supported revision control systems work well enough for basic +use, some advanced or special features are not supported in all of them. +The table below summarises this for each revision control system and +links to more information about each. + +[[!table data=""" +feature |[[git]]|[[svn]]|[[bzr]] |[[monotone]]|[[mercurial]]|[[darcs]]|[[tla]] |[[cvs]] +[[ikiwiki-makerepo]]|yes |yes |yes |yes |yes |yes |no |yes +auto.setup |yes |yes |incomplete|yes |incomplete |yes |incomplete|yes +`rcs_commit_staged` |yes |yes |yes |yes |no |yes |no |yes +`rcs_rename` |yes |yes |yes |yes |no |yes |no |yes +`rcs_remove` |yes |yes |yes |yes |no |yes |no |yes +`rcs_diff` |yes |yes |yes |yes |no |yes |yes |yes +`rcs_getctime` |fast |slow |slow |slow |slow |slow |slow |slow +`rcs_getmtime` |fast |slow |slow |no |no |no |no |no +anonymous push |yes |no |no |no |no |no |no |no +conflict handling |yes |yes |yes |buggy |yes |yes |yes |yes +openid username |yes |no |no |no |no |no |no |no +"""]] + +Notes: + +* Lack of support in [[ikiwiki-makerepo]] or auto.setup can make it harder to + set up a wiki using that revision control system. +* The `rcs_commit_staged` hook is needed to use [[attachments|plugins/attachment]] + or [[plugins/comments]]. +* `rcs_getctime` and `rcs_getmtime` may be implemented in a fast way (ie, one log + lookup for all files), or very slowly (one lookup per file). +* Openid username support allows avoiding display of Google's ugly openids. There is a page with [[details]] about how the different systems work with ikiwiki, for the curious. diff --git a/doc/rcs/cvs.mdwn b/doc/rcs/cvs.mdwn new file mode 100644 index 000000000..9beb08ece --- /dev/null +++ b/doc/rcs/cvs.mdwn @@ -0,0 +1,28 @@ +If you really need to, you can use [[!wikipedia desc="CVS" Concurrent Versions System]] +with ikiwiki. + +### Usage +7. Install [[!cpan File::chdir]], [[!cpan File::ReadBackwards]], +[cvsps](http://www.cobite.com/cvsps/), and +[cvsweb](http://www.freebsd.org/projects/cvsweb.html) or the like. +7. Adjust CVS-related parameters in your setup file. 
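The CVS-related parameters referred to in the last step might look roughly like the sketch below. `rcs => 'cvs'` selects the plugin; the repository option names shown are assumptions for illustration only, so check the plugin's setup documentation for the exact names:

    rcs => 'cvs',
    # the two option names below are illustrative assumptions
    cvsrepo => "/cvs/wikirepo",
    cvspath => "ikiwiki",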
+ +Consider creating `$HOME/.cvsrc` if you don't have one already; the plugin doesn't need it, but you yourself might. Here's a good general-purpose one: + + cvs -q + checkout -P + update -dP + diff -u + rdiff -u + +### Implementation details +* [[ikiwiki-makerepo]]: + * creates a repository, + * imports `$SRCDIR` into top-level module `ikiwiki` (vendor tag IKIWIKI, release tag PRE_CVS), + * configures the post-commit hook in `CVSROOT/loginfo`. + +### To do +* Have `ikiwiki-makerepo` set up NetBSD-like `log_accum` and `commit_prep` scripts that coalesce commits into changesets. Reasons: + 7. Obviates the need to scrape the repo's complete history to determine the last N changesets. (Repositories without such records can fall back on the `cvsps` and `File::ReadBackwards` code.) + 7. Arranges for ikiwiki to be run once per changeset, rather than CVS's once per committed file (!), which is a waste at best and bug-inducing at worst. (Currently, on multi-directory commits, only the first directory's changes get mentioned in [[recentchanges|plugins/recentchanges]].) +* Perhaps prevent web edits from attempting to create `.../CVS/foo.mdwn` (and `.../cvs/foo.mdwn` on case-insensitive filesystems); thanks to the CVS metadata directory, the attempt will fail anyway (and much more confusingly) if we don't. diff --git a/doc/rcs/cvs/discussion.mdwn b/doc/rcs/cvs/discussion.mdwn new file mode 100644 index 000000000..645b2388b --- /dev/null +++ b/doc/rcs/cvs/discussion.mdwn @@ -0,0 +1,149 @@ +I've started reviewing this, and the main thing I don't like is the +post-commit wrapper wrapper that ikiwiki-makerepo is patched to set up. +That just seems unnecessarily complicated. Why can't ikiwiki itself detect +the "cvs add <directory>" call and avoid doing anything in that case? +--[[Joey]] + +> The wrapper wrapper does three things: +> +> 7. It ignores `cvs add <directory>`, since this is a weird CVS +> behavior that ikiwiki gets confused by and doesn't need to act on. +> 7. It prevents `cvs` locking against itself: `cvs commit` takes a +> write lock and runs the post-commit hook, which runs `cvs update`, +> which wants a read lock and sleeps forever -- unless the post-commit +> hook runs in the background so the commit can "finish". +> 7. It fails silently if the ikiwiki post-commit hook is missing. +> CVS doesn't have any magic post-commit filenames, so hooks have to +> be configured explicitly. I don't think a commit will actually fail +> if a configured post-commit hook is missing (though I can't test +> this at the moment). +> +> Thing 1 can probably be handled within ikiwiki, if that seems less +> gross to you. + +>> It seems like it might be. You can use a `getopt` hook to check +>> `@ARGV` to see how it was called. --[[Joey]] + +>>> This does the trick iff the post-commit wrapper passes its args +>>> along. Committed on my branch. This seems potentially dangerous, +>>> since the args passed to ikiwiki are influenced by web commits. +>>> I don't see an exploit, but for paranoia's sake, maybe the wrapper +>>> should only be built with execv() if the cvs plugin is loaded? +>>> --[[schmonz]] + +>>>> Hadn't considered that. While in wrapper mode the normal getopt is not +>>>> done, plugin getopt still runs, and so any unsafe options that +>>>> other plugins support could be a problem if another user runs +>>>> the setuid wrapper and passes those options through. 
--[[Joey]] + +>>>>> I've tried compiling the argument check into the wrapper as +>>>>> the first thing main() does, and was surprised to find that +>>>>> this doesn't prevent the `cvs add <dir>` deadlock in a web +>>>>> commit. I was convinced this'd be a reasonable solution, +>>>>> especially if conditionalized on the cvs plugin being loaded, +>>>>> but it doesn't work. And I stuck debug printfs at the beginning +>>>>> of all the rcs_foo() subs, and whatever `cvs add <dir>` is +>>>>> doing to ikiwiki isn't visible to my plugin, because none of +>>>>> those subs are getting called. Nuts. Can you think of anything +>>>>> else that might solve the problem, or should I go back to +>>>>> generating a minimal wrapper wrapper that checks for just +>>>>> this one thing? --[[schmonz]] + +>>>>>> I don't see how there could possibly be a difference between +>>>>>> ikiwiki's C wrapper and your shell wrapper wrapper here. --[[Joey]] + +>>>>>>> I was comparing strings overly precisely. Fixed on my branch. +>>>>>>> I've also knocked off the two most pressing to-do items. I +>>>>>>> think the plugin's ready for prime time. --[[schmonz]] + +> Thing 2 I'm less sure of. (I'd like to see the web UI return +> immediately on save anyway, to a temporary "rebuilding, please wait +> if you feel like knowing when it's done" page, but this problem +> with CVS happens with any kind of commit, and could conceivably +> happen with some other VCS.) + +>> None of the other VCSes let a write lock block a read lock, apparently. +>> +>> Anyway, re the backgrounding, when committing via the web, the +>> post-commit hook doesn't run anyway; the rendering is done via the +>> ikiwiki CGI. It would certianly be nice if it popped up a quick "working" +>> page and replaced it with the updated page when done, but that's +>> unrelated; the post-commit +>> hook only does rendering when committing using the VCS directly. The +>> backgrounding you do actually seems safe enough -- but tacking +>> on a " &" to the ikiwiki wrapper call doesn't need a wrapper script, +>> does it? --[[Joey]] + +>>> Nope, it works fine to append it to the `CVSROOT/loginfo` line. +>>> Fixed on my branch. --[[schmonz]] + +> Thing 3 I think I did in order to squelch the error messages that +> were bollixing up the CGI. It was easy to do this in the wrapper +> wrapper, but if that's going away, it can be done just as easily +> with output redirection in `CVSROOT/loginfo`. +> +> --[[schmonz]] + +>> If the error messages screw up the CGI they must go to stdout. +>> I thought we had stderr even in the the CVS dark ages. ;-) --[[Joey]] + +>>> Some messages go to stderr, but definitely not all. That's why +>>> I wound up reaching for IPC::Cmd, to execute the command line +>>> safely while shutting CVS up. Anyway, I've tested what happens +>>> if a configured post-commit hook is missing, and it seems fine, +>>> probably also thanks to IPC::Cmd. +>>> --[[schmonz]] + +---- + + +Further review.. --[[Joey]] + +I don't understand what `cvs_shquote_commit` is +trying to do with the test message, but it seems +highly likely to be insecure; I never trust anything +that relies on safely quoting user input passed to the shell. + +(As an aside, `shell_quote` can die on certian inputs.) + +Seems to me that, if `IPC::Cmd` exposes input to the shell +(which I have not verified but its docs don't specify; a bad sign) +you chose the wrong tool and ended up doing down the wrong +route, dragging in shell quoting problems and fixes. 
Since you +chose to use `IPC::Cmd` just because you wanted to shut +up CVS stderr, my suggestion would be to use plain `system` +to run the command, with stderr temporarily sent to /dev/null: + + open(my $savederr, ">&STDERR"); + open(STDERR, ">", "/dev/null"); + my $ret=system("cvs", "-Q", @_); + open(STDERR, ">$savederr"); + +`cvs_runcvs` should not take an array reference. It's +usual for this type of function to take a list of parameters +to pass to the command. + +> Thanks for reading carefully. I've tested your suggestions and +> applied them on my branch. --[[schmonz]] + +---- + +I've abstracted out CVS's involvement in the wrapper, adding a new +"wrapperargcheck" hook to examine `argc/argv` and return success or +failure (failure causes the wrapper to terminate) and implementing +this hook in the plugin. In the non-CVS case, the check immediately +returns success, so the added overhead is just a function call. + +Given how rarely anything should need to reach in and modify the +wrapper -- I'd go so far as to say we shouldn't make it too easy +-- I don't think it's worth the effort to try and design a more +general-purpose way to do so. If and when some other problem thinks +it wants to be solved by a new wrapper hook, it's easy enough to add +one. Until then, I'd say it's more important to keep the wrapper as +short and clear as possible. --[[schmonz]] + +> I've committed a slightly different hook, which should be general enough +> that `IkiWiki::Receive` can also use it, so please adapt your code to +> that. --[[Joey]] + +>> Done. --[[schmonz]]. diff --git a/doc/rcs/details.mdwn b/doc/rcs/details.mdwn index 6492cf38c..013ddb745 100644 --- a/doc/rcs/details.mdwn +++ b/doc/rcs/details.mdwn @@ -288,3 +288,5 @@ user for cleanup. This is less neat than it could be, in that a conflict marked revision gets committed to the repository. ## [[bzr]] + +## [[cvs]] diff --git a/doc/rcs/git.mdwn b/doc/rcs/git.mdwn index 000eb0b3c..c627792d7 100644 --- a/doc/rcs/git.mdwn +++ b/doc/rcs/git.mdwn @@ -28,12 +28,7 @@ updates the published wiki itself. The other (optional) leaf node repositories are meant for you to work on, and commit to, changes should then be pushed to the bare root -repository. In theory, you could work on the same leaf node repository -that ikiwiki uses to compile the wiki from, and the [[cgi]] commits -to, as long as you ensure that permissions and ownership don't hinder -the working of the [[cgi]]. This can be done, for example, by using -ACL's, in practice, it is easier to just setup separate clones for -yourself. +repository. So, to reiterate, when using Git, you probably want to set up three repositories: @@ -41,9 +36,9 @@ repositories: * The root repository. This should be a bare repository (meaning that it does not have a working tree checked out), which the other repositories will push to/pull from. It is a bare repository, since - there are problems pushing to a repository that has a working + git does not support pushing to a repository that has a working directory. This is called _repository_ in [[ikiwiki-makerepo]]'s - manual page. Nominally, this bare repository has a `post-update` hook + manual page. This bare repository has a `post-update` hook that either is or calls ikiwiki's git wrapper, which changes to the working directory for ikiwiki, does a `git pull`, and refreshes ikiwiki to regenerate the wiki with any new content. 
The [[setup]] page describes @@ -51,10 +46,11 @@ repositories: * The second repository is a clone of the bare root repository, and has a working tree which is used as ikiwiki's srcdir for compiling - the wiki. **Never** push to this repository. When running as a - [[cgi]], the changes are committed to this repository, and pushed to - the master repository above. This is called _srcdir_ in - [[ikiwiki-makerepo]]'s manual page. + the wiki. **Never** push to this repository. It is wise to not make + changes or commits directly to this repository, to avoid conflicting + with ikiwiki's own changes. When running as a [[cgi]], the changes + are committed to this repository, and pushed to the master repository + above. This is called _srcdir_ in [[ikiwiki-makerepo]]'s manual page. * The other (third, fourth, fifth, sixth -- however many pleases you) repositories are also clones of the bare root repository above -- @@ -87,8 +83,8 @@ It can be tricky to get the permissions right to allow multiple people to commit to an ikiwiki git repository. As the [[security]] page mentions, for a secure ikiwiki installation, only one person should be able to write to ikiwiki's srcdir. When other committers make commits, their commits -should go to the bare repository, which has a `post-update` hook that uses -ikiwiki to pull the changes to the srcdir. +should be pushed to the bare repository, which has a `post-update` hook +that uses ikiwiki to pull the changes to the srcdir. One setup that will work is to put all committers in a group (say, "ikiwiki"), and use permissions to allow that group to commit to the bare git diff --git a/doc/rcs/svn.mdwn b/doc/rcs/svn.mdwn index f8c44b6eb..7aa682978 100644 --- a/doc/rcs/svn.mdwn +++ b/doc/rcs/svn.mdwn @@ -1,4 +1,4 @@ -[Subversion](http://subversion.tigris.org/) is a revision control system. While ikiwiki is relatively +[Subversion](http://subversion.tigris.org/) is a [[revision control system|rcs]]. While ikiwiki is relatively independent of the underlying revision control system, and can easily be used without one, using it with Subversion or another revision control system is recommended. diff --git a/doc/rcs/tla.mdwn b/doc/rcs/tla.mdwn index cad5d51f4..79eecd627 100644 --- a/doc/rcs/tla.mdwn +++ b/doc/rcs/tla.mdwn @@ -2,6 +2,9 @@ [GNU](http://www.gnu.org/) [Arch](http://www.gnuarch.org/) revision control system. Ikiwiki supports storing a wiki in tla. +Warning: Since tla is not being maintained, neither is this plugin, and +using ikiwiki with tla is not recommended. + Ikiwiki can run as a [[post-commit]] hook to update a wiki whenever commits come in. When running as a [[cgi]] with tla, ikiwiki automatically commits edited pages to the Arch repostory, and uses the Arch diff --git a/doc/roadmap.mdwn b/doc/roadmap.mdwn index 2f79f7978..134ebcb7b 100644 --- a/doc/roadmap.mdwn +++ b/doc/roadmap.mdwn @@ -7,7 +7,7 @@ This is the roadmap for ikiwiki development. Released 29 April 2006. -The 1.x series is no longer supported. +The 1.X series is no longer supported. ---- @@ -69,6 +69,26 @@ backwards compatability. ---- +# compatability breaking changes + +Probably incomplete list: + +* Drop old `--getctime` option. +* Remove compatability code in `loadindex` to handle old index data layouts. +* Make pagespecs match relative by default? (see [[discussion]]) +* Flip wikilinks? (see [[todo/link_plugin_perhaps_too_general?]]) +* YADA format setup files per default? +* Enable tagbase by default (so that tag autocreation will work by default). 
+ Note that this is already done for wikis created by `auto-blog.setup`. +* [[tips/html5]] on by default (some day..) +* Remove support for old `.ikiwiki/comments_pending` from comment plugin. +* Use yaml formatted setup files by default. (Not too compatability breaking + really.) + +In general, we try to use [[ikiwiki-transition]] or forced rebuilds on +upgrade to deal with changes that break compatability. Some things that +can't help with. + # future goals * Conversion support for existing other wikis. diff --git a/doc/roadmap/discussion.mdwn b/doc/roadmap/discussion.mdwn index 0b69867bf..8233b1990 100644 --- a/doc/roadmap/discussion.mdwn +++ b/doc/roadmap/discussion.mdwn @@ -3,6 +3,7 @@ backwards compatibility problems. Should this be marked as a future plan, perhap major version number like 2.0? --Ethan Yes, I'm looking at making this kind of change at 2.0, added to the list. +(Update: Didn't make it in 2.0 or 3.0...) However, I have doubts that it makes good sense to go relative by default. While it's not consitent with links, it seems to work better overall to have pagespecs be absolute by default, IMHO. --[[Joey]] diff --git a/doc/sandbox.mdwn b/doc/sandbox.mdwn index 9b0a53c68..365cb3321 100644 --- a/doc/sandbox.mdwn +++ b/doc/sandbox.mdwn @@ -1,48 +1,10 @@ This is the [[SandBox]], a page anyone can edit to try out ikiwiki (version [[!version ]]). ----- -2009/7/13 Monday Blue...\\ -While working on [[http://eoffice.im.fju.edu.tw/phpbb/viewtopic.php?p=27199|[97] Phpbb 與 Dokuwiki 之間的往來]] (External Link to //http://eoffice.im.fju.edu.tw/phpbb/viewtopic.php?p=27199// not working?), surf Internet for Wikis supporting //Creole V1.0//, found this site. - -<<<<<<< HEAD:doc/sandbox.mdwn -* I thought that creole markup is used by ikiwiki... -* Thought about using //SVN/CVS// server, however with different idea -* Kinda curious why **Tcl** does not show her face in this Wiki arena -======= -* Thought about using //SVN/CVS// server, however with different idea -* Kinda curious why **Tcl** does not show her face in this Wiki arena -* It looks like that I was wrong, from Wikipedia Creole page it mention that ikiwiki is also adopting Creole markup. ->>>>>>> 55c1a37cce91d4fd01bf80fcaeeaf56535218a5c:doc/sandbox.mdwn - - ---- -I try to have a discussion page !!! ----- - -Testing OpenID some more.. - -test more test -[[中文显示]] - -Here's a paragraph. - -The following code block is pre-formatted: - - Testing what - pre-formatted code blocks - look like. - -Here's another one with *emphasised* text. - -# Header - -## Subheader - > This is a blockquote. > > This is the first level of quoting. > -> > This is nested blockquote. +> > This is a nested blockquote. > >> Without a space works too. >>> to three levels @@ -53,14 +15,12 @@ Numbered list 1. First item. 1. Sub item. + 2. bar 1. Another. 1. And another.. 1. foo 2. bar - 3. baz - 4. qux - 5. quux - 6. wibble + 3. quz Bulleted list @@ -68,10 +28,9 @@ Bulleted list * *item* * item * one - * two - * three - * four - * five + * footballs; runner; unices + * Cool ! + ---- @@ -80,92 +39,20 @@ Bulleted list ---- -The haiku will change after every save, mind you. 
- ## Different sorts of links: * [[Features]] +* <http://ikiwiki.info/ikiwiki/formatting/> * [[different_name_for_a_WikiLink|ikiwiki/WikiLink]] * <http://www.gnu.org/> * [GNU](http://www.gnu.org/) -* [Email](mailto:noone@invalid) -* [![ikiwiki logo](http://ikiwiki.info/logo/ikiwiki.png)](http://ikiwiki.info) -* <a href="http://www.google.com/">plain old html link</a> -* [[foo]] - ------ - -This SandBox is also a [[blog]]! - -[[!inline pages="sandbox/* and !*/Discussion" rootpage="sandbox" show="4" archive="yes"]] - --------- - -This gives an example of inline code: `tar | netcat` is a nice way to transfer bulk files over the net - -But, of course, rsync is better. +* <test.html> +<google.de> +<http://www.perl.it> ---- -Let's see what happens... ~~ - -#中文标题一 -##中文标题二 -###中文标题三 -... -######中文标题六 - -###正文: +This **SandBox** is also a [[blog]]! +[[!inline pages="blog/* and !*/Discussion" show="10" rootpage="blog"]] -君不见,黄河之水天上来,奔流到海不复回。 - -君不见,高堂明镜悲白发,朝如青丝暮成雪。 - -人生得意须尽欢,莫使金樽空对月。 - -天生我材必有用,千金散尽还复来。 - -###列表: - -* 天空 - - 1. 蓝色的 - 2. 好高啊 - -* 海洋 - - 1. 有鱼 - - * 鲸鱼 - * 鲨鱼 - - -* 大地 - -###链接: - -[谷歌](http://www.google.com) - -###引用: - -> 一级引用 -> -> 一级引用 -> -> > 二级引用 -> ->> 没有空格也可以 ->>> 三级引用 -> -> 回到一级引用 - - ----- - - -testing - --- -[[!toc levels=2]] - -[[Mamma Mia]] +[[!inline pages="sandbox/* and !*/Discussion" rootpage="sandbox" show="4" archive="yes"]] diff --git a/doc/sandbox/Blagging_is_cool.mdwn b/doc/sandbox/Blagging_is_cool.mdwn deleted file mode 100644 index 16aafa304..000000000 --- a/doc/sandbox/Blagging_is_cool.mdwn +++ /dev/null @@ -1,6 +0,0 @@ -This is a testing Blag-Entry. -/me loves blagging. - - * Entry one - * entry two - diff --git a/doc/sandbox/Bleep.mdwn b/doc/sandbox/Bleep.mdwn deleted file mode 100644 index 316f7291c..000000000 --- a/doc/sandbox/Bleep.mdwn +++ /dev/null @@ -1,3 +0,0 @@ -Just a test page ... - -... to see what happens. diff --git a/doc/sandbox/I_hate_making_new_blog_entries_-_sometimes.wiki b/doc/sandbox/I_hate_making_new_blog_entries_-_sometimes.wiki deleted file mode 100644 index 54914bb0b..000000000 --- a/doc/sandbox/I_hate_making_new_blog_entries_-_sometimes.wiki +++ /dev/null @@ -1,25 +0,0 @@ -!!!Heading Three - -Itemized -* list -* of -* things - -or not -1. mom -1. hi -1. there - - - -!!!Heading Three - -Itemized -* list -* of -* things - -or not -1. mom -1. hi -1. there diff --git a/doc/sandbox/Omgwtf_a_blof_post__33____33____33____33____33__1__33__1__33__11111__33____33____33__1__33__1__33____33__1five.html b/doc/sandbox/Omgwtf_a_blof_post__33____33____33____33____33__1__33__1__33__11111__33____33____33__1__33__1__33____33__1five.html deleted file mode 100644 index fc1757d6e..000000000 --- a/doc/sandbox/Omgwtf_a_blof_post__33____33____33____33____33__1__33__1__33__11111__33____33____33__1__33__1__33____33__1five.html +++ /dev/null @@ -1,31 +0,0 @@ -<math xmlns="http://www.w3.org/1998/Math/MathML"> - <mrow> - <msup> - <mfenced open="(" close=")"> - <mrow> - <mi>a</mi> - <mo>+</mo> - - <mi>b</mi> - </mrow> - </mfenced> - <mn>2</mn> - </msup> - <mo>-</mo> - <msub> - - <mfenced open="{" close="}"> - <mrow> - <mi>x</mi> - <mo>+</mo> - <mi>y</mi> - </mrow> - </mfenced> - - <mi>i</mi> - </msub> - </mrow> -</math> -<br> -test <b>test</b><abbr title="test">T.</abbr> <h1>test</h1> -<a href="https://bugzilla.mozilla.org">øđ</a> diff --git a/doc/sandbox/Post.mdwn b/doc/sandbox/Post.mdwn new file mode 100644 index 000000000..7ae61669b --- /dev/null +++ b/doc/sandbox/Post.mdwn @@ -0,0 +1,5 @@ +Ikiwiki is a **wiki compiler**. 
It converts wiki pages into HTML pages +suitable for publishing on a website. Ikiwiki stores pages and history in a +[[revision_control_system|rcs]] such as [[Subversion|rcs/svn]] or [[rcs/Git]]. +There are many other [[features]], including support for +[[blogging|blog]], as well as a large array of [[plugins]]. diff --git a/doc/sandbox/Test:_with_a_colon_in_its_name.mdwn b/doc/sandbox/Test:_with_a_colon_in_its_name.mdwn deleted file mode 100644 index b91d7cb03..000000000 --- a/doc/sandbox/Test:_with_a_colon_in_its_name.mdwn +++ /dev/null @@ -1 +0,0 @@ -what happens? diff --git a/doc/sandbox/Teximg.mdwn b/doc/sandbox/Teximg.mdwn deleted file mode 100644 index 8483c1489..000000000 --- a/doc/sandbox/Teximg.mdwn +++ /dev/null @@ -1,2 +0,0 @@ -[[!teximg code="E = - \frac{Z^2 \cdot \mu \cdot e^4}{32\pi^2 \epsilon_0^2 \hbar^2 n^2}" ]] - diff --git a/doc/sandbox/Try_some_math_formulas.mdwn b/doc/sandbox/Try_some_math_formulas.mdwn deleted file mode 100644 index 6c0fe6de5..000000000 --- a/doc/sandbox/Try_some_math_formulas.mdwn +++ /dev/null @@ -1,6 +0,0 @@ -# Title with $\TeX$ - -* How about some math? -* $\frac{1}{2} = \frac{3}{6}$ - -and teximg? [[!teximg code="\frac{1}{2}"]] diff --git a/doc/sandbox/bar.mdwn b/doc/sandbox/bar.mdwn deleted file mode 100644 index 3163ac12e..000000000 --- a/doc/sandbox/bar.mdwn +++ /dev/null @@ -1 +0,0 @@ -foo? diff --git a/doc/sandbox/blech.mdwn b/doc/sandbox/blech.mdwn deleted file mode 100644 index 1ec2727fd..000000000 --- a/doc/sandbox/blech.mdwn +++ /dev/null @@ -1 +0,0 @@ -This just a test, but I am impressed. diff --git a/doc/sandbox/castle/discussion/jon_tests_too.mdwn b/doc/sandbox/castle/discussion/jon_tests_too.mdwn deleted file mode 100644 index bc051b008..000000000 --- a/doc/sandbox/castle/discussion/jon_tests_too.mdwn +++ /dev/null @@ -1,3 +0,0 @@ -I recall testing this too, but I'm not sure where the test went. Let's try again. -- [[users/Jon]] - -Context: [[todo/discussion_page_as_blog/discussion/castle]] diff --git a/doc/sandbox/castle/discussion/test_comment.mdwn b/doc/sandbox/castle/discussion/test_comment.mdwn deleted file mode 100644 index 4dd2cb0bf..000000000 --- a/doc/sandbox/castle/discussion/test_comment.mdwn +++ /dev/null @@ -1 +0,0 @@ -I like this idea of using the blog feature to handle comments. Let's see how it works. diff --git a/doc/sandbox/check_openID.mdwn b/doc/sandbox/check_openID.mdwn deleted file mode 100644 index 15363c985..000000000 --- a/doc/sandbox/check_openID.mdwn +++ /dev/null @@ -1 +0,0 @@ -wibble diff --git a/doc/sandbox/danc.mdwn b/doc/sandbox/danc.mdwn new file mode 100644 index 000000000..9766475a4 --- /dev/null +++ b/doc/sandbox/danc.mdwn @@ -0,0 +1 @@ +ok diff --git a/doc/sandbox/discussion.mdwn b/doc/sandbox/discussion.mdwn deleted file mode 100644 index d8c565617..000000000 --- a/doc/sandbox/discussion.mdwn +++ /dev/null @@ -1,6 +0,0 @@ -normally a sandbow doesn't have a discussion page -but it's a sandbox - -- -little light -ikiwiki looks great diff --git a/doc/sandbox/foo.mdwn b/doc/sandbox/foo.mdwn deleted file mode 100644 index e3ec71332..000000000 --- a/doc/sandbox/foo.mdwn +++ /dev/null @@ -1 +0,0 @@ -a test blog in the ikiwiki sandbox! diff --git a/doc/sandbox/foobak.mdwn b/doc/sandbox/foobak.mdwn deleted file mode 100644 index c0b526f73..000000000 --- a/doc/sandbox/foobak.mdwn +++ /dev/null @@ -1 +0,0 @@ -Foobaka bakfoo. 
diff --git a/doc/sandbox/hello.mdwn b/doc/sandbox/hello.mdwn deleted file mode 100644 index a0e2270db..000000000 --- a/doc/sandbox/hello.mdwn +++ /dev/null @@ -1 +0,0 @@ -hehe hohoho diff --git a/doc/sandbox/ikiwiki_flexibility_makes_me_dream.mdwn b/doc/sandbox/ikiwiki_flexibility_makes_me_dream.mdwn deleted file mode 100644 index 63642c71c..000000000 --- a/doc/sandbox/ikiwiki_flexibility_makes_me_dream.mdwn +++ /dev/null @@ -1,5 +0,0 @@ -##Why? -* Because I can do things like this -* Because I can use my favourite SCM, as the rest of my project elements (that's the only reason I complain about Trac...) -* Because the perfect tool does not exist, but custommizing very simple approaches like this I can build my own -* Because I'm just adding a new topic to see how diff works diff --git a/doc/sandbox/ikiwiki_flexibility_makes_me_dream/discussion.mdwn b/doc/sandbox/ikiwiki_flexibility_makes_me_dream/discussion.mdwn deleted file mode 100644 index 8b604cb9f..000000000 --- a/doc/sandbox/ikiwiki_flexibility_makes_me_dream/discussion.mdwn +++ /dev/null @@ -1 +0,0 @@ -Perhaps diff --git a/doc/sandbox/plop.mdwn b/doc/sandbox/plop.mdwn new file mode 100644 index 000000000..e8b7c915c --- /dev/null +++ b/doc/sandbox/plop.mdwn @@ -0,0 +1 @@ +plop diff --git a/doc/sandbox/prova_blog.html b/doc/sandbox/prova_blog.html new file mode 100644 index 000000000..d69937ea8 --- /dev/null +++ b/doc/sandbox/prova_blog.html @@ -0,0 +1,8 @@ +Questa è una prova. +Vediamo se funziona + +<pre> +#!/bin/bash + +echo "ciao" +</pre> diff --git a/doc/sandbox/sidebar.mdwn b/doc/sandbox/sidebar.mdwn new file mode 100644 index 000000000..9daeafb98 --- /dev/null +++ b/doc/sandbox/sidebar.mdwn @@ -0,0 +1 @@ +test diff --git a/doc/sandbox/test.mdwn b/doc/sandbox/test.mdwn deleted file mode 100644 index db20a2fbc..000000000 --- a/doc/sandbox/test.mdwn +++ /dev/null @@ -1 +0,0 @@ -testing my openid provider @ www.steve.org.uk diff --git a/doc/sandbox/test_nested_inlines.mdwn b/doc/sandbox/test_nested_inlines.mdwn deleted file mode 100644 index 0e6074fbd..000000000 --- a/doc/sandbox/test_nested_inlines.mdwn +++ /dev/null @@ -1,3 +0,0 @@ -Testing nested inlines: - -[[!inline pages="sandbox/test_nested_inlines/* and !sandbox/test_nested_inlines/*/*" feeds="no"]] diff --git a/doc/sandbox/test_pi.mdwn b/doc/sandbox/test_pi.mdwn deleted file mode 100644 index 32b0e7f4c..000000000 --- a/doc/sandbox/test_pi.mdwn +++ /dev/null @@ -1 +0,0 @@ -I tried to make a page called **testπ** and it didn't work. So this is a page called **test pi**. diff --git a/doc/sandbox/test_post.mdwn b/doc/sandbox/test_post.mdwn deleted file mode 100644 index b687aa253..000000000 --- a/doc/sandbox/test_post.mdwn +++ /dev/null @@ -1 +0,0 @@ -hi mom!
\ No newline at end of file diff --git a/doc/sandbox/testpage.mdwn b/doc/sandbox/testpage.mdwn deleted file mode 100644 index b9439ad02..000000000 --- a/doc/sandbox/testpage.mdwn +++ /dev/null @@ -1 +0,0 @@ -Test page content. diff --git a/doc/sandbox/testsubpage.mdwn b/doc/sandbox/testsubpage.mdwn deleted file mode 100644 index 3db7aaaab..000000000 --- a/doc/sandbox/testsubpage.mdwn +++ /dev/null @@ -1,3 +0,0 @@ -# Test subpage - -This is a test subpage. Isn't that special? diff --git a/doc/sandbox/this_is_pretty_cool.mdwn b/doc/sandbox/this_is_pretty_cool.mdwn deleted file mode 100644 index 32b01f70f..000000000 --- a/doc/sandbox/this_is_pretty_cool.mdwn +++ /dev/null @@ -1 +0,0 @@ -I am very impressed.
\ No newline at end of file diff --git a/doc/sandbox/한글.mdwn b/doc/sandbox/한글.mdwn deleted file mode 100644 index e0ba32639..000000000 --- a/doc/sandbox/한글.mdwn +++ /dev/null @@ -1 +0,0 @@ -~를 어떻게 할까~ diff --git a/doc/sandbox/한글페이지.mdwn b/doc/sandbox/한글페이지.mdwn deleted file mode 100644 index 4fac7976c..000000000 --- a/doc/sandbox/한글페이지.mdwn +++ /dev/null @@ -1,2 +0,0 @@ - -Wow test diff --git a/doc/security.mdwn b/doc/security.mdwn index 939d65d01..34a005239 100644 --- a/doc/security.mdwn +++ b/doc/security.mdwn @@ -162,10 +162,11 @@ closed though. ## HTML::Template security -If the [[plugins/template]] plugin is enabled, users can modify templates -like any other part of the wiki. This assumes that HTML::Template is secure +If the [[plugins/template]] plugin is enabled, all users can modify templates +like any other part of the wiki. Some trusted users can modify templates +without it too. This assumes that HTML::Template is secure when used with untrusted/malicious templates. (Note that includes are not -allowed, so that's not a problem.) +allowed.) ---- @@ -417,3 +418,25 @@ attack. intrigeri discovered this problem on 12 Nov 2008 and a patch put in place later that day, in version 2.70. The fix was backported to testing as version 2.53.3, and to stable as version 1.33.7. + +## Insufficient blacklisting in teximg plugin + +Josh Triplett discovered on 28 Aug 2009 that the teximg plugin's +blacklisting of insecure TeX commands was insufficient; it could be +bypassed and used to read arbitrary files. This was fixed by +enabling TeX configuration options that disallow unsafe TeX commands. +The fix was released on 30 Aug 2009 in version 3.1415926, and was +backported to stable in version 2.53.4. If you use the teximg plugin, +I recommend upgrading. ([[!cve CVE-2009-2944]]) + +## javascript insertion via svg uris + +Ivan Shmakov pointed out that the htmlscrubber allowed `data:image/*` urls, +including `data:image/svg+xml`. But svg can contain javascript, so that is +unsafe. + +This hole was discovered on 12 March 2010 and fixed the same day +with the release of ikiwiki 3.20100312. +A fix was also backported to Debian etch, as version 2.53.5. I recommend +upgrading to one of these versions if your wiki can be edited by third +parties. diff --git a/doc/setup.mdwn b/doc/setup.mdwn index 89444c9a8..83409228c 100644 --- a/doc/setup.mdwn +++ b/doc/setup.mdwn @@ -47,7 +47,7 @@ Now you can go to the url it told you, and edit pages in your new wiki using the web interface. (If the web interface doesn't seem to allow editing or login, you may -need to configure [[configure_the_web_server|tips/dot_cgi]].) +need to [[configure_the_web_server|tips/dot_cgi]].) ## Checkout and edit wiki source @@ -65,8 +65,10 @@ source. (Remember to replace "foo" with the real directory name.) git clone foo.git foo.src svn checkout file://`pwd`/foo.svn/trunk foo.src + cvs -d `pwd`/foo get -P ikiwiki bzr clone foo foo.src hg clone foo foo.src + darcs get foo foo.src # TODO monotone, tla Now to edit pages by hand, go into the directory you checked out (ie, @@ -88,7 +90,7 @@ These range from changing the wiki's name, to enabling [[plugins]], to banning users and locking pages. If you log in as the admin user you configured earlier, and go to -your Preferences page, you can click on "Wiki Setup" to customize many +your Preferences page, you can click on "Setup" to customize many wiki settings and plugins. 
Some settings cannot be configured on the web, for security reasons or @@ -101,6 +103,12 @@ After making changes to this file, you need to tell ikiwiki to use it: % ikiwiki -setup foo.setup +Alternatively, you can ask ikiwiki to change settings in the file for you: + + % ikiwiki -changesetup foo.setup -plugin goodstuff + +See [[usage]] for more options. + ## Customizing file locations As a wiki compiler, ikiwiki builds a wiki from files in a source directory, diff --git a/doc/setup/byhand.mdwn b/doc/setup/byhand.mdwn index 0184d3d2a..86cff5af4 100644 --- a/doc/setup/byhand.mdwn +++ b/doc/setup/byhand.mdwn @@ -83,7 +83,7 @@ the rest of the files. A good place to put it is in a ~/.ikiwiki/ subdirectory. Most of the options, like `wikiname` in the setup file are the same as -ikiwiki's command line options (documented in [[usage]]. `srcdir` and +ikiwiki's command line options (documented in [[usage]]). `srcdir` and `destdir` are the two directories you specify when running ikiwiki by hand. Make sure that these are pointing to the right directories, and read through and configure the rest of the file to your liking. @@ -124,6 +124,12 @@ revision control. ikiwiki-makerepo svn $SRCDIR $REPOSITORY """]] +[[!toggle id=cvs text="CVS"]] +[[!toggleable id=cvs text=""" + REPOSITORY=~/wikirepo + ikiwiki-makerepo cvs $SRCDIR $REPOSITORY +"""]] + [[!toggle id=git text="Git"]] [[!toggleable id=git text=""" REPOSITORY=~/wiki.git @@ -171,7 +177,7 @@ about using the git repositories. Once your wiki is checked in to the revision control system, you should configure ikiwiki to use revision control. Edit your ikiwiki.setup, set -`rcs` to the the revision control system you chose to use. Be careful, +`rcs` to the revision control system you chose to use. Be careful, you may need to use the 'fullname'. For example, 'hg' doesn't work, you should use mercurial. Be sure to set `svnrepo` to the directory for your repository, if using subversion. Uncomment the configuration for the wrapper diff --git a/doc/setup/byhand/discussion.mdwn b/doc/setup/byhand/discussion.mdwn new file mode 100644 index 000000000..941976789 --- /dev/null +++ b/doc/setup/byhand/discussion.mdwn @@ -0,0 +1,7 @@ +What directory is the 'working copy'? There can be two interpretations: the current dir and the .git dir. + +> It is fairly common terminology amoung all version control systems to use +> "working copy" to refer to a checkout from version control, including +> copies of all the versioned files, and whatever VCS-specific cruft that +> entails. So, a working copy is everything you get when you `git clone` +> a repository. --[[Joey]] diff --git a/doc/setup/discussion.mdwn b/doc/setup/discussion.mdwn index 0501f443a..74f7740db 100644 --- a/doc/setup/discussion.mdwn +++ b/doc/setup/discussion.mdwn @@ -1,3 +1,7 @@ +I have copied over the ikiwiki.setup file from /usr/share/doc/ikiwiki/ to /etc/ikiwiki/ and run it after editing. My site gets built but when I click on the 'edit' button, firefox and google chrome download the cgi file instead of creating a way to edit it. The permissions on my ikiwiki.cgi script look like this: -rwsr-sr-x 1 root root 13359 2009-10-13 19:21 ikiwiki.cgi. Is there something I should do, i.e. change permissions, so I can get it to run correctly? (jeremiah) + +> Have a look [[here|tips/dot_cgi]]. 
--[[Jogo]] + I just went through the standard procedure described for setup, copied the blog directory from examples into my source directory, ran ikiwiki, and everything seems to have worked, except that none of the [[!meta ... ]] tags get converted. They simply show up in the html files unformatted, with no exclamation point, and with p tags around them. Any ideas? using ikiwiki version 2.40 on freebsd --mjg @@ -238,3 +242,19 @@ Thank you! I'm not a Perl programmer, so what's your opinion: is this behavior a > That is not entirely clear to me from the documentation. It doesn't > say the path has to exist, but doesn't say it cannot either. --[[Joey]] + +I am experiencing the same problem "/etc/ikiwiki/custom: failed to set up the repository with ikiwiki-makerepo +" on Debian squeeze with perl5.10.0. Upgrading to ikiwiki 3.10 fixes it. -- [Albert](http://www.docunext.com/) + +---- + +Just a note, perl 5.10 isn't packaged as part of RHEL or thus CentOS nor EPEL, +so it's not especially trivial to satisfy that requirement for ikiwiki on +those platforms, without backporting it from Fedora or building from source. +However, I have an ikiwiki 3.20100403 running on RHEL-4 supplied 5.8.8 without +(seemingly too much) complaint. How strong is the 5.10 requirement? what +precicely breaks without it? -- [[Jon]] + +> I don't remember what was the specific problem with perl 5.8.8. All I can +> find is some taint checking bugs, which are currently worked around by +> taint checking being disabled. --[[Joey]] diff --git a/doc/shortcuts.mdwn b/doc/shortcuts.mdwn index 14cd5ff2b..54dd0fdb1 100644 --- a/doc/shortcuts.mdwn +++ b/doc/shortcuts.mdwn @@ -18,7 +18,7 @@ This page controls what shortcut links the wiki supports. * [[!shortcut name=wikipedia url="http://en.wikipedia.org/wiki/%s"]] * [[!shortcut name=wikitravel url="http://wikitravel.org/en/%s"]] * [[!shortcut name=wiktionary url="http://en.wiktionary.org/wiki/%s"]] -* [[!shortcut name=debbug url="http://bugs.debian.org/%s" desc="bug #%s"]] +* [[!shortcut name=debbug url="http://bugs.debian.org/%S" desc="Debian bug #%s"]] * [[!shortcut name=deblist url="http://lists.debian.org/debian-%s" desc="debian-%s@lists.debian.org"]] * [[!shortcut name=debpkg url="http://packages.debian.org/%s"]] * [[!shortcut name=debpkgsid url="http://packages.debian.org/sid/%s"]] @@ -59,12 +59,14 @@ This page controls what shortcut links the wiki supports. * [[!shortcut name=flickr url="http://www.flickr.com/photos/%s"]] * [[!shortcut name=man url="http://linux.die.net/man/%s"]] * [[!shortcut name=ohloh url="http://www.ohloh.net/projects/%s"]] +* [[!shortcut name=cpanrt url="https://rt.cpan.org/Ticket/Display.html?id=%s" desc="CPAN RT#%s"]] +* [[!shortcut name=novellbug url="https://bugzilla.novell.com/show_bug.cgi?id=%s" desc="bug %s"]] To add a new shortcut, use the `shortcut` [[ikiwiki/directive]]. In the url, "%s" is replaced with the -text passed to the named shortcut, after url-encoding it, and '%S' is -replaced with the raw, non-encoded text. The optional `desc` parameter -controls the description of the link. +text passed to the named shortcut, after [[!wikipedia url_encoding]] +it, and '%S' is replaced with the raw, non-encoded text. The optional +`desc` parameter controls the description of the link. Remember that the `name` you give the shortcut will become a new [[ikiwiki/directive]]. Avoid using a `name` that conflicts @@ -73,5 +75,5 @@ parameter that will override the one provided at definition time. 
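For instance (purely as an illustration of the syntax), with the `debpkg`
shortcut defined in the list above, a page can link to a Debian package by
writing:

    \[[!debpkg ikiwiki]]

The text passed to the shortcut ("ikiwiki" here) is simply substituted into
the shortcut's URL, producing a link to the corresponding
packages.debian.org page.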
If you come up with a shortcut that you think others might find useful, consider contributing it to the [shortcuts page on the ikiwiki -ikiwiki](http://ikiwiki.info/shortcuts/), so that future versions of +wiki](http://ikiwiki.info/shortcuts/), so that future versions of ikiwiki will include your shortcut in the standard underlay. diff --git a/doc/style.css b/doc/style.css index e6512aed8..fa4b2a38a 100644 --- a/doc/style.css +++ b/doc/style.css @@ -4,9 +4,17 @@ * local.css and use it to override or change settings in this one. */ +/* html5 compat */ +article, +header, +footer, +nav { + display: block; +} + .header { margin: 0; - font-size: 22px; + font-size: 140%; font-weight: bold; line-height: 1em; display: block; @@ -14,16 +22,21 @@ .inlineheader .author { margin: 0; - font-size: 18px; + font-size: 112%; font-weight: bold; display: block; } .actions ul { margin: 0; - padding: 6px; + padding: 6px .4em; + height: 1em; list-style-type: none; } +.actions li { + display: inline; + padding: .2em; +} .pageheader .actions ul { border-bottom: 1px solid #000; } @@ -32,23 +45,27 @@ border-bottom: 0; } -div.inlinecontent { - margin-top: .4em; +#otherlanguages ul { + margin: 0; + padding: 6px; + list-style-type: none; } - -.actions li { +#otherlanguages li { display: inline; padding: .2em .4em; } - -.pagefooter { - clear: both; +.pageheader #otherlanguages { + border-bottom: 1px solid #000; } -.inlinefooter { - clear: both; + +.inlinecontent { + margin-top: .4em; } -.tags { +.pagefooter, +.inlinefooter, +.comments { + clear: both; } #pageinfo { @@ -56,10 +73,14 @@ div.inlinecontent { border-top: 1px solid #000; } -div.tags { +.tags { margin-top: 1em; } +.inlinepage .tags { + display: inline; +} + .mapparent { text-decoration: none; } @@ -70,6 +91,18 @@ div.tags { text-align: center; } +img.img { + margin: 0.5ex; +} + +.align-left { + float:left; +} + +.align-right { + float:right; +} + #backlinks { margin-top: 1em; } @@ -80,19 +113,28 @@ div.tags { } #editcontent { - width: 100%; + width: 98%; +} + +.editcontentdiv { + width: auto; + overflow: auto; } img { border-style: none; } +pre { + overflow: auto; +} + div.recentchanges { border-style: solid; border-width: 1px; overflow: auto; - clear: both; - width: 100%; + width: auto; + clear: none; background: #eee; color: black !important; } @@ -136,17 +178,19 @@ div.recentchanges { width: 60%; } -/* Used for adding a blog page. */ #blogform { padding: 10px 10px; border: 1px solid #aaa; background: #eee; color: black !important; + width: auto; + overflow: auto; } .inlinepage { padding: 10px 10px; border: 1px solid #aaa; + overflow: auto; } .pagedate, @@ -161,90 +205,18 @@ div.recentchanges { color: #C00; } -/* Used for invalid form fields. */ -.fb_invalid { - color: red; - background: white !important; -} - -/* Used for required form fields. */ -.fb_required { - font-weight: bold; -} - -/* Orange feed button. */ -.feedbutton { - background: #ff6600; - color: white !important; - border-left: 1px solid #cc9966; - border-top: 1px solid #ccaa99; - border-right: 1px solid #993300; - border-bottom: 1px solid #331100; - padding: 0px 0.5em 0px 0.5em; - font-family: sans-serif; - font-weight: bold; - font-size: small; - text-decoration: none; - margin-top: 1em; -} -.feedbutton:hover { - color: white !important; - background: #ff9900; -} - -/* Tag cloud. 
*/ -.pagecloud { - float: right; - width: 30%; - text-align: center; - padding: 10px 10px; - border: 1px solid #aaa; - background: #eee; - color: black !important; -} -.smallestPC { font-size: 70%; } -.smallPC { font-size: 85%; } -.normalPC { font-size: 100%; } -.bigPC { font-size: 115%; } -.biggestPC { font-size: 130%; } - -#sidebar { - line-height: 3ex; +.sidebar { width: 20ex; float: right; - margin-left: 40px; - margin-bottom: 40px; - padding: 2ex 2ex; + margin-left: 4px; + margin-bottom: 4px; + margin-top: -1px; + padding: 0ex 2ex; background: white; + border: 1px solid black; color: black !important; } -/* outlines */ -li.L1 { - list-style: upper-roman; -} -li.L2 { - list-style: decimal; -} -li.L3 { - list-style: lower-alpha; -} -li.L4 { - list-style: disc; -} -li.L5 { - list-style: square; -} -li.L6 { - list-style: circle; -} -li.L7 { - list-style: lower-roman; -} -li.L8 { - list-style: upper-alpha; -} - hr.poll { height: 10pt; color: white !important; @@ -258,6 +230,27 @@ div.poll { border: 1px solid #aaa; } +span.color { + padding: 2px; +} + +.comment-header, +.microblog-header { + font-style: italic; + margin-top: .3em; +} +.comment .author, +.microblog .author { + font-weight: bold; +} +.comment-subject { + font-weight: bold; +} +.comment { + border: 1px solid #aaa; + padding: 3px; +} + div.progress { margin-top: 1ex; margin-bottom: 1ex; @@ -274,23 +267,7 @@ div.progress-done { padding: 1px; } -input#openid_url { - background: url(wikiicons/openidlogin-bg.gif) no-repeat; - background-color: #fff; - background-position: 0 50%; - color: #000; - padding-left: 18px; -} - -input#searchbox { - background: url(wikiicons/search-bg.gif) no-repeat; - background-color: #fff; - background-position: 100% 50%; - color: #000; - padding-right: 16px; -} - -/* Things to hide in printouts. */ +/* things to hide in printouts */ @media print { .actions { display: none; } .tags { display: none; } @@ -300,7 +277,7 @@ input#searchbox { #backlinks { display: none; } } -/* Provided for use by template plugin for floating info boxes. */ +/* infobox template */ .infobox { float: right; margin-left: 2ex; @@ -312,7 +289,7 @@ input#searchbox { color: black !important; } -/* Provided for use by template plugin for floating note boxes. */ +/* notebox template */ .notebox { float: right; margin-left: 2ex; @@ -325,7 +302,7 @@ input#searchbox { color: black !important; } -/* Used by the popup template and for backlinks hiding. 
*/ +/* popup template and backlinks hiding */ .popup { border-bottom: 1px dotted #366; color: #366; @@ -346,7 +323,7 @@ input#searchbox { color: black; } -/* Formbuilder styling */ +/* form styling */ fieldset { margin: 1ex 0; border: 1px solid black; @@ -358,40 +335,37 @@ legend { float: left; margin: 2px 0; } -#signin_openid_url_label { - float: left; - margin-right: 1ex; +label.block { + display: block; } -#signin_openid { - padding: 10px 10px; - border: 1px solid #aaa; - background: #eee; - color: black !important; +label.inline { + display: inline; } - -span.color { - padding: 2px; +input#openid_identifier { + background: url(wikiicons/openidlogin-bg.gif) no-repeat; + background-color: #fff; + background-position: 0 50%; + color: #000; + padding-left: 18px; } - -.comment-header, -.microblog-header { - font-style: italic; - margin-top: .3em; +input#searchbox { + background: url(wikiicons/search-bg.gif) no-repeat; + background-color: #fff; + background-position: 100% 50%; + color: #000; + padding-right: 16px; } -.comment .author, -.microblog .author { - font-weight: bold; +/* invalid form fields */ +.fb_invalid { + color: red; + background: white !important; } -.comment-subject { +/* required form fields */ +.fb_required { font-weight: bold; } -.comment { - border: 1px solid #aaa; - padding: 3px; -} - -/* Used by the highlight plugin. */ +/* highlight plugin */ pre.hl { color:#000000; background-color:#ffffff; } .hl.num { color:#2928ff; } .hl.esc { color:#ff00ff; } @@ -407,3 +381,111 @@ pre.hl { color:#000000; background-color:#ffffff; } .hl.kwb { color:#830000; } .hl.kwc { color:#000000; font-weight:bold; } .hl.kwd { color:#010181; } + +/* calendar plugin */ +.month-calendar-day-this-day, +.year-calendar-this-month { + background-color: #eee; +} +.month-calendar-day-head, +.month-calendar-day-nolink, +.month-calendar-day-link, +.month-calendar-day-this-day, +.month-calendar-day-future { + text-align: right; +} +.month-calendar-arrow A:link, +.year-calendar-arrow A:link, +.month-calendar-arrow A:visited, +.year-calendar-arrow A:visited { + text-decoration: none; + font-weight: normal; + font-size: 150%; +} + +/* outlines */ +li.L1 { list-style: upper-roman; } +li.L2 { list-style: decimal; } +li.L3 { list-style: lower-alpha; } +li.L4 { list-style: disc; } +li.L5 { list-style: square; } +li.L6 { list-style: circle; } +li.L7 { list-style: lower-roman; } +li.L8 { list-style: upper-alpha; } + +/* tag cloud */ +.pagecloud { + float: right; + width: 30%; + text-align: center; + padding: 10px 10px; + border: 1px solid #aaa; + background: #eee; + color: black !important; +} +.smallestPC { font-size: 70%; } +.smallPC { font-size: 85%; } +.normalPC { font-size: 100%; } +.bigPC { font-size: 115%; } +.biggestPC { font-size: 130%; } + +/* orange feed button */ +.feedbutton { + background: #ff6600; + color: white !important; + border-left: 1px solid #cc9966; + border-top: 1px solid #ccaa99; + border-right: 1px solid #993300; + border-bottom: 1px solid #331100; + padding: 0px 0.5em 0px 0.5em; + font-family: sans-serif; + font-weight: bold; + font-size: small; + text-decoration: none; + margin-top: 1em; +} +.feedbutton:hover { + color: white !important; + background: #ff9900; +} + +.FlattrButton { + display: none; +} + +/* openid selector */ +#openid_choice { + display: none; +} +#openid_input_area { + clear: both; + padding: 10px; +} +#openid_btns, #openid_btns br { + clear: both; +} +#openid_highlight { + background-color: black; + float: left; +} +.openid_large_btn { + padding: 1em 1.5em; + border: 
1px solid #DDD; + margin: 3px; + float: left; +} +.openid_small_btn { + padding: 4px 4px; + border: 1px solid #DDD; + margin: 3px; + float: left; +} +a.openid_large_btn:focus { + outline: none; +} +a.openid_large_btn:focus { + -moz-outline-style: none; +} +.openid_selected { + border: 4px solid #DDD; +} diff --git a/doc/templates.mdwn b/doc/templates.mdwn index eff0e15e9..bfb6a439a 100644 --- a/doc/templates.mdwn +++ b/doc/templates.mdwn @@ -1,87 +1,80 @@ -[[!meta robots="noindex, follow"]] -[[!if test="enabled(template)" -then="This wiki has templates **enabled**." -else="This wiki has templates **disabled**." -]] - -Templates are files that can be filled out and inserted into pages in the -wiki. - -[[!if test="enabled(template) and enabled(inline)" then=""" - -These templates are available for inclusion onto other pages in this -wiki: - -[[!inline pages="templates/* and !*/discussion" feeds=no archive=yes -sort=title template=titlepage]] -"""]] - -## Using a template - -Using a template works like this: - - \[[!template id=note text="""Here is the text to insert into my note."""]] +[[Ikiwiki]] uses many templates for many purposes. By editing its templates, +you can fully customise this site. -This fills out the [[note]] template, filling in the `text` field with -the specified value, and inserts the result into the page. +Templates are located in `/usr/share/ikiwiki/templates` by default; +the `templatedir` setting can be used to make another directory be +searched first. Customised templates can also be placed inside the +"templates/" directory in your wiki's source. -Generally, a value can include any markup that would be allowed in the wiki -page outside the template. Triple-quoting the value even allows quotes to -be included in it. Combined with multi-line quoted values, this allows for -large chunks of marked up text to be embedded into a template: +Ikiwiki uses the HTML::Template module as its template engine. This +supports things like conditionals and loops in templates and is pretty +easy to learn. All you really need to know to modify templates is this: - \[[!template id=foo name="Sally" color="green" age=8 notes=""" - * \[[Charley]]'s sister. - * "I want to be an astronaut when I grow up." - * Really 8 and a half. - """]] +* To insert the value of a template variable, use `<TMPL_VAR variable>`. +* To make a block of text conditional on a variable being set use + `<TMPL_IF variable>text</TMPL_IF>`. +* To use one block of text if a variable is set and a second if it's not, + use `<TMPL_IF variable>text<TMPL_ELSE>other text</TMPL_IF>` -## Creating a template +[[!if test="enabled(template)" then=""" +## template pages -To create a template, simply add a template directive to a page, and the -page will provide a link that can be used to create the template. The template -is a regular wiki page, located in the `templates/` subdirectory inside -the source directory of the wiki. +The [[!iki ikiwiki/directive/template desc="template directive"]] allows +wiki pages to be used as templates, filled out and inserted into other +pages in the wiki. +"""]] -The template uses the syntax used by the [[!cpan HTML::Template]] perl -module, which allows for some fairly complex things to be done. Consult its -documentation for the full syntax, but all you really need to know are a -few things: +[[!if test="enabled(edittemplate)" then=""" +## default content for new pages -* Each parameter you pass to the template directive will generate a - template variable. 
There are also some pre-defined variables like PAGE - and BASENAME. -* To insert the value of a variable, use `<TMPL_VAR variable>`. Wiki markup - in the value will first be converted to html. -* To insert the raw value of a variable, with wiki markup not yet converted - to html, use `<TMPL_VAR raw_variable>`. -* To make a block of text conditional on a variable being set use - `<TMPL_IF NAME="variable">text</TMPL_IF>`. -* To use one block of text if a variable is set and a second if it's not, - use `<TMPL_IF NAME="variable">text<TMPL_ELSE>other text</TMPL_IF>` - -Here's a sample template: +The [[!iki ikiwiki/directive/edittemplate desc="edittemplate directive"]] can +be used to make new pages default to containing text from a template +page, which can be filled out as the page is edited. +"""]] - <span class="infobox"> - Name: \[[<TMPL_VAR raw_name>]]<br /> - Age: <TMPL_VAR age><br /> - <TMPL_IF NAME="color"> - Favorite color: <TMPL_VAR color><br /> - <TMPL_ELSE> - No favorite color.<br /> - </TMPL_IF> - <TMPL_IF NAME="notes"> - <hr /> - <TMPL_VAR notes> - </TMPL_IF> - </span> +[[!if test="(enabled(template) or enabled(edittemplate)) +and enabled(inline)" then=""" +[[!inline pages="templates/* and !*.tmpl and !templates/*/* and !*/discussion" +feeds=no archive=yes sort=title template=titlepage +rootpage=templates postformtext="Add a new template named:"]] +"""]] -The filled out template will be formatted the same as the rest of the page -that contains it, so you can include WikiLinks and all other forms of wiki -markup in the template. Note though that such WikiLinks will not show up as -backlinks to the page that uses the template. +## wiki templates + +These templates are used to build the wiki. The aim is to keep almost all +html out of ikiwiki and in the templates. + +* `page.tmpl` - Used for displaying all regular wiki pages. This is the + key template to customise. [[!if test="enabled(pagetemplate)" then=""" + (The [[!iki ikiwiki/directive/pagetemplate desc="pagetemplate directive"]] + can be used to make a page use a different template than `page.tmpl`.)"""]] +* `rsspage.tmpl` - Used for generating rss feeds for blogs. +* `rssitem.tmpl` - Used for generating individual items on rss feeds. +* `atompage.tmpl` - Used for generating atom feeds for blogs. +* `atomitem.tmpl` - Used for generating individual items on atom feeds. +* `inlinepage.tmpl` - Used for displaying a post in a blog. +* `archivepage.tmpl` - Used for listing a page in a blog archive page. +* `titlepage.tmpl` - Used for listing a page by title in a blog archive page. +* `microblog.tmpl` - Used for showing a microblogging post inline. +* `blogpost.tmpl` - Used for a form to add a post to a blog (and rss/atom links) +* `feedlink.tmpl` - Used to add rss/atom links if `blogpost.tmpl` is not used. +* `aggregatepost.tmpl` - Used by the aggregate plugin to create + a page for a post. +* `searchform.tmpl`, `googleform.tmpl` - Used by the search plugin + and google plugin to add search forms to wiki pages. +* `searchquery.tmpl` - This is a Omega template, used by the + search plugin. +* `comment.tmpl` - Used by the comments plugin to display a comment. +* `change.tmpl` - Used to create a page describing a change made to the wiki. +* `recentchanges.tmpl` - Used for listing a change on the RecentChanges page. +* `autoindex.tmpl` - Filled in by the autoindex plugin to make index pages. +* `autotag.tmpl` - Filled in by the tag plugin to make tag pages. 
+* `calendarmonth.tmpl`, `calendaryear.tmpl` - Used by ikiwiki-calendar to + make calendar archive pages. +* `editpage.tmpl`, `editconflict.tmpl`, `editcreationconflict.tmpl`, + `editfailedsave.tmpl`, `editpagegone.tmpl`, `pocreatepage.tmpl`, + `editcomment.tmpl` `commentmoderation.tmpl`, `renamesummary.tmpl`, + `passwordmail.tmpl`, `openid-selector.tmpl` - Parts of ikiwiki's user + interface; do not normally need to be customised. -Note the use of "raw_name" inside the [[ikiwiki/WikiLink]] generator. This -ensures that if the name contains something that might be mistaken for wiki -markup, it's not converted to html before being processed as a -[[ikiwiki/WikiLink]]. +[[!meta robots="noindex, follow"]] diff --git a/doc/templates/discussion.mdwn b/doc/templates/discussion.mdwn index 220d36455..4ddc48ac3 100644 --- a/doc/templates/discussion.mdwn +++ b/doc/templates/discussion.mdwn @@ -6,3 +6,14 @@ note and popups are templates? But they're not in the templates directory, and i > your personal wiki sources. The note and popup template pages are > installed there, typically in `/usr/share/ikiwiki/basewiki/templates/` > --[[Joey]] + + +Is there a list of the TMPL_VAR-Variables that are defined by ikiwiki? + +What I'm trying to achieve is to print the URL of every page on the page itself and therefore I would need the corresponding value in the Template. + +Am I missing something? --[[jwalzer]] + +> If there isn't a suitable variable (I don't think there is a list at +> the moment), a [[plugin|plugins/write]] to add one would be about 10 +> lines of perl - you'd just need to define a `pagetemplate` hook. --[[smcv]] diff --git a/doc/templates/gitbranch.mdwn b/doc/templates/gitbranch.mdwn index fcce925d9..962420940 100644 --- a/doc/templates/gitbranch.mdwn +++ b/doc/templates/gitbranch.mdwn @@ -3,7 +3,7 @@ Available in a [[!taglink /git]] repository.<br /> Branch: <TMPL_VAR branch><br /> Author: <TMPL_VAR author><br /> </span> -<TMPL_UNLESS NAME="branch"> +<TMPL_UNLESS branch> This template is used to create an infobox for a git branch. It uses these parameters: diff --git a/doc/templates/links.mdwn b/doc/templates/links.mdwn index 2b18bceb2..b7c3028cd 100644 --- a/doc/templates/links.mdwn +++ b/doc/templates/links.mdwn @@ -8,5 +8,9 @@ <li>[[Users|IkiWikiUsers]]</li> <li>[[SiteMap]]</li> <li>[[Contact]]</li> +<li>[[TipJar]]</li> </ul> +<a href="http://flattr.com/thing/39811/ikiwiki"> +<img src="http://api.flattr.com/button/button-compact-static-100x17.png" +alt="Flattr this" title="Flattr this" /></a> </div> diff --git a/doc/templates/note.mdwn b/doc/templates/note.mdwn index 4cc323c0e..9ef5ad942 100644 --- a/doc/templates/note.mdwn +++ b/doc/templates/note.mdwn @@ -1,7 +1,7 @@ <div class="notebox"> <TMPL_VAR text> </div> -<TMPL_UNLESS NAME="text"> +<TMPL_UNLESS text> Use this template to insert a note into a page. The note will be styled to float to the right of other text on the page. This template has one parameter: diff --git a/doc/templates/plugin.mdwn b/doc/templates/plugin.mdwn index c1d1974d6..322c49445 100644 --- a/doc/templates/plugin.mdwn +++ b/doc/templates/plugin.mdwn @@ -8,7 +8,7 @@ Currently enabled: [[!if test="enabled(<TMPL_VAR name>)" then="yes" else="no"]]< </span> [[!if test="sourcepage(plugins/contrib/*)" then="""[[!meta title="<TMPL_VAR name> (third party plugin)"]]"""]] <TMPL_IF core>[[!tag plugins/type/core]]</TMPL_IF> -<TMPL_UNLESS NAME="name"> +<TMPL_UNLESS name> This template is used to create an infobox for an ikiwiki plugin. 
It uses these parameters: <ul> diff --git a/doc/templates/popup.mdwn b/doc/templates/popup.mdwn index b355daa2e..92455eb21 100644 --- a/doc/templates/popup.mdwn +++ b/doc/templates/popup.mdwn @@ -1,4 +1,4 @@ -<TMPL_UNLESS NAME="mouseover"> +<TMPL_UNLESS mouseover> Use this template to create a popup window that is displayed when the mouse is over part of the page. This template has two parameters: <ul> diff --git a/doc/tipjar.mdwn b/doc/tipjar.mdwn index 787df9bf7..ae612e129 100644 --- a/doc/tipjar.mdwn +++ b/doc/tipjar.mdwn @@ -5,6 +5,9 @@ choose. If you'd like to fund development of a specific feature, see the <a href="https://www.paypal.com/cgi-bin/webscr?cmd=_xclick&business=joey%40kitenet%2enet&item_name=ikiwiki&no_shipping=1&cn=Comments%3f&tax=0¤cy_code=USD&lc=US&bn=PP%2dDonationsBF&charset=UTF%2d8"><img src="https://www.paypal.com/en_US/i/btn/x-click-but04.gif" alt="donate with PayPal" /></a> +<script type="text/javascript">var flattr_url = 'http://ikiwiki.info';</script> +<script src="http://api.flattr.com/button/load.js" type="text/javascript"></script> + Thanks to the following people for their kind contributions: * James Westby @@ -13,6 +16,9 @@ Thanks to the following people for their kind contributions: * Martin Krafft * Paweł Tęcza * Mick Pollard +* Nico Schottelius +* Jon Dowland +* Amitai Schlair (Note that this page is locked to prevent anyone from tampering with the PayPal button. If you prefer your donation *not* be listed here, let [[Joey]] know.) diff --git a/doc/tips/DreamHost/discussion.mdwn b/doc/tips/DreamHost/discussion.mdwn index 74f48938e..258d385ae 100644 --- a/doc/tips/DreamHost/discussion.mdwn +++ b/doc/tips/DreamHost/discussion.mdwn @@ -3,3 +3,16 @@ I managed to install ikiwiki on eggplant farms, with most basic features except I think ikiwiki is more suitable for VPS/dedicated server. Shared hosting doesn't fit. I just (2009/04/27) installed ikiwiki on DreamHost and the CPAN instructions here are unnecessarily complicated. I used "cpan" instead of "perl -MCPAN -e shell" and had no trouble with that portion of the install. --[[schmonz]] + +After tiring of managing things by hand, I've switched to using +pkgsrc as an unprivileged user. This uses a bit more disk for my +own copies of perl, python, etc., but in exchange I can `cd +.../pkgsrc/www/ikiwiki && make install` and everything just works. +Plus I get all the benefits of a package system, like easy uninstalling +and being notified of outdated or insecure software. + +The only catch: sometimes the package dependency tree gets too deep +for DreamHost's user process limit, resulting in build death. I +work around this by resuming the build partway down the tree, then +trying again from whatever I was actually trying to install. +--[[schmonz]] diff --git a/doc/tips/Importing_posts_from_Wordpress.mdwn b/doc/tips/Importing_posts_from_Wordpress.mdwn index 59330caa4..1ea82b862 100644 --- a/doc/tips/Importing_posts_from_Wordpress.mdwn +++ b/doc/tips/Importing_posts_from_Wordpress.mdwn @@ -1,9 +1,13 @@ Use case: You want to move away from Wordpress to Ikiwiki as your blogging/website platform, but you want to retain your old posts. -[This](http://git.chris-lamb.co.uk/?p=ikiwiki-wordpress-import.git) is a simple tool that generates [git-fast-import](http://www.kernel.org/pub/software/scm/git/docs/git-fast-import.html)-compatible data from a WordPress export XML file. It retains creation time of each post, so you can use Ikiwiki's <tt>--getctime</tt> to get the preserve creation times on checkout. 
+[This](http://git.chris-lamb.co.uk/?p=ikiwiki-wordpress-import.git) is a simple tool that generates [git-fast-import](http://www.kernel.org/pub/software/scm/git/docs/git-fast-import.html)-compatible data from a WordPress export XML file. WordPress categories are mapped onto Ikiwiki tags. The ability to import comments is planned. +The script uses the [BeautifulSoup][] module. + +[BeautifulSoup]: http://www.crummy.com/software/BeautifulSoup/ + ----- I include a modified version of this script. This version includes the ability to write \[[!tag foo]] directives, which the original intended, but didn't actually do. @@ -11,3 +15,88 @@ I include a modified version of this script. This version includes the ability t -- [[users/simonraven]] [[ikiwiki-wordpress-import]] + +----- + +Perhaps slightly insane, but here's an XSLT style sheet that handles my pages. It's basic, but sufficient to get started. +Note that I had to break up the ikiwiki meta strings to post this. + +-- JasonRiedy + + <?xml version="1.0" encoding="UTF-8"?> + <xsl:stylesheet version="2.0" + xmlns:xsl="http://www.w3.org/1999/XSL/Transform" + xmlns:content="http://purl.org/rss/1.0/modules/content/" + xmlns:wp="http://wordpress.org/export/1.0/"> + + <xsl:output method="text"/> + <xsl:output method="text" name="txt"/> + + <xsl:variable name='newline'><xsl:text> + </xsl:text></xsl:variable> + + <xsl:template match="channel"> + <xsl:apply-templates select="item[wp:post_type = 'post']"/> + </xsl:template> + + <xsl:template match="item"> + <xsl:variable name="idnum" select="format-number(wp:post_id,'0000')" /> + <xsl:variable name="basename" + select="concat('wp-posts/post-',$idnum)" /> + <xsl:variable name="filename" + select="concat($basename, '.html')" /> + <xsl:text>Creating </xsl:text> + <xsl:value-of select="concat($filename, $newline)" /> + <xsl:result-document href="{$filename}" format="txt"> + <xsl:text>[[</xsl:text><xsl:text>meta title="</xsl:text> + <xsl:value-of select="replace(title, '"', '&ldquo;')"/> + <xsl:text>"]]</xsl:text><xsl:value-of select="$newline"/> + <xsl:text>[[</xsl:text><xsl:text>meta date="</xsl:text> + <xsl:value-of select="pubDate"/> + <xsl:text>"]]</xsl:text><xsl:value-of select="$newline"/> + <xsl:text>[[</xsl:text><xsl:text>meta updated="</xsl:text> + <xsl:value-of select="pubDate"/> + <xsl:text>"]]</xsl:text> <xsl:value-of select="$newline"/> + <xsl:value-of select="$newline"/> + <xsl:value-of select="content:encoded"/> + <xsl:text> + + </xsl:text> + <xsl:apply-templates select="category[@domain='tag' and not(@nicename)]"> + <xsl:sort select="name()"/> + </xsl:apply-templates> + </xsl:result-document> + <xsl:apply-templates select="wp:comment"> + <xsl:sort select="date"/> + <xsl:with-param name="basename">$basename</xsl:with-param> + </xsl:apply-templates> + </xsl:template> + + <xsl:template match="wp:comment"> + <xsl:param name="basename"/> + <xsl:variable name="cnum" select="format-number(wp:comment_id, '000')" /> + <xsl:variable name="filename" select="concat($basename, '/comment_', $cnum, '._comment')"/> + <xsl:variable name="nickname" select="concat(' nickname="', wp:comment_author, '"')" /> + <xsl:variable name="username" select="concat(' username="', wp:comment_author_url, '"')" /> + <xsl:variable name="ip" select="concat(' ip="', wp:comment_author_IP, '"')" /> + <xsl:variable name="date" select="concat(' date="', wp:comment_date_gmt, '"')" /> + <xsl:result-document href="{$filename}" format="txt"> + <xsl:text>[[</xsl:text><xsl:text>comment format=html</xsl:text><xsl:value-of 
select="$newline"/> + <xsl:value-of select="$nickname"/> + <xsl:value-of select="$username"/> + <xsl:value-of select="$ip"/> + <xsl:value-of select="$date"/> + <xsl:text>subject=""</xsl:text><xsl:value-of select="$newline"/> + <xsl:text>content="""</xsl:text><xsl:value-of select="$newline"/> + <xsl:value-of select="wp:comment_content"/> + <xsl:value-of select="$newline"/> + <xsl:text>"""]]</xsl:text><xsl:value-of select="$newline"/> + </xsl:result-document> + </xsl:template> + + <xsl:template match="category"> + <xsl:text>[</xsl:text><xsl:text>[</xsl:text><xsl:text>!tag "</xsl:text><xsl:value-of select="."/><xsl:text>"]]</xsl:text> + <xsl:value-of select="$newline"/> + </xsl:template> + + </xsl:stylesheet> diff --git a/doc/tips/add_chatterbox_to_blog.mdwn b/doc/tips/add_chatterbox_to_blog.mdwn index aa35b9331..e07e36b07 100644 --- a/doc/tips/add_chatterbox_to_blog.mdwn +++ b/doc/tips/add_chatterbox_to_blog.mdwn @@ -18,4 +18,7 @@ from there, like I have on [my blog](http://kitenet.net/~joey/blog/) show=5 feeds=no]] """]] +* To filter out `@-replies`, append "and !*@*" to the [[ikiwiki/PageSpec]]. + The same technique can be used for other filtering. + Note: Works best with ikiwiki 3.10 or better. diff --git a/doc/tips/comments_feed.mdwn b/doc/tips/comments_feed.mdwn index 6f8137256..3d6a8c449 100644 --- a/doc/tips/comments_feed.mdwn +++ b/doc/tips/comments_feed.mdwn @@ -3,8 +3,15 @@ blog can have comments added to them. Pages with comments even have special feeds that can be used to subscribe to those comments. But you'd like to add a feed that contains all the comments posted to any page. Here's how: - \[[!inline pages="internal(*/comment_*)" template=comment]] + \[[!inline pages="comment(*)" template=comment]] The special [[ikiwiki/PageSpec]] matches all comments. The -[[template|wikitemplates]] causes the comments to be displayed formatted +[[template|templates]] causes the comments to be displayed formatted nicely. + +--- + +It's also possible to make a feed of comments that are held pending +moderation. + + \[[!inline pages="comment_pending(*)" template=comment]] diff --git a/doc/tips/convert_mediawiki_to_ikiwiki.mdwn b/doc/tips/convert_mediawiki_to_ikiwiki.mdwn index f03703b46..38de01109 100644 --- a/doc/tips/convert_mediawiki_to_ikiwiki.mdwn +++ b/doc/tips/convert_mediawiki_to_ikiwiki.mdwn @@ -1,4 +1,159 @@ -[[sabr]] explains how to [import MediaWiki content into -git](http://u32.net/Mediawiki_Conversion/index.html?updated), including -full edit hostory. The [[plugins/contrib/mediawiki]] plugin can then be -used by ikiwiki to build the wiki. +[[!toc levels=2]] + +Mediawiki is a dynamically-generated wiki which stores it's data in a +relational database. Pages are marked up using a proprietary markup. It is +possible to import the contents of a Mediawiki site into an ikiwiki, +converting some of the Mediawiki conventions into Ikiwiki ones. + +The following instructions describe ways of obtaining the current version of +the wiki. We do not yet cover importing the history of edits. + +Another set of instructions and conversion tools (which imports the full history) +can be found at <http://github.com/mithro/media2iki> + +## Step 1: Getting a list of pages + +The first bit of information you require is a list of pages in the Mediawiki. +There are several different ways of obtaining these. + +### Parsing the output of `Special:Allpages` + +Mediawikis have a special page called `Special:Allpages` which list all the +pages for a given namespace on the wiki. 
+ +If you fetch the output of this page to a local file with something like + + wget -q -O tmpfile 'http://your-mediawiki/wiki/Special:Allpages' + +You can extract the list of page names using the following python script. Note +that this script is sensitive to the specific markup used on the page, so if +you have tweaked your mediawiki theme a lot from the original, you will need +to adjust this script too: + + import sys + from xml.dom.minidom import parse, parseString + + dom = parse(sys.argv[1]) + tables = dom.getElementsByTagName("table") + pagetable = tables[-1] + anchors = pagetable.getElementsByTagName("a") + for a in anchors: + print a.firstChild.toxml().\ + replace('&','&').\ + replace('<','<').\ + replace('>','>') + +Also, if you have pages with titles that need to be encoded to be represented +in HTML, you may need to add further processing to the last line. + +Note that by default, `Special:Allpages` will only list pages in the main +namespace. You need to add a `&namespace=XX` argument to get pages in a +different namespace. (See below for the default list of namespaces) + +Note that the page names obtained this way will not include any namespace +specific prefix: e.g. `Category:` will be stripped off. + +### Querying the database + +If you have access to the relational database in which your mediawiki data is +stored, it is possible to derive a list of page names from this. With mediawiki's +MySQL backend, the page table is, appropriately enough, called `table`: + + SELECT page_namespace, page_title FROM page; + +As with the previous method, you will need to do some filtering based on the +namespace. + +### namespaces + +The list of default namespaces in mediawiki is available from <http://www.mediawiki.org/wiki/Manual:Namespace#Built-in_namespaces>. Here are reproduced the ones you are most likely to encounter if you are running a small mediawiki install for your own purposes: + +[[!table data=""" +Index | Name | Example +0 | Main | Foo +1 | Talk | Talk:Foo +2 | User | User:Jon +3 | User talk | User_talk:Jon +6 | File | File:Barack_Obama_signature.svg +10 | Template | Template:Prettytable +14 | Category | Category:Pages_needing_review +"""]] + +## Step 2: fetching the page data + +Once you have a list of page names, you can fetch the data for each page. + +### Method 1: via HTTP and `action=raw` + +You need to create two derived strings from the page titles: the +destination path for the page and the source URL. Assuming `$pagename` +contains a pagename obtained above, and `$wiki` contains the URL to your +mediawiki's `index.php` file: + + src=`echo "$pagename" | tr ' ' _ | sed 's,&,&,g'` + dest=`"$pagename" | tr ' ' _ | sed 's,&,__38__,g'` + + mkdir -p `dirname "$dest"` + wget -q "$wiki?title=$src&action=raw" -O "$dest" + +You may need to add more conversions here depending on the precise page titles +used in your wiki. + +If you are trying to fetch pages from a different namespace to the default, +you will need to prefix the page title with the relevant prefix, e.g. +`Category:` for category pages. You probably don't want to prefix it to the +output page, but you may want to vary the destination path (i.e. insert an +extra directory component corresponding to your ikiwiki's `tagbase`). + +### Method 2: via HTTP and `Special:Export` + +Mediawiki also has a special page `Special:Export` which can be used to obtain +the source of the page and other metadata such as the last contributor, or the +full history, etc. 
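The next paragraphs describe the request itself. As a rough sketch of the
parsing side only — assuming the export has already been saved to a local
XML file, and that it contains the usual `<page>`, `<title>` and `<text>`
elements — something along these lines, in the same spirit as the
`Special:Allpages` script above, may be a useful starting point:

    import sys
    from xml.dom.minidom import parse

    def text_of(element):
        # join together the text nodes directly inside an element
        return "".join(n.data for n in element.childNodes
                       if n.nodeType == n.TEXT_NODE)

    dom = parse(sys.argv[1])   # the XML file saved from Special:Export
    for page in dom.getElementsByTagName("page"):
        title = text_of(page.getElementsByTagName("title")[0])
        # a full-history export has one <text> per revision, oldest
        # first, so the last one is the current page source
        text = text_of(page.getElementsByTagName("text")[-1])
        print(title)
        # "text" now holds the raw wiki markup, ready for the format
        # conversion described in step 3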
+ +You need to send a `POST` request to the `Special:Export` page. See the source +of the page fetched via `GET` to determine the correct arguments. + +You will then need to write an XML parser to extract the data you need from +the result. + +### Method 3: via the database + +It is possible to extract the page data from the database with some +well-crafted queries. + +## Step 3: format conversion + +The next step is to convert Mediawiki conventions into Ikiwiki ones. + +### categories + +Mediawiki uses a special page name prefix to define "Categories", which +otherwise behave like ikiwiki tags. You can convert every Mediawiki category +into an ikiwiki tag name using a script such as + + import sys, re + pattern = r'\[\[Category:([^\]]+)\]\]' + + def manglecat(mo): + return '\[[!tag %s]]' % mo.group(1).strip().replace(' ','_') + + for line in sys.stdin.readlines(): + res = re.match(pattern, line) + if res: + sys.stdout.write(re.sub(pattern, manglecat, line)) + else: sys.stdout.write(line) + +## Step 4: Mediawiki plugin + +The [[plugins/contrib/mediawiki]] plugin can be used by ikiwiki to interpret +most of the Mediawiki syntax. + +## External links + +[[sabr]] used to explain how to [import MediaWiki content into +git](http://u32.net/Mediawiki_Conversion/index.html?updated), including full +edit history, but as of 2009/10/16 that site is not available. A copy of the +information found on this website is stored at <http://github.com/mithro/media2iki> + + diff --git a/doc/tips/convert_mediawiki_to_ikiwiki/discussion.mdwn b/doc/tips/convert_mediawiki_to_ikiwiki/discussion.mdwn index b3fe9f86c..d67a9131b 100644 --- a/doc/tips/convert_mediawiki_to_ikiwiki/discussion.mdwn +++ b/doc/tips/convert_mediawiki_to_ikiwiki/discussion.mdwn @@ -1,3 +1,11 @@ +20100428 - I just wrote a simple ruby script which will connect to a mysql server and then recreate the pages and their revision histories with Grit. It also does one simple conversion of equals titles to pounds. Enjoy! + +<http://github.com/docunext/mediawiki2gitikiwiki> + +-- [[users/Albert]] + +---- + The u32 page is excellent, but I wonder if documenting the procedure here would be worthwhile. Who knows, the remote site might disappear. But also there are some variations on the approach that might be useful: @@ -13,9 +21,31 @@ Also, some detail on converting mediawiki transclusion to ikiwiki inlines... -- [[users/Jon]] +---- + > "Who knows, the remote site might disappear.". Right now, it appears to > have done just that. -- [[users/Jon]] +I have manage to recover most of the site using the Internet Archive. What +I was unable to retrieve I have rewritten. You can find a copy of the code +at <http://github.com/mithro/media2iki> + +> This is excellent news. However, I'm still keen on there being a +> comprehensive and up-to-date set of instructions on *this* site. I wouldn't +> suggest importing that material into ikiwiki like-for-like (not least for +> [[licensing|freesoftware]] reasons), but it's excellent to have it available +> for reference, especially since it (currently) is the only set of +> instructions that gives you the whole history. +> +> The `mediawiki.pm` that was at u32.net is licensed GPL-2. I'd like to see it +> cleaned up and added to IkiWiki proper (although I haven't requested this +> yet, I suspect the way it (ab)uses linkify would disqualify it at present). +> +> I've imported Scott's initial `mediawiki.pm` into a repository at +> <http://github.com/jmtd/mediawiki.pm> as a start. 
+> -- [[Jon]] + +---- The iki-fast-load ruby script from the u32 page is given below: @@ -286,7 +316,7 @@ Mediawiki.pm - A plugin which supports mediawiki format. } - # Called to handle bookmarks like [[#heading]] or <span class="createlink"><a href="http://u32.net/cgi-bin/ikiwiki.cgi?page=%20text%20&from=Mediawiki_Plugin%2Fmediawiki&do=create" rel="nofollow">?</a>#a</span> + # Called to handle bookmarks like \[[#heading]] or <span class="createlink"><a href="http://u32.net/cgi-bin/ikiwiki.cgi?page=%20text%20&from=Mediawiki_Plugin%2Fmediawiki&do=create" rel="nofollow">?</a>#a</span> sub generate_fragment_link { my $url = shift; @@ -316,10 +346,10 @@ Mediawiki.pm - A plugin which supports mediawiki format. # Ikiwiki's link link plugin wrecks this line when displaying on the site. # Until the code highlighter plugin can turn off link finding, - # always escape double brackets in double quotes: [[ + # always escape double brackets in double quotes: \[[ if($inlink eq '..') { - # Mediawiki doesn't touch links like [[..#hi|ho]]. - return "[[" . $inlink . ($anchor?"#$anchor":"") . + # Mediawiki doesn't touch links like \[[..#hi|ho]]. + return "\[[" . $inlink . ($anchor?"#$anchor":"") . ($title?"|$title":"") . "]]" . $trailing; } @@ -380,7 +410,7 @@ Mediawiki.pm - A plugin which supports mediawiki format. add_depends($page, $redir_page); my $link=bestlink($page, underscorize(translate_path($page,$redir_page))); if (! length $link) { - return "<b>Redirect Error:</b> <nowiki>[[$redir_page]] not found.</nowiki>"; + return "<b>Redirect Error:</b> <nowiki>\[[$redir_page]] not found.</nowiki>"; } $value=urlto($link, $page); @@ -393,7 +423,7 @@ Mediawiki.pm - A plugin which supports mediawiki format. my %seen; while (exists $pagestate{$at}{mediawiki}{redir}) { if ($seen{$at}) { - return "<b>Redirect Error:</b> cycle found on <nowiki>[[$at]]</nowiki>"; + return "<b>Redirect Error:</b> cycle found on <nowiki>\[[$at]]</nowiki>"; } $seen{$at}=1; $at=$pagestate{$at}{mediawiki}{redir}; @@ -612,3 +642,24 @@ Mediawiki.pm - A plugin which supports mediawiki format. } 1 + +---- + +Hello. Got ikiwiki running and I'm planning to convert my personal +Mediawiki wiki to ikiwiki so I can take offline copies around. If anyone +has an old copy of the instructions, or any advice on where to start I'd be +glad to hear it. Otherwise I'm just going to chronicle my journey on the +page.--[[users/Chadius]] + +> Today I saw that someone is working to import wikipedia into git. +> <http://www.gossamer-threads.com/lists/wiki/foundation/181163> +> Since wikipedia uses mediawiki, perhaps his importer will work +> on mediawiki in general. It seems to produce output that could be +> used by the [[plugins/contrib/mediawiki]] plugin, if the filenames +> were fixed to use the right extension. --[[Joey]] + +>> Here's another I found while browsing around starting from the link you gave Joey<br /> +>> <http://github.com/scy/levitation><br /> +>> As I don't run mediawiki anymore, but I still have my xz/gzip-compressed XML dumps, +>> it's certainly easier for me to do it this way; also a file or a set of files is easier to lug +>> around on some medium than a full mysqld or postgres master and relevant databases. diff --git a/doc/tips/dot_cgi.mdwn b/doc/tips/dot_cgi.mdwn index 4532c84cd..da55c1f1c 100644 --- a/doc/tips/dot_cgi.mdwn +++ b/doc/tips/dot_cgi.mdwn @@ -56,6 +56,10 @@ rule that allow `ikiwiki.cgi` to be executed. server (offline). I am not sure of how secure this approach is. 
If you have any thought about it, feel free to let me know. +## nginx + +* To run CGI under nginx, just use a FastCGI wrapper like [this one](http://technotes.1000lines.net/?p=23). The wrapper must be started somehow just like any other FastCGI program. I use launchd on OSX. + ## boa Edit /etc/boa/boa.conf and make sure the following line is not commented: diff --git a/doc/tips/dot_cgi/discussion.mdwn b/doc/tips/dot_cgi/discussion.mdwn index 124b9edff..a8854565c 100644 --- a/doc/tips/dot_cgi/discussion.mdwn +++ b/doc/tips/dot_cgi/discussion.mdwn @@ -34,3 +34,13 @@ there), and so I need to choose the more secure solution. --Ivan Z. >> The easiest way though is probably >> to add your ssh key to the special user's `.ssh/authorized_keys` >> and push that way. --[[Joey]] + +## apache2 - run from userdir +Followed instructions but couldn't get it right to run from user dir (running ubuntu jaunty), +Finally got it working once I've sym linked as follow (& restarted apache): +\# ln -s ../mods-available/userdir.load . +\# ln -s ../mods-available/userdir.conf . +\# pwd +/etc/apache2/mods-enabled + + diff --git a/doc/tips/follow_wikilinks_from_inside_vim.mdwn b/doc/tips/follow_wikilinks_from_inside_vim.mdwn new file mode 100644 index 000000000..015a4ecee --- /dev/null +++ b/doc/tips/follow_wikilinks_from_inside_vim.mdwn @@ -0,0 +1,47 @@ +The [ikiwiki-nav](http://www.vim.org/scripts/script.php?script_id=2968) plugin +for vim eases the editing of IkiWiki wikis, by letting you "follow" the +wikilinks on your file (page), by loading the file associated with a given +wikilink in vim. The plugin takes care of following the ikiwiki linking rules +to figure out which file a wikilink points to + +The plugin also includes commands (and mappings) to make the cursor jump to the +previous/next wikilink in the current file + +## Jumping to pages + +To open the file associated to a wikilink, place the cursor over its text, and +hit Enter (`<CR>`). This functionality is also available through the +`:IkiJumpToPage` command + +## Moving to next/previous wikilink in current file + +`Ctrl-j` will move the cursor to the next wikilink. `Ctrl-k` will move it to the +previous wikilink. This functionality is also available through the +`:IkiNextWikiLink` command. This command takes one argument, the direction to +move into + + * `:IkiNextWikiLink 0` will look forward for the wikilink + * `:IkiNextWikiLink 1` will look backwards for the wikilink + +## Installation + +Copy the `ikiwiki_nav.vim` file to your `.vim/ftplugin` directory. + +## Current issues: + + * The plugin only works for wikilinks contained in a single text line; + multiline wikilinks are not (yet) seen as such + +## Notes + +The official releases of the plugin are in the +[vim.org script page](http://www.vim.org/scripts/script.php?script_id=2968) + +The latest version of this script can be found in the following location + +<http://git.devnull.li/cgi-bin/gitweb.cgi?p=ikiwiki-nav.git;a=blob;f=ftplugin/ikiwiki_nav.vim;hb=HEAD> + +Any feedback you can provide is appreciated; the contact details can be found +inside the plugin + +[[!tag vim]] diff --git a/doc/tips/github.mdwn b/doc/tips/github.mdwn index c3fdab734..9bdf15751 100644 --- a/doc/tips/github.mdwn +++ b/doc/tips/github.mdwn @@ -24,7 +24,7 @@ for more space, or you can migrate your site elsewhere. 
## Local Setup * On your laptop, create two empty git repositories to correspond to the github repositories: <br /> - `YOU = your github username here` <br /> + `YOU=your github username here` <br /> `mkdir ~/$YOU.github.com` <br /> `cd ~/$YOU.github.com` <br /> `git init` <br /> diff --git a/doc/tips/howto_limit_to_admin_users.mdwn b/doc/tips/howto_limit_to_admin_users.mdwn new file mode 100644 index 000000000..4d579327a --- /dev/null +++ b/doc/tips/howto_limit_to_admin_users.mdwn @@ -0,0 +1,9 @@ +Enable [[plugins/lockedit]] in your setup file. + +For example: + + add_plugins => [qw{goodstuff table rawhtml template embed typography sidebar img remove lockedit}], + +And to only allow admin users to edit the page, simply specify a pagespec for everything in the .setup: + + locked_pages => '*', diff --git a/doc/tips/htaccess_file.mdwn b/doc/tips/htaccess_file.mdwn new file mode 100644 index 000000000..6964cf24e --- /dev/null +++ b/doc/tips/htaccess_file.mdwn @@ -0,0 +1,27 @@ +If you try to include a `.htaccess` file in your wiki's source, in order to +configure the web server, you'll find that ikiwiki excludes it from +processing. In fact, ikiwiki excludes any file starting with a dot, as well +as a lot of other files, for good security reasons. + +You can tell ikiwiki not to exclude the .htaccess file by adding this to +your setup file: + + include => '^\.htaccess$', + +Caution! Before you do that, please think for a minute about who can edit +your wiki. Are attachment uploads enabled? Can users commit changes +directly to the version control system? Do you trust everyone who can +make a change to not do Bad Things with the htaccess file? Do you trust +everyone who *might* be able to make a change in the future? Note that a +determined attacker who can write to the htaccess file can probably get a +shell on your web server. + +If any of these questions have given you pause, I suggest you find a +different way to configure the web server. One way is to not put the +`.htaccess` file under ikiwiki's control, and just manually install it +in the destdir. --[[Joey]] + +[Apache's documentation](http://httpd.apache.org/docs/2.2/howto/htaccess.html) +says: +> In general, you should never use .htaccess files unless you don't have +> access to the main server configuration file. diff --git a/doc/tips/html5.mdwn b/doc/tips/html5.mdwn new file mode 100644 index 000000000..945efc4bc --- /dev/null +++ b/doc/tips/html5.mdwn @@ -0,0 +1,26 @@ +First, if you just want to embed videos using the html5 `<video>` tag, +you can do that without switching anything else to html5. +However, if you want to fully enter the brave new world of html5, read on.. + +Currently, ikiwiki does not use html5 by default. There is a `html5` +setting that can be turned on, in your setup file. Rebuild with it set, and +lots of fancy new semantic tags will be used all over the place. + +You may need to adapt your CSS for html5. While all the class and id names +are the same, some of the `div` elements are changed to other things. +Ikiwiki's default CSS will work in both modes. + +The html5 support is still experimental, and may break in some browsers. +No care is taken to add backwards compatability hacks for browsers that +are not html5 aware (like MSIE). If you want to include the javascript with +those hacks, you can edit `page.tmpl` to do so. +[Dive Into HTML5](http://diveintohtml5.org/) is a good reference for +current compatability issues and workarounds with html5. 
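For completeness, the `html5` setting mentioned above is just a boolean in a standard perl-format setup file; a minimal sketch of enabling it:

    html5 => 1,

Setting it back to 0 (or leaving it out) and rebuilding returns to the default, non-html5 output.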
+ +--- + +Known ikiwiki-specific issues: + +* [[plugins/htmltidy]] uses `tidy`, which is not html5 aware, so if you + have that enabled, it will mangle it back to html4. +* [[plugins/toc]] does not understand the html5 outline algorithm. diff --git a/doc/tips/ikiwiki_as_a_requirements_management_tool.mdwn b/doc/tips/ikiwiki_as_a_requirements_management_tool.mdwn new file mode 100644 index 000000000..6bef2619e --- /dev/null +++ b/doc/tips/ikiwiki_as_a_requirements_management_tool.mdwn @@ -0,0 +1,95 @@ +[[!template id=note text="**Table of contents** [[!toc ]]"]] + +Introduction +------------ +At work textual requirements and traceability are daily use terms, often used as contracts with clients or among stakeholders, but at the moment the only way we specify requirements is via a word processor, and traceability is managed manually (ouch!) unless we use a commercial UML (Unified Modeling Language) tool that handles office files, an also allows traceability from design, code and test artifacts. But functionality of that tool is less than basic for requirements. + +We are considering the use of a specific requirements management tool, but the problem and something that gets me really frustrated is the extremely expensive price of the licenses of the "de facto" commercial tools we should use. One floating license for both tools (requirements management and modeling) can go beyond $20,000. Of course we can purchase cheaper ones, but I'm tired of this licensing nightmare of worrying about how many licenses are being used, praying not to exceed the limit or restarting a dead license server. We pay companies to not trust us. Taking a look at the FLOSS world doesn't seem to add any reasonable alternative. + +These are the raw high level features of the tool I'd like to use: + + * Requirements edition + * Requirements attributes edition + * Traceability: edition, coverage analysis and navigation + * External traceability: from requirements in one document/module to requirements in other one (eg: software requirements tracing to system requirements). Note: a set of requirements will be called "module" hereafter in this page. + * Requirements identifiers management + * Requirements history and diff/blame + * Team work + * Easy integration with other software lifecycle tools: modeling (eg. BOUML), project management (eg. Trac) + * Support for other formats such as HTML, office... + * Filtering and searching + * Export facilities to create standards compliant documentation. + +Initial idea was to develop a simple web solution using XHTML files. These files would be created in a web broser with existing WYSIWIM editors and store all the stuff in Subversion. All requirements would be stored at the same level (no hierarchies among requirements of the same module) and atomically accessible via a simple web browser. No server side programming would be needed to read requirements. Also special XHTML files (let's call it "views") would be necessary to group requirements hierarchically in a requirements document fashion, using xinclude. + +When I first played with ikiwiki I was so happy that many of the ideas I worked on were already in use in this marvelous piece of software, specially the decision to use well-known RCS software to manage history instead of reinventing the wheel, opening also one interesting feature: off-line edition. Other similarity was the absence of special processing for read-only navigation. 
+
+So, let's now take all the features above and describe how to make them real using ikiwiki and some simple conventions. Some features would need new functionality and improvements; I'd really appreciate additional ideas on how to better get to the point.
+
+Requirements edition
+--------------------
+Suppose that all requirements reside under a concrete folder. We will call it "reqs", and under "reqs" we add as many folders as requirements modules we want to use for a system called "foo" (eg. "foo_sss" for system requirements, "foo_srs" for software requirements...). The index file for each document shall be a page summarizing the module: number of requirements, basic coverage information... Other similar pages under the "views" folder could be used in order to have different sets of requirements including additional stuff: introduction, document identification, etc... The rest of the files - the actual requirements - shall be markdown files. So editing a requirement would be as simple as adding a page to the wiki.
+
+To create the summary and views, just the [[ikiwiki/directive/inline]] and [[ikiwiki/directive/pagecount]] directives could provide nice pages. The uncomfortable part is having to use many [[pagespecs|ikiwiki/PageSpec]] to create the whole views, but it actually should work. One possible workaround would be an external tool to handle this and create the directives automatically or graphically.
+
+Requirements attributes
+-----------------------
+There is a lot of useful data to associate with a requirement. Eg:
+
+ * If it is traceable or not
+ * Its criticality level
+ * Its priority
+ * If it is functional or not
+
+How to implement this? Using [[ikiwiki/directive/meta]] could be a solution; not tried yet, as I'd rather keep the requirements content alone. Storing this information in SVN is easy; although ikiwiki does not provide a way to do it, it would imply really little effort. The requirement in itself is the content of the file; attributes are stored as key-value pairs in the file's properties. AFAIK this is a feature available only in SVN, although git has something similar (gitattributes), albeit path based; but anyway, whichever RCS is used, a ".properties" file could always be created when a requirement file is created.
+
+Traceability: edition, coverage analysis and navigation
+--------------------------------------------------------
+This is the most important feature of a requirements engineering tool. How to do this with ikiwiki? There are some ways, from extremely simple ones to more sophisticated:
+
+ * One simple solution: links. Just link from one requirement to another one to create a traceable directional connection
+ * One harder: file attributes (see the section about requirements attributes just above)
+
+For coverage analysis, using [[ikiwiki/directive/pagecount]] is the perfect solution to summarize and show covered and uncovered requirements. We could add several pages per module - probably using template pages - with ready-made coverage analysis reports... Wow!!! The [[ikiwiki/directive/linkmap]] directive can show traceability information graphically.
+
+Navigating among requirements needs... nothing!!! Just follow the links on referring pages that ikiwiki adds by default.
+
+External traceability
+---------------------
+Being just different folders under the wiki, external traceability is as easy as internal.
+
+Requirements identifiers management
+-----------------------------------
+Another useful convention: the requirement identifier shall be the name of the requirements file. 
In ikiwiki the page title is the same as the requirement Id. No trouble, it works. I personally prefer to keep the title as the page title and create short auto-incrementing numeric codes with prefixes and/or suffixes as file names (eg. SRS_FOO_0001, SSS_FOO_002); I hope to have something running soon.
+
+Requirements history and diff/blame
+-----------------------------------
+Out of the box! And really much more useful than the average diffing components of requirements management tools. There are plenty of online front ends to use, and for offline work tools like meld are awesome.
+
+Team work
+---------
+Also no need to do anything; the RCS software does it all. Also, for experienced users, merging and conflict solving can provide much more practical solutions (most requirements management tools work by locking).
+
+Easy integration with other software lifecycle tools
+----------------------------------------------------
+Modeling tools: as a general rule, store model elements as their most atomic parts: classes, enums, actors, use cases... and again use file attributes to store traceability information. Another way is transforming the files representing these atomic model parts into independent mdwn files under, for example, an "mdl" folder.
+
+Trac integration is so simple... As simple as for any PM tool that accesses the same RCS as ikiwiki does. Diffing, blaming, even navigating directly to ikiwiki generated pages. Integration of a ticketing system will give awesome power to all the team.
+
+Support for other formats
+-------------------------
+Out of the box, at least for wiki and mathematical formats, and creating additional ones shouldn't be so difficult.
+
+Filtering and searching
+-----------------------
+See that box in the top right corner?
+
+Export facilities
+-----------------
+Views with custom styles and html conversion tools would be enough for most purposes.
+
+That's all!
+
+One funny thing: our "de facto future" requirements management tool, after years of research, included some years ago a really nice feature: a Discussion tag for each requirement... See this in ikiwiki? Again, out of the box!!!
+
+Comments are really welcome!!!
diff --git a/doc/tips/ikiwiki_as_a_requirements_management_tool/discussion.mdwn b/doc/tips/ikiwiki_as_a_requirements_management_tool/discussion.mdwn new file mode 100644 index 000000000..94f0f8b4b --- /dev/null +++ b/doc/tips/ikiwiki_as_a_requirements_management_tool/discussion.mdwn @@ -0,0 +1,18 @@
+How about using tags/links to associate attributes with requirements?
+This could be as simple as adding a link, for example:
+
+ * If it is traceable or not
+ + \[[attributes/traceable]]
+ + \[[attributes/untraceable]]
+ * Its criticality level
+ + \[[attributes/level/critical]]
+ + \[[attributes/level/important]]
+ + etc.
+ * Its priority
+ + \[[attributes/priority/low]]
+ + \[[attributes/priority/high]]
+ * If it is functional or not
+ + \[[attributes/functional]]
+ + \[[attributes/non-functional]]
+
+You just have to create pages for each attribute you want, and then pagespecs could be used to filter requirements by attributes. I think something similar is used to track bugs with ikiwiki (linking to a \[[done]] page, etc.).
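To make the suggestion above concrete: assuming attribute pages like the ones listed, and a hypothetical `reqs/foo_srs` module as described in the parent page, a report page could filter and count requirements with ordinary [[pagespecs|ikiwiki/PageSpec]], for instance (directives escaped here so they show as text):

    High-priority requirements:
    \[[!inline pages="reqs/foo_srs/* and link(attributes/priority/high)" archive=yes feeds=no]]

    Requirements not yet marked traceable:
    \[[!pagecount pages="reqs/foo_srs/* and !link(attributes/traceable)"]]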
diff --git a/doc/tips/importing_posts_from_typo.mdwn b/doc/tips/importing_posts_from_typo.mdwn new file mode 100644 index 000000000..1b87e7dae --- /dev/null +++ b/doc/tips/importing_posts_from_typo.mdwn @@ -0,0 +1 @@ +[Here](http://blog.spang.cc/posts/migrating_from_typo_to_ikiwiki/) is a blog post that gives instructions and a script for importing posts from [Typo](http://typosphere.org/), a Ruby-on-Rails based blogging engine. diff --git a/doc/tips/inside_dot_ikiwiki.mdwn b/doc/tips/inside_dot_ikiwiki.mdwn index b81ffae8d..a74d00f47 100644 --- a/doc/tips/inside_dot_ikiwiki.mdwn +++ b/doc/tips/inside_dot_ikiwiki.mdwn @@ -6,9 +6,10 @@ you need/want to. ## the index -`.ikiwiki/indexdb` contains a cache of information about pages, as well -as all persisitant state about pages. It used to be a (semi) human-readable -text file, but is not anymore. +`.ikiwiki/indexdb` contains a cache of information about pages. +This information can always be recalculated by rebuilding the wiki. +(So the file is safe to delete and need not be backed up.) +It used to be a (semi) human-readable text file, but is not anymore. To dump the contents of the file, enter a perl command like this. diff --git a/doc/tips/inside_dot_ikiwiki/discussion.mdwn b/doc/tips/inside_dot_ikiwiki/discussion.mdwn index 34d5b9252..69df369ec 100644 --- a/doc/tips/inside_dot_ikiwiki/discussion.mdwn +++ b/doc/tips/inside_dot_ikiwiki/discussion.mdwn @@ -6,14 +6,15 @@ My database appears corrupted: No idea how this happened. I've blown it away and recreated it but, for future reference, is there any less violent way to recover from this situation? I miss having the correct created and last edited times. --[[sabr]] > update: fixed ctimes and mtimes using [these instructions](http://u32.net/Mediawiki_Conversion/Git_Import/#Correct%20Creation%20and%20Last%20Edited%20time) --[[sabr]] -> That's overly complex. Just run `ikiwiki -setup your.setup -getctime`. +> That's overly complex. Just run `ikiwiki -setup your.setup -gettime`. > BTW, I'd be interested in examining such a corrupt storable file to try > to see what happened to it. --[[Joey]] ->> --getctime appears to only set the last edited date. It's not supposed to set the creation date, is it? The only place that info is stored is in the git repo. +>> --gettime appears to only set the last edited date. It's not supposed to set the creation date, is it? The only place that info is stored is in the git repo. >>> Pulling the page creation date out of the git history is exactly what ->>> --getctime does. --[[Joey]] +>>> --gettime does. (It used to be called --getctime, and only do that; now +>>> it also pulls out the last modified date). --[[Joey]] >> Alas, I seem to have lost the bad index file to periodic /tmp wiping; I'll send it to you if it happens again. --[[sabr]] diff --git a/doc/tips/laptop_wiki_with_git.mdwn b/doc/tips/laptop_wiki_with_git.mdwn index 9758beb80..cfa565d1a 100644 --- a/doc/tips/laptop_wiki_with_git.mdwn +++ b/doc/tips/laptop_wiki_with_git.mdwn @@ -1,3 +1,5 @@ +[[!toc]] + Using ikiwiki with the [[rcs/git]] backend, some interesting things can be done with creating mirrors (or, really, branches) of a wiki. In this tip, I'll assume your wiki is located on a server, and you want to take a copy with @@ -8,6 +10,8 @@ version on the laptop, perhaps while offline. You can browse and edit the wiki using a local web server. When you're ready, you can manually push the changes to the main wiki on the server. 
+## simple clone approach + First, set up the wiki on the server, if it isn't already. Nothing special needs to be done here, just follow the regular instructions in [[setup]] for setting up ikiwiki with git. @@ -49,3 +53,19 @@ update the wiki, with a command such as `ikiwiki -setup wiki.setup -refresh`. If you'd like it to automatically update when changes are merged in, you can simply make a symlink `post-merge` hook pointing at the `post-update` hook ikiwiki created. + +## bare mirror approach + +As above, set up a normal ikiwiki on the server, with the usual bare repository. + +Next, `git clone --mirror server:/path/to/bare/repository` + +This will be used as the $REPOSITORY on the laptop. Then you can follow +the instructions in [[setup by hand|/setup/byhand]] as per a normal ikiwiki +installation. This means that you can clone from the local bare repository +as many times as you want (thus being able to have a repository which is +used by the ikiwiki CGI, and another which you can use for updating via +git). + +Use standard git commands, run in the laptop's bare git repository +to handle pulling from and pushing to the server. diff --git a/doc/tips/laptop_wiki_with_git_extended.mdwn b/doc/tips/laptop_wiki_with_git_extended.mdwn index 620370218..0666da450 100644 --- a/doc/tips/laptop_wiki_with_git_extended.mdwn +++ b/doc/tips/laptop_wiki_with_git_extended.mdwn @@ -10,7 +10,7 @@ a bare repo on `gitserver`, and clone that to a workingdir on gitserver. Next create a setup file for the laptop with gitorigin_branch=> "", - wrapper => "/working/dir/.git/hooks/post-commit", + git_wrapper => "/working/dir/.git/hooks/post-commit", At this point, assuming you followed page above, and not my hasty summary, @@ -21,7 +21,7 @@ a bare repo on `gitserver`, and clone that to a workingdir on gitserver. 2. Now create a setup file for the server (I call it server.setup). gitorigin_branch=> "origin", - wrapper => "/repo/wiki.git/hooks/post-update.ikiwiki" + git_wrapper => "/repo/wiki.git/hooks/post-update.ikiwiki" Note the non-standard and bizzare name of the hook. diff --git a/doc/tips/mathopd_permissions.mdwn b/doc/tips/mathopd_permissions.mdwn new file mode 100644 index 000000000..c0425b9ca --- /dev/null +++ b/doc/tips/mathopd_permissions.mdwn @@ -0,0 +1,15 @@ +When using [mathopd](http://www.mathopd.org) to serve ikiwiki, be careful of your Umask settings in the mathopd.conf. + +With `Umask 026` in mathopd.conf, editing pages resulted in the following errors and a 404 page when the wiki tried to take me to the updated page. + + append_indexes: cannot open .../[destdir]/[outputfile].html + open: Permission denied + +With `Umask 022` in mathopd.conf, editing pages works. + +Hopefully this prevents someone else from spending ~2 hours figuring out why this wouldn't work. ;) + +> More generally, if your web server uses a nonstandard umask +> or you're getting permissions related problems in the cgi log +> when using ikiwiki, you can force ikiwiki to use a sane umask +> via the `umask` setting in ikiwiki's own setup file. --[[Joey]] diff --git a/doc/tips/nearlyfreespeech/discussion.mdwn b/doc/tips/nearlyfreespeech/discussion.mdwn new file mode 100644 index 000000000..a003760b9 --- /dev/null +++ b/doc/tips/nearlyfreespeech/discussion.mdwn @@ -0,0 +1,11 @@ +with version 3.141592 I get +<pre> +HOME=/home/me /usr/bin/perl -Iblib/lib ikiwiki.out -libdir . 
-dumpsetup ikiwiki.setup +Failed to load plugin IkiWiki::Plugin::inline: Can't use global $_ in "my" at IkiWiki/Plugin/inline.pm line 198, near "my $_" +Compilation failed in require at (eval 19) line 2. +BEGIN failed--compilation aborted at (eval 19) line 2. +</pre> + +perl is 5.8.9 + +> This is fixed in 3.1415926. --[[Joey]] diff --git a/doc/tips/optimising_ikiwiki.mdwn b/doc/tips/optimising_ikiwiki.mdwn new file mode 100644 index 000000000..caed75ba6 --- /dev/null +++ b/doc/tips/optimising_ikiwiki.mdwn @@ -0,0 +1,188 @@ +Ikiwiki is a wiki compiler, which means that, unlike a traditional wiki, +all the work needed to display your wiki is done up front. Where you can +see it and get annoyed at it. In some ways, this is better than a wiki +where a page view means running a program to generate the page on the fly. + +But enough excuses. If ikiwiki is taking too long to build your wiki, +let's fix that. Read on for some common problems that can be avoided to +make ikiwiki run quick. + +[[!toc]] + +(And if none of that helps, file a [[bug|bugs]]. One other great thing about +ikiwiki being a wiki compiler is that it's easy to provide a test case when +it's slow, and get the problem fixed!) + +## rebuild vs refresh + +Are you building your wiki by running a command like this? + + ikiwiki -setup my.setup + +If so, you're always telling ikiwiki to rebuild the entire site, from +scratch. But, ikiwiki is smart, it can incrementally update a site, +building only things affected by the changes you make. You just have to let +it do so: + + ikiwiki -setup my.setup -refresh + +Ikiwiki automatically uses an incremental refresh like this when handing +a web edit, or when run from a [[rcs]] post-commit hook. (If you've +configured the hook in the usual way.) Most people who have run into this +problem got in the habit of running `ikiwiki -setup my.setup` by hand +when their wiki was small, and found it got slower as they added pages. + +## use the latest version + +If your version of ikiwiki is not [[!version]], try upgrading. New +optimisations are frequently added to ikiwiki, some of them yielding +*enormous* speed increases. + +## run ikiwiki in verbose mode + +Try changing a page, and run ikiwiki with `-v` so it will tell you +everything it does to deal with that changed page. Take note of +which other pages are rebuilt, and which parts of the build take a long +time. This can help you zero in on individual pages that contain some of +the expensive things listed below. + +## expensive inlines + +Do you have an archive page for your blog that shows all posts, +using an inline that looks like this? + + \[[!inline pages="blog/*" show=0]] + +Or maybe you have some tag pages for your blog that show all tagged posts, +something like this? + + \[[!inline pages="blog/* and tagged(foo)" show=0]] + +These are expensive, because they have to be updated whenever you modify a +matching page. And, if there are a lot of pages, it generates a large html +file, which is a lot of work. And also large RSS/Atom files, which is even +more work! + +To optimise the inline, consider enabling quick archive mode. Then the +inline will only need to be updated when new pages are added; no RSS +or Atom feeds will be built, and the generated html file will be much +smaller. + + \[[!inline pages="blog/*" show=0 archive=yes quick=yes]] + + \[[!inline pages="blog/* and link(tag)" show=0 archive=yes quick=yes]] + +Only downsides: This won't show titles set by the [[ikiwiki/directive/meta]] +directive. 
directive. And there's no RSS feed for users to use -- but if this page
+is only for the archives or tag for your blog, users should be subscribing
+to the blog's main page's RSS feed instead.
+
+For the main blog page, the inline should only show the latest N posts,
+which won't be a performance problem:
+
+	\[[!inline pages="blog/*" show=30]]
+
+## expensive maps
+
+Do you have a sitemap-type page that uses a map directive like this?
+
+	\[[!map pages="*" show=title]]
+
+This is expensive because it has to be updated whenever a page is modified.
+The resulting html file might get big and expensive to generate as you
+keep adding pages.
+
+First, consider removing the "show=title". Then the map will not show page
+titles set by the [[ikiwiki/directive/meta]] directive -- but will also
+only need to be generated when pages are added or removed, not for every
+page change.
+
+Consider limiting the map to only show the toplevel pages of your site,
+like this:
+
+	\[[!map pages="* and !*/*" show=title]]
+
+Or, alternatively, to drop from the map parts of the site that accumulate
+lots of pages, like individual blog posts:
+
+	\[[!map pages="* and !blog/*" show=title]]
+
+## sidebar issues
+
+If you enable the [[plugins/sidebar]] plugin, be careful of what you put in
+your sidebar. Any change that affects what is displayed by the sidebar
+will require an update of *every* page in the wiki, since all pages include
+the sidebar.
+
+Putting an expensive map or inline in the sidebar is the most common cause
+of problems. At its worst, it can result in any change to any page in the
+wiki requiring every page to be rebuilt.
+
+## avoid htmltidy
+
+A few plugins do neat stuff, but slowly. Such plugins are tagged
+[[plugins/type/slow]].
+
+The worst offender is possibly [[plugins/htmltidy]]. This runs an external
+`tidy` program on each page that is built, which is necessarily slow. So don't
+use it unless you really need it; consider using the faster
+[[plugins/htmlbalance]] instead.
+
+## be careful of large linkmaps
+
+[[plugins/Linkmap]] generates a cool map of links between pages, but
+it does it using the `graphviz` program. And any changes to links between
+pages on the map require an update. So, avoid using this to map a large number
+of pages with frequently changing links. For example, using it to map
+all the pages on a traditional, highly WikiLinked wiki is asking for things
+to be slow. But using it to map a few related pages is probably fine.
+
+This site's own [[plugins/linkmap]] rarely slows it down, because it
+only shows the [[index]] page, and the small set of pages that link to it.
+That is accomplished as follows:
+
+	\[[!linkmap pages="index or backlink(index)"]]
+
+## overhead of the search plugin
+
+Be aware that the [[plugins/search]] plugin has to update the search index
+whenever any page is changed. This can slow things down somewhat.
+
+## profiling
+
+If you have a repeatable change that ikiwiki takes a long time to build,
+and none of the above help, the next thing to consider is profiling
+ikiwiki.
+
+The best way to do it is:
+
+* Install [[!cpan Devel::NYTProf]]
+* `PERL5OPT=-d:NYTProf`
+* `export PERL5OPT`
+* Now run ikiwiki as usual, and it will generate a `nytprof.out` file.
+* Run `nytprofhtml` to generate html files.
+* Those can be examined to see what parts of ikiwiki are being slow.
+
+## scaling to large numbers of pages
+
+Finally, let's think about how a huge number of pages can affect ikiwiki.
+ +* Every time it's run, ikiwiki has to scan your `srcdir` to find + new and changed pages. This is similar in speed to running the `find` + command. Obviously, more files will make it take longer. + +* Also, to see what pages match a [[ikiwiki/PageSpec]] like "blog/*", it has + to check if every page in the wiki matches. These checks are done quite + quickly, but still, lots more pages will make PageSpecs more expensive. + +* The backlinks calculation has to consider every link on every page + in the wiki. (In practice, most pages only link to at most a few dozen + other pages, so this is not a `O(N^2)`, but closer to `O(N)`.) + +* Ikiwiki also reads and writes an `index` file, which contains information + about each page, and so if you have a lot of pages, this file gets large, + and more time is spent on it. For a wiki with 2000 pages, this file + will run about 500 kb. + +If your wiki will have 100 thousand files in it, you might start seeing +the above contribute to ikiwiki running slowly. diff --git a/doc/tips/parentlinks_style.mdwn b/doc/tips/parentlinks_style.mdwn index 5294e5452..f9dfa8f55 100644 --- a/doc/tips/parentlinks_style.mdwn +++ b/doc/tips/parentlinks_style.mdwn @@ -6,7 +6,7 @@ a subset of a page's parents. It also provides a few bonus possibilities, such as styling the parent links depending on their place in the path. -[[!toc ]] +[[!toc levels=2]] Content ======= @@ -77,10 +77,29 @@ following lines in `page.tmpl`: <a href="<TMPL_VAR NAME="URL">" class="height<TMPL_VAR NAME="HEIGHT">"> <TMPL_VAR NAME="PAGE"> </a> / + </TMPL_IF> </TMPL_LOOP> Then write the appropriate CSS bits for `a.height1`, etc. +Avoid showing title of toplevel index page +------------------------------------------ + +If you don't like having "index" appear on the top page of the wiki, +but you do want to see the name of the page otherwise, you can use a +special `HAS_PARENTLINKS` template variable that the plugin provides. +It is true for every page *except* the toplevel index. + +Here is an example of using it to hide the title of the toplevel index +page: + + <TMPL_LOOP NAME="PARENTLINKS"> + <a href="<TMPL_VAR NAME=URL>"><TMPL_VAR NAME=PAGE></a>/ + </TMPL_LOOP> + <TMPL_IF HAS_PARENTLINKS> + <TMPL_VAR TITLE> + </TMPL_IF> + Full-blown example ------------------ diff --git a/doc/tips/spam_and_softwaresites.mdwn b/doc/tips/spam_and_softwaresites.mdwn new file mode 100644 index 000000000..a07889e6b --- /dev/null +++ b/doc/tips/spam_and_softwaresites.mdwn @@ -0,0 +1,87 @@ +Any wiki with a form of web-editing enabled will have to deal with +spam. (See the [[plugins/blogspam]] plugin for one defensive tool you +can deploy). + +If: + + * you are using ikiwiki to manage the website for a [[examples/softwaresite]] + * you allow web-based commits, to let people correct documentation, or report + bugs, etc. + * the documentation is stored in the same revision control repository as your + software + +It is undesirable to have your software's VCS history tainted by spam and spam +clean-up commits. Here is one approach you can use to prevent this. This +example is for the [[git]] version control system, but the principles should +apply to others. 
+ +## Isolate web commits to a specific branch + +Create a separate branch to contain web-originated edits (named `doc` in this +example): + + $ git checkout -b doc + +Adjust your setup file accordingly: + + gitmaster_branch => 'doc', + +## merging good web commits into the master branch + +You will want to periodically merge legitimate web-based commits back into +your master branch. Ensure that there is no spam in the documentation +branch. If there is, see 'erase spam from the commit history', below, first. + +Once you are confident it's clean: + + # ensure you are on the master branch + $ git branch + doc + * master + $ git merge --ff doc + +## removing spam + +### short term + +In the short term, just revert the spammy commit. + +If the spammy commit was the top-most: + + $ git revert HEAD + +This will clean the spam out of the files, but it will leave both the spam +commit and the revert commit in the history. + +### erase spam from the commit history + +Git allows you to rewrite your commit history. We will take advantage of this +to eradicate spam from the history of the doc branch. + +This is a useful tool, but it is considered bad practise to rewrite the +history of public repositories. If your software's repository is public, you +should make it clear that the history of the `doc` branch in your repository +is unstable. + +Once you have been spammed, use `git rebase` to remove the spam commits from +the history. Assuming that your `doc` branch was split off from a branch +called `master`: + + # ensure you are on the doc branch + $ git branch + * doc + master + $ git rebase --interactive master + +In your editor session, you will see a series of lines for each commit made to +the `doc` branch since it was branched from `master` (or since the last merge +back into `master`). Delete the lines corresponding to spammy commits, then +save and exit your editor. + +Caveat: if there are no commits you want to keep (i.e. all the commits since +the last merge into master are either spam or spam reverts) then `git rebase` +will abort. Therefore, this approach only works if you have at least one +non-spam commit to the documentation since the last merge into `master`. For +this reason, it's best to wait until you have at least one +commit you want merged back into the main history before doing a rebase, +and until then, tackle spam with reverts. diff --git a/doc/tips/switching_to_usedirs.mdwn b/doc/tips/switching_to_usedirs.mdwn index 183ce00ac..92871439f 100644 --- a/doc/tips/switching_to_usedirs.mdwn +++ b/doc/tips/switching_to_usedirs.mdwn @@ -8,9 +8,7 @@ to usedirs, or edit your setup file and turn usedirs back off. or manually. * Since usedirs is enabled, ikiwiki will have created a bunch of new html files. Where before ikiwiki generated a `dest/foo.html`, now it will - generate `dest/foo/index.html`. But, the old html files will still be - present too. Remove them: - find dest -name \*.html -not -name index.html -exec rm {} \; + generate `dest/foo/index.html`. The old html files will be removed. * If you have a blog that is aggregated on a Planet or similar, all the items in the RSS or atom feed will seem like new posts, since their URLs have changed. See [[howto_avoid_flooding_aggregators]] for a workaround. 
diff --git a/doc/tips/untrusted_git_push.mdwn b/doc/tips/untrusted_git_push.mdwn index aef67a3db..3573a0ddf 100644 --- a/doc/tips/untrusted_git_push.mdwn +++ b/doc/tips/untrusted_git_push.mdwn @@ -74,7 +74,7 @@ Once you're done modifying the setup file, don't forget to run You'll need to arrange the permissions on your bare git repository so that user anon can write to it. One way to do it is to create a group, and put -both anon and your regular user in that group. Then make make the bare git +both anon and your regular user in that group. Then make the bare git repository owned and writable by the group. See [[rcs/git]] for some more tips on setting up a git repository with multiple committers. diff --git a/doc/tips/upgrade_to_3.0.mdwn b/doc/tips/upgrade_to_3.0.mdwn index d22813bf2..05b6d6fbd 100644 --- a/doc/tips/upgrade_to_3.0.mdwn +++ b/doc/tips/upgrade_to_3.0.mdwn @@ -45,7 +45,7 @@ contain the above, then run `ikiwiki-transition prefix_directives your.setup` ## GlobLists In 3.0, the old "GlobList" syntax for [[PageSpecs|ikiwiki/PageSpec]] is no -longer supported. A GlobList contains multiple term, but does not separate +longer supported. A GlobList contains multiple terms, but does not separate them with "and" or "or": sandbox !*/Discussion diff --git a/doc/tips/vim_syntax_highlighting.mdwn b/doc/tips/vim_syntax_highlighting.mdwn index 172b763c3..bf7104aec 100644 --- a/doc/tips/vim_syntax_highlighting.mdwn +++ b/doc/tips/vim_syntax_highlighting.mdwn @@ -1,4 +1,15 @@ -[[ikiwiki.vim]] is a vim syntax highlighting file for ikiwiki -[[ikiwiki/markdown]] files. +[ikiwiki-syntax](http://www.vim.org/scripts/script.php?script_id=3156) is a vim +syntax highlighting file for ikiwiki [[ikiwiki/markdown]] files. It highlights +directives and wikilinks. It only supports prefixed directives, i.e., +\[[!directive foo=bar baz]], not the old format with spaces. -Installation instructions are at the top of the file. +See also: [[follow_wikilinks_from_inside_vim]] + +------ + +The previous syntax definition for ikiwiki links is at [[ikiwiki.vim]]; however, +it seems to not be [[maintained +anymore|forum/navigation_of_wiki_pages_on_local_filesystem_with_vim#syn-maintenance]], +and it has some [[issues|forum/ikiwiki_vim_syntaxfile]]. + +[[!tag vim]] diff --git a/doc/tips/yaml_setup_files.mdwn b/doc/tips/yaml_setup_files.mdwn new file mode 100644 index 000000000..e8ab4f144 --- /dev/null +++ b/doc/tips/yaml_setup_files.mdwn @@ -0,0 +1,12 @@ +Here's how to convert your existing standard format ikiwiki setup file into +the new YAML format recently added to ikiwiki. + +1. First, make sure you have the [[!cpan YAML]] perl module installed. + (Run: `apt-get install libyaml-perl`) +2. Run: `ikiwiki -setup my.setup -dumpsetup my.setup --set setuptype=Yaml` + +The format of the YAML setup file should be fairly self-explanatory. + +(To convert the other way, use "setuptype=Standard" instead.) + +--[[Joey]] diff --git a/doc/todo/ACL.mdwn b/doc/todo/ACL.mdwn index e9fb2717f..dd9793233 100644 --- a/doc/todo/ACL.mdwn +++ b/doc/todo/ACL.mdwn @@ -21,6 +21,11 @@ something, that I think is very valuable. >>>> Which would rule out openid, or other fun forms of auth. And routing all access >>>> through the CGI sort of defeats the purpose of ikiwiki. --[[Ethan]] +>>>>> I think what Joey is suggesting is to use apache ACLs in conjunction +>>>>> with basic HTTP auth to control read access, and ikiwiki can use the +>>>>> information via the httpauth plugin for other ACLs (write, admin). 
But +>>>>> yes, that would rule out non-httpauth mechanisms. -- [[Jon]] + Also see [[!debbug 443346]]. > Just a few quick thoughts about this: @@ -69,3 +74,25 @@ Here is how I see it: <pre> \[[!acl user=* page=/subsite/* acl=/subsite/acl.mdwn]] </pre> + +Any idea when this is going to be finished? If you want, I am happy to beta test. + +> It's already done, though that is sorta hidden in the above. :-) +> Example of use to only allow two users to edit the tipjar page: +> locked_pages => 'tipjar and !(user(joey) or user(bob))', +> --[[Joey]] + +> > Thank you for the hint but I am being still confused (read: dense)... What I am trying to do is this: + +> > * No anonymous access. +> > * Logged in users can edit and create pages. +> > * Users can set who can edit their pages. +> > * Some pages are only viewable by admins. + +> > Is it possible? If so how?... + +>>> I don't believe this is currently possible. What is missing is the concept +>>> of page 'ownership'. -- [[Jon]] + +>>>> GAH! That is really a shame... Any chance of adding that? No, I do not really expect it to be added, after all my requirements are pushing the boundary of what a wikiwiki + should be. Nonetheless, thanks for your help! diff --git a/doc/todo/Add_HTML_support_to_po_plugin.mdwn b/doc/todo/Add_HTML_support_to_po_plugin.mdwn new file mode 100644 index 000000000..ec29e4f61 --- /dev/null +++ b/doc/todo/Add_HTML_support_to_po_plugin.mdwn @@ -0,0 +1,7 @@ +The HTML page type should be fully supported by the PO plugin: po4a's +HTML support is able to extract translatable strings and to disregard +the rest. + +This is implemented in my po branch, please review. --[[intrigeri]] + +[[!tag patch]] diff --git a/doc/todo/Add_label_to_search_form_input_field.mdwn b/doc/todo/Add_label_to_search_form_input_field.mdwn index e4e83428c..514108fba 100644 --- a/doc/todo/Add_label_to_search_form_input_field.mdwn +++ b/doc/todo/Add_label_to_search_form_input_field.mdwn @@ -47,4 +47,10 @@ The patch below adds a label for the field to improve usability: > to get it to appear higher up is to put it first, or to use Evil absolute > positioning. (CSS sucks.) --[[Joey]] -[[!tag done wishlist]] +> Update: html5 allows just adding `placeholder="Search"` to the input +> element. already works in eg, chromium. However, ikiwiki does not use +> html5 yet. --[[Joey]] + +>> [[Done]], placeholder added, in html5 mode only. + +[[!tag wishlist bugs/html5_support]] diff --git a/doc/todo/Add_nicer_math_formatting.mdwn b/doc/todo/Add_nicer_math_formatting.mdwn new file mode 100644 index 000000000..041eaee11 --- /dev/null +++ b/doc/todo/Add_nicer_math_formatting.mdwn @@ -0,0 +1,5 @@ +It would be nice to add nicer math formatting. I currently use the [[plugins/teximg]] plugin, but I wonder if [jsMath](http://www.math.union.edu/~dpvc/jsMath/) wouldn't be a better option. + +[[Will]] + +[[!tag wishlist]] diff --git a/doc/todo/CSS_classes_for_links.mdwn b/doc/todo/CSS_classes_for_links.mdwn index 5013a9d12..38db87724 100644 --- a/doc/todo/CSS_classes_for_links.mdwn +++ b/doc/todo/CSS_classes_for_links.mdwn @@ -75,3 +75,29 @@ I find CSS3 support still spotty... Here are some notes on how to do this in Ik > [[rel_attribute|rel_attribute_for_links]] now that CSS can use.) > > --[[Joey]] + +>> I had a little look at this, last weekend. I added a class definition to +>> the `htmllink` call in `linkify` in `link.pm`. It works pretty well, but +>> I'd also need to adjust other `htmllink` calls (map, inline, etc.). I found +>> other methods (CSS3 selectors, etc.) 
to be unreliable. +>> +>> Would you potentially accept a patch that added `class="internal"` to +>> various `htmllink` calls in ikiwiki? +>> +>> How configurable do you think this behaviour should be? I'm considering a +>> config switch to enable or disable this behaviour, or possibly a +>> configurable list of class names to append for internal links (defaulting +>> to an empty list for backwards compatibility)> +>> +>> As an alternative to patching the uses of `htmllink`, what do you think +>> about patching `htmllink` itself? Are there circumstances where it might be +>> used to generate a non-internal link? -- [[Jon]] + +>>> I think that the minimum configurability to get something that +>>> can be used by CSS to style the links however the end user wants +>>> is the best thing to shoot for. Ideally, no configurability. And +>>> a tip or something documenting how to use the classes in your CSS +>>> to style links so that eg, external links have a warning icon. +>>> +>>> `htmllink` can never be used to generate an external link. So, +>>> patching it seems the best approach. --[[Joey]] diff --git a/doc/todo/CVS_backend.mdwn b/doc/todo/CVS_backend.mdwn index 3c6527290..c450542e2 100644 --- a/doc/todo/CVS_backend.mdwn +++ b/doc/todo/CVS_backend.mdwn @@ -9,6 +9,8 @@ Original discussion: >> No, although the existing svn backend could fairly esily be modified into >> a CVS backend, by someone who doesn't mind working with CVS. --[[Joey]] >> ->>> Wouldn't say I don't mind, but I needed it. See [[plugins/contrib/cvs]]. --[[Schmonz]] +>>> Wouldn't say I don't mind, but I needed it. See [[rcs/cvs]]. --[[Schmonz]] + +[[done]] [[!tag wishlist]] diff --git a/doc/todo/Configurable_minimum_length_of_log_message_for_web_edits.mdwn b/doc/todo/Configurable_minimum_length_of_log_message_for_web_edits.mdwn index 7870281ae..74a4fb1b1 100644 --- a/doc/todo/Configurable_minimum_length_of_log_message_for_web_edits.mdwn +++ b/doc/todo/Configurable_minimum_length_of_log_message_for_web_edits.mdwn @@ -1,3 +1,5 @@ It would be nice to specify a minimum length for the change log for web edits, and if it's only required vs. non-required. I realise this is not going to solve the problem of crap log messages, but it helps guard against accidental submissions which one would have logged. Mediawiki/wikipedia has that option, and I find it a useful reminder. --[[madduck]] +> +1 --[[lnussel]] + [[!tag wishlist]] diff --git a/doc/todo/Fix_selflink_in_po_plugin.mdwn b/doc/todo/Fix_selflink_in_po_plugin.mdwn new file mode 100644 index 000000000..b276c075d --- /dev/null +++ b/doc/todo/Fix_selflink_in_po_plugin.mdwn @@ -0,0 +1,21 @@ +Using the po plugin, a link to /bla is present in the sidebar. +When viewing /bla in the default language, this link is detected as +a selflink. When viewing a translation of /bla, it +isn't. --[[intrigeri]] + +Fixed in my po branch. --[[intrigeri]] + +[[!tag patch done]] + +> bump? + +>> I know I've looked at 88c6e2891593fd508701d728602515e47284180c +>> before, and something about it just seemed wrong. Maybe it's +>> the triviality of the sub, which it would seem to be easy to +>> decide to refactor back into its one caller (which would reintroduce the +>> bug). --[[Joey]] + +>>> Well, I can hear and understand this. Apart of adding a comment to +>>> the sub, explaining the rationale (which is now done in my po +>>> branch), I don't know what I can do to make it not seem wrong. 
+>>> --[[intrigeri]] diff --git a/doc/todo/Google_Sitemap_protocol.mdwn b/doc/todo/Google_Sitemap_protocol.mdwn index 057a88b72..ea8ee7f03 100644 --- a/doc/todo/Google_Sitemap_protocol.mdwn +++ b/doc/todo/Google_Sitemap_protocol.mdwn @@ -34,6 +34,9 @@ for an example. You will probably need to strip out the metadata variables I >>>[xtermin.us rather than localhost](http://xtermin.us/git/?p=website.git;a=blob;f=plugins/googlesitemap.pm) is 404 now. >>> -- weakish + +Although it is not able to read the meta-data from files, using google-sitemapgen [works well for me](http://bzed.de/posts/2010/06/creating_a_google_sitemap_for_ikiwiki/) to create a sitemap for my ikiwiki installation. -- [[bzed|BerndZeimetz]] + There is a [sitemap XML standard](http://www.sitemaps.org/protocol.php) that ikiwiki needs to generate for. # Google Webmaster tools and RSS @@ -45,3 +48,13 @@ On [Google Webmaster tools](https://www.google.com/webmasters/tools) you can sub [Google should grok feeds as sitemaps.](http://www.google.com/support/webmasters/bin/answer.py?answer=34654) Or rather [[plugins/inline]] should be improved to support the [sitemap protocol](http://sitemaps.org/protocol.php) natively. -- [[Hendry]] + + +Took me a minute to figure this out so I figured I'd share the steps I took: + +* Added rss=>1 and allowrss=>1 to my setup file +* Created a new page where the RSS would be created with this content, replacing "first_page" with the page in my wiki with the earliest date: + +<pre> +\[[!inline pages="* and !*/Discussion and created_after(first_page)" archive="yes" rss="yes" ]] +</pre> diff --git a/doc/todo/Mailing_list.mdwn b/doc/todo/Mailing_list.mdwn index b6a207420..67cbbb00b 100644 --- a/doc/todo/Mailing_list.mdwn +++ b/doc/todo/Mailing_list.mdwn @@ -18,3 +18,19 @@ Does this sound okay? > todo/bugs/forum feeds, or to some other feed they create on their user page. > And there's work on making the discussion pages more structured, on > accepting comments sent via mail, etc. --[[Joey]] + +>>I was going to make the very same request, so I'm glad to know I'm not the only one who felt the need for it. + +>>I can see your reasoning, though I don't think ikiwiki has reached the level (yet) of facilitating discussion as well as a mailing list does. +>>You've already pointed out the need for (a) more structured discussion pages, (b) comments sent via mail, but I'm not sure whether that will be enough. This is because the nature of a wiki means that discussions are scattered all over the site, as people discuss in discussion pages about the given topic - and so they should. The consequence of this, however, is that one has a choice (in regard to RSS feeds) of having too much or too little. Too little, if one only feeds on news/todo/bugs/forum, since one misses out on discussions elsewhere. Too much, because the only other option appears to be subscribing to recentchanges, which will give one *everything*, whether it is relevant or not. +>>Unfortunately, I'm not really sure what the best solution is for this problem. 
+ +>> For those who might be interested, I've added the following RSS feeds to <http://www.dreamwidth.org>: +*ikiwiki_bugs_feed, +ikiwiki_forum_feed, +ikiwiki_news_feed, +ikiwiki_recent_feed, +ikiwiki_todo_feed, +ikiwiki_wishlist_feed* + +>>--[[KathrynAndersen]] diff --git a/doc/todo/More_flexible_po-plugin_for_translation.mdwn b/doc/todo/More_flexible_po-plugin_for_translation.mdwn new file mode 100644 index 000000000..3399f7834 --- /dev/null +++ b/doc/todo/More_flexible_po-plugin_for_translation.mdwn @@ -0,0 +1,5 @@ +I have a website with multi-language content, where some content is only in English, some in German, and some is available in both languages. + +The po-module currently has only one master-language, with slave languages, and a PageSpec should be considered. + +It would be nice to flag the content which should have a translation on a file-by-file basis (with some inline directive?) which could contain the information of the master-language for that file and the desired target-languages. diff --git a/doc/todo/Multiple_categorization_namespaces.mdwn b/doc/todo/Multiple_categorization_namespaces.mdwn new file mode 100644 index 000000000..3e9f8feaa --- /dev/null +++ b/doc/todo/Multiple_categorization_namespaces.mdwn @@ -0,0 +1,103 @@ +I came across this when working on converting my old blog into an ikiwiki, but I think it could be of more general use. + +The background: I have a (currently suspended, waiting to be converted) blog on the [il Cannocchiale](http://www.ilcannocchiale.it) hosting platform. Aside from the usual metatadata (title, author), il Cannocchiale also provides tags and two additional categorization namespaces: a blog-specific user-defind "column" (Rubrica) and a platform-wide "category" (Categoria). The latter is used to group and label a couple of platform-wide lists of latest posts, the former may be used in many different ways (e.g. multi-author blogs could have one column per author or so, or as a form of 'macro-tagging'). Columns are also a little more sophisticated than classical tags because you can assign them a subtitle too. + +When I started working on the conversion, my first idea was to convert Rubriche to subdirectories of an ikiwiki blog. However, this left me with a few annoying things: when rebuilding links from the import, I had to (programmatically) dive into each subdirectory to see where each post was; this would also be problematic for future posting, too. It also meant that moving a post from a Rubrica to the other would break all links (unless ikiwiki has a way to fix this automagically). And I wasn't too keen on the fact that the Rubrica would come up in the URL of the post. And finally, of course, I couldn't use this to preserve the Categoria metadata. + +Another solution I thought about was to use special deeper tags for the Rubrica and Categoria (like: `\[[!tag "Rubrica/Some name"]]`), but this is horrible, clumsy, and makes special treatment of these tags a PITN (for example you wouldn't want the Rubrica to be displayed together with the other tags, and you would want it displayed somewhere else like next to the title of the post). This solution however looks to me as the proper path, as long as tags could support totally separate namespaces. I have a tentative implementation of this `tagtype` feature at [my git clone of ikiwiki](http://git.oblomov.eu/ikiwiki). + +The feature is currently implemented as follows: a `tagtypes` config options takes an array of strings: the tag types to be defined _aside from the usual tags_. 
Each tag type automatically provides a new directive which sets up tags that different from standard tags by having a different tagbase (the same as the tagtype) and link type (again, the same as the tagtype) (a TODO item for this would to make the directive, tagbase and link type customizable). For example, for my imported blog I would define + + tagtypes => [qw{Categoria Rubrica}] + +and then in the blog posts I would have stuff like + + \[[!Categoria "LAVORO/Vita da impiegato"]] + \[[!Rubrica "Il mio mondo"]] + \[[!meta title="Blah blah"]] + \[[!meta author="oblomov"]] + + The body of the article + + \[[!tag a bunch of tags]] + +and the tags would appear at the bottom of the post, the Rubrica next to the title, etc. All of this information would end up as categories in the feeds (although I would like to rework that code to make use of namespaces, terms and labels in a different way). + +> Note [[plugins/contrib/report/discussion]]. To quote myself from the latter page: +> *I find tags as they currently exist to be too limiting. I prefer something that can be used for Faceted Tagging http://en.wikipedia.org/wiki/Faceted_classification; that is, things like Author:Fred Nurk, Genre:Historical, Rating:Good, and so on. Of course, that doesn't mean that each tag is limited to only one value, either; just to take the above examples, something might have more than one author, or have multiple genres (such as Historical + Romance).* + +> So you aren't the only one who wants to do more with tags, but I don't think that adding a new directive for each tag type is the way to go; I think it would be simpler to just have one directive, and take advantage of the new [[matching different kinds of links]] functionality, and enhance the tag directive. +> Perhaps something like this: + + \[[!tag categorica="LAVORO/Vita da impiegato" rubrica="Il mio mondo"]] + +> Part of my thinking in this is to also combine tags with [[plugins/contrib/field]], so that the tags for a page could be queried and displayed; that way, one could put them wherever you wanted on the page, using any of [[plugins/contrib/getfield]], [[plugins/contrib/ftemplate]], or [[plugins/contrib/report]]. +> --[[KathrynAndersen]] + +>> A very generic metadata framework could cover all possible usages of fields, tags, and related metadata, but keeping its _user interface_ generic would only make it hard to use. Note that this is not an objection to the idea of collapsing the fields and tags functionality (at quick glance, I cannot see a real difference between single-valued custom tagtypes and fields, but see below), but more about the syntax. + +>> I had thought about the `\[[!tag type1=value1 type2=value2]]` syntax myself, but ultimately decided against it for a number of reasons, most importantly the fact that (1) it's harder to type, (2) it's harder to spot errors in the tag types (so for example if one misspelled `categoria` as `categorica`, he might not notice it as quickly as seeing the un-parsed `\[[!categorica ]]` directive in the output html) and (3) it encourages collapsing possibly unrelated metadata together (for example, I would never consider putting the categoria information together with the rubrica one; of course with your syntax it's perfectly possible to keep them separate as well). + +>> Point (2) may be considered a downside as well as an upside, depending on perspective, of course. And it would be possible to have a set of predefined tag types to match against, like in my tagtype directive approach but with your syntax. 
+ +>>> You seem to have answered your own objections already. -- K.A. + +>>Point (3) is of course entirely in the hands of the user, but that's exactly what syntax should be about. There is nothing functionally wrong with e.g. `\[[!meta tag=sometag author=someauthor title=sometitle rubrica=somecolumn]]`, but I honestly find it horrible. + +>>> So, really, point 3 comes down to differing aesthetics. -- K.A. + +>> A solution could be to allow both syntaxes, getting to have for example `\[[!sometagtype "blah"]]` as a shortcut for `\[[!tag sometagtype="blah"]]` (or, in the more general case, `\[[!somefieldname "blah"]]` as a shortcut for `\[[!meta fieldname="blah"]]`). + +>> I would like to point out however that there are some functional differences between categorization metadata vs other metadata that might suggest to keep fields and (my extended) tags separate. For examples, in feeds you'd want all categorization metadata to fall in one place, with some appropriate manipulation (which I still have to implement, by the way), while things like author or title would go to the corresponding feed item properties. Although it all would be possible with appropriate report or template juggling, having such default metadata handled natively looks like a bonus to me. + +>>> Whereas I prefer being able to control such things with templates, because it gives more flexibility AND control. - K.A. + +>>>> Flexibility and control is good for tuning and power-usage, but sensible defaults are a must for a platform to be usable out of the box without much intervention. Moreover, there's a possible problem with what kind of data must be passed over to templates. + +Aside from the name of the plugin (and thus of the main directive), which could be `tag`, `meta`, `field` or whatever (maybe extending `meta` would be the most sensible choice), the features we want are + +1. allow multiple values per type/attribute/field/whatever (fields currently only allows one) + * Agreed about multiple values; I've been considering whether I should add that to `field`. -- K.A. +2. allow both hidden and visible references (a la tag vs taglink) + * Hidden and visible references; that's fair enough too. My approach with `ymlfront` and `getfield` is that the YAML code is hidden, and the display is done with `getfield`, but there's no reason not to use additional approaches. -- K.A. +3. allow each type/attribute/field to be exposed under multiple queries (e.g. tags and categories; this is mostly important for backwards compatibility, not sure if it might have other uses too) + * I'm not sure what you mean here. -- K.A. + * Typical example is tags: they are accessible both as `tags` and as `categories`, although the way they are presented changes a little -- G.B. +4. allow arbitrary types/attributes/fields/whatever (even 'undefined' ones) + * Are you saying that these must be typed, or are you saying that they can be user-defined? -- K.A. + * I am saying that the user should be able to define (e.g. in the config) some set of types/fields/attributes/whatever, following the specification illustrated below, but also be able to use something like `\[[!meta somefield="somevalue"]]` where `somefield` was never defined before. In this case `somefield` will have some default values for the properties described in the spec below. -- G.B. 
+
+Each type/attribute/field/whatever (predefined, user-defined, arbitrary) would thus have the following parameters:
+
+* `directive` : the name of the directive that can be used to set the value as a hidden reference; we can discuss whether, for pre- or user-defined types, leaving it undef should mean no directive at all, or a default directive matching the attribute name.
+ * I still want there to be able to be enough flexibility in the concept to enable plugins such as `yamlfront`, which sets the data using YAML format, rather than using directives. -- K.A.
+ * The possibility to use a directive does not preclude other ways of defining the field values. IOW, even if the directive `somefield` is defined, the user would still be able to use the syntax `\[[!meta somefield="somevalue"]]`, or any other syntax (such as YAML). -- G.B.
+* `linkdirective` : the name of the directive that can be used for a visible reference; no such directive would be defined by default
+* `linktype` : link type for (hidden and visible) references
+ * Is this the equivalent of "field name"? -- K.A.
+ * This would be such by default, but it could be set to something different. [[Typed links|matching_different_kinds_of_links]] is a very recent ikiwiki feature. -- G.B.
+* `linkbase` : akin to the tagbase parameter
+ * Is this a field-name -> directory mapping? -- K.A.
+ * yes, with each directory having one page per value. It might not make sense for all fields, of course -- G.B.
+ * (nods) I've been working on something similar with my unreleased `tagger` module. In that, by default, the field-name maps to the closest wiki-page of the same name. Thus, if one had the field "genre=poetry" on the page fiction/stories/mary/lamb, then that would map to fiction/genre/poetry if fiction/genre existed. --K.A.
+ * that's the idea. In your case you could have the linkbase of genre be fiction/genre, and it would be created if it was missing. -- G.B.
+* `queries` : list of template queries this type/attribute/field/whatever is exposed to
+ * I'm not sure what you mean here. -- K.A.
+ * as mentioned before, some fields may be made accessible through different template queries, in different forms. This is the case already for tags, which also come up in the `categories` query (used by Atom and RSS feeds). -- G.B.
+ * Ah, do you mean that the input value is the same, but the output format is different? Like the difference between TMPL_VAR NAME="FOO" and TMPL_VAR NAME="raw_FOO"; one is htmlized, and the other is not. -- K.A.
+ * Actually this is about the same information appearing in different queries (e.g. NAME="FOO" and NAME="BAR"). Example: say that I defined a "Rubrica" field. I would want both tags and Rubrica values to appear in the `categories` template query, but only tags would appear in the `tags` query, and only Rubrica values to appear in `rubrica` queries. The issue of different output formats was presented in the next paragraph instead. -- G.B.
+
+Where this approach is limiting is in the kind of data that is passed to (template) queries. The value of the metadata fields might need some massaging (e.g. compare how tags are passed to the tags queries vs the categories queries, or also see what is done with the fields in the current `meta` plugin). I have trouble picturing an easy way to make this possible user-side (i.e. via templates and not in Perl modules). Suggestions welcome.
+
+One possibility could be to have the `queries` configuration allow a hash mapping query names to functions that would transform the data.
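To make that last idea a bit more concrete, here is a rough sketch of how such a per-field declaration might look in the setup file. Every name in it is hypothetical (`field_types`, the `Rubrica` entry and the transform subs); it is only meant to illustrate how a `queries` hash mapping query names to transform functions could sit alongside the parameters listed above:

    field_types => {
        Rubrica => {
            directive     => 'Rubrica',      # hidden reference, e.g. \[[!Rubrica "Il mio mondo"]]
            linkdirective => 'Rubricalink',  # visible reference, e.g. \[[!Rubricalink "Il mio mondo"]]
            linktype      => 'rubrica',      # typed link, usable when matching different kinds of links
            linkbase      => 'rubriche',     # one page per value, created under rubriche/
            # expose the values to several template queries, massaging the
            # data differently for each query
            queries       => {
                rubrica    => sub { return map { +{ name     => $_ } } @_ },
                categories => sub { return map { +{ category => $_ } } @_ },
            },
        },
    },

(Whether such declarations would live in the setup file, in a directive, or both is of course part of the open design question discussed above.)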
Lacking that possibility, we might have to leave some predefined fields to have custom Perl-side treatment and leave custom fields to be untransformable. + +----- + +I've now updated the [[plugins/contrib/field]] plugin to have: + +* arrays (multi-valued fields) +* the "linkbase" option as mentioned above (called field_tags), where the linktype is the field name. + +I've also updated [[plugins/contrib/ftemplate]] and [[plugins/contrib/report]] to be able to use multi-valued fields, and [[plugins/contrib/ymlfront]] to correctly return multi-valued fields when they are requested. + +--[[KathrynAndersen]] diff --git a/doc/todo/OpenSearch.mdwn b/doc/todo/OpenSearch.mdwn index e63ded688..c35da54e1 100644 --- a/doc/todo/OpenSearch.mdwn +++ b/doc/todo/OpenSearch.mdwn @@ -15,4 +15,24 @@ contain the wiki title from `ikiwiki.setup`. --[[JoshTriplett]] +> I support adding this. I think all that is needed, beyond the simple task +> of adding the link header, is to make the search plugin write out +> the xml file, probably based on a template. +> +> One problem is that the +> [specification](http://www.opensearch.org/Specifications/OpenSearch/1.1#OpenSearch_description_document) +> for the XML file contains a number of silly limits to field lenghs. +> For example, it wants a "ShortName" that identifies the search engine, +> to be 16 characters or less. The Description is limited to 1024, +> the LongName to 48. This limits what existing config settings can be +> reused for those. +> +> Another semi-problem is that the specification saz: +> +>> OpenSearch description documents should include at least one Query element of role="example" that is expected to return search results. Search clients may use this example query to validate that the search engine is working properly. +> +> How should ikiwiki know what example query will return actual results? +> (How would a client know if a HTML page contains results or not, anyway?) +> Sillyness. Ignore this? --[[Joey]] + [[wishlist]] diff --git a/doc/todo/Option_to_make_title_an_h1__63__.mdwn b/doc/todo/Option_to_make_title_an_h1__63__.mdwn index f4023d6dd..8345cd010 100644 --- a/doc/todo/Option_to_make_title_an_h1__63__.mdwn +++ b/doc/todo/Option_to_make_title_an_h1__63__.mdwn @@ -11,4 +11,4 @@ Currently, the page title (either the name of the page or the title specified wi > latter, making `#` (only when on the first line) set the page title, removing it from > the page body. --[[JasonBlevins]], October 22, 2008 - [h1title]: http://code.jblevins.org/ikiwiki/plugins.git/plain/h1title.pm + [h1title]: http://jblevins.org/git/ikiwiki/plugins.git/plain/h1title.pm diff --git a/doc/todo/Raw_view_link.mdwn b/doc/todo/Raw_view_link.mdwn index fd64074c2..b62a9022b 100644 --- a/doc/todo/Raw_view_link.mdwn +++ b/doc/todo/Raw_view_link.mdwn @@ -4,7 +4,7 @@ The configuration setting for Mercurial could be something like this: rawurl => "http://localhost:8000//raw-file/tip/\[[file]]", -> What I want to do when I want to see if the raw source is either +> What I do when I want to see if the raw source is either > click on the edit link, or click on history and navigate to it in the > history browser (easier to do in viewvc than in gitweb, IIRC). > Not that I'm opposed to the idea of a plugin that adds a Raw link @@ -14,4 +14,6 @@ The configuration setting for Mercurial could be something like this: >> to gitweb/etc. I think Will's patch is a good approach, and have improved >> on it a bit in a git branch. 
+>>> Since that is merged in now, I'm marking this [[done]] --[[Joey]] + [[!tag wishlist]] diff --git a/doc/todo/Resolve_native_reStructuredText_links_to_ikiwiki_pages.mdwn b/doc/todo/Resolve_native_reStructuredText_links_to_ikiwiki_pages.mdwn new file mode 100644 index 000000000..6e0f32fd5 --- /dev/null +++ b/doc/todo/Resolve_native_reStructuredText_links_to_ikiwiki_pages.mdwn @@ -0,0 +1,333 @@ +_NB! this page has been refactored, hopefully it is clearer now_ +_I propose putting discussion posts somewhere in the vincity of +the secttion Individual reStructuredText Issues_ + +## Design ## + +**Goal** + +To be able to use rst as a first-class markup language in ikiwiki. I think +most believe this is almost impossible (ikiwiki is built around markdown). + +## Wikilinks ## + +**WikiLinks**, first and foremost, are needed for a wiki. rST already allows +specifying absolue and relative URL links, and relative links can be used to +tie together wiki of rst documents. + +1. Below are links to a small, working implementation for resolving + undefined rST references using ikiwiki's mechanism. This is **Proposal 1** + for rst WikiLinks. + +2. Looking over at rST-using systems such as trac and MoinMoin; I think it + would be wiser to implement wikilinks by the `:role:` mechanism, together + with allowing a custom URL scheme to point to wiki links. This is + **Proposal 2**. + + This is a simple wiki page, with :wiki:`WikiLinks` and other_ links + + .. _other: wiki:wikilink + + We can get rid of the role part as well for WikiLinks:: + + .. default-role:: wiki + + Enables `WikiLinks` but does not impact references such as ``other`` + This can be made the default for ikiwiki. + +Benefits of using a `:role:` and a `wiki: page/subpage` URL scheme are +following: + +1. rST documents taken out of the context (the wiki) will not fail as bad as + if they have lots of Proposal-1 links: They look just the same as valid + references, and you have to edit them all. + In contrast, should the `:wiki:` role disappear, one line is enough + to redefined it and silence all the warnings for the document: + + .. role:: wiki (title) + +### Implementation ### + +Implementation of Proposal-2 wikilinks are in the branch +[rst-wikilinks][rst-wl] + + + This is a simple wiki page, with :wiki:`WikiLinks` and |named| links + + .. |named| wiki:: Some Page + + We can get rid of the role part as well for WikiLinks:: + + .. default-role:: wiki + + Enables `WikiLinks` but does not impact references such as ``named`` + This can be made the default for ikiwiki. + +[rst-wl]: http://github.com/engla/ikiwiki/commits/rst-wikilinks + +**rst-wikilinks** patch series includes changes at the end to use ikiwiki's +'htmllink' for the links (which is the only sane thing to do to work in all configurations). +This means a :wiki:`Link` should render just exactly like [[Link]] whether +the target exists or not. + +On top of **rst-wikilinks** is [rst-customize][rst-custom] which adds two +power user features: Global (python) file to read in custom directives +(unsafe), and a wikifile as "header" file for all parsed .rst files (safe, +but disruptive since all .rst depend on it). Well, the customizations have +to be picked and chosen from this, but at least the global python file can +be very convenient. + +> Did you consider just including the global rst header text into an item +> in the setup file? --[[Joey]] +> +>> Then `rst_header` would not be much different from the python script +>> `rst_customize`. 
rst_header is as safe as other files (though disruptive +>> as noted), so it should/could be a editable file in the Wiki. A Python +>> script of course can not be. There is nothing you can do in the +>> rst_header (that you sensibly would do, I think) that couldn't be done in +>> the Python script. `rst_header` has very limited use, but it is another +>> possibility, mainly for the user-editable aspect. --[[ulrik]] +>> +>> (I foresaw only two things to be added to the rst_header: the default +>> role could be configured there (as with rst_wikirole), and if you have a +>> meta-role like :shortcut:, shortcuts could be defined there.) +> +> I have some discussion on the [docutils mailing list][dml], the developers +> of docutils seems to favor "Proposal 1", while I defend my ideas. They +> want all users of ReST to use only the basic featureset to remain +> compatible, of course. -- [[ulrik]] + +[dml]: http://thread.gmane.org/gmane.text.docutils.user/5376 + +Some rst-custom [examples are here](http://kaizer.se/wiki/rst_examples/) + +[rst-custom]: http://github.com/engla/ikiwiki/commits/rst-customize + +## Directives ## + +Now **Directives**: As it is now, ikiwiki goes though (roughly): +filter, preprocess, htmlize, format as major stages of content +transformation. rST has major problems to work with any HTML that enters the +picture before it. + +1. Formatting rST in `htmlize` (as is done now): Raw html can be escaped by + raw blocks: + + .. raw:: html + + \[[!inline and do stuff]] + + (This can be simplified to alias the above as `.. ikiwiki::`) + This escape method works, if ikwiki can be persuaded to maintain the + indent when inserting html, so that it stays inside the raw block. + +2. Formatting rST in `filter` (idea) + 1. rST does not have to see any HTML (raw not needed) + 2. rST directives can alias ikiwiki syntax: + + ..ikiwiki:: inline pages= ... + + 3. Using rST directives as ikiwiki directives can be complicated; + but rST directives allow a direct line (after :: on first line), + an option list, and a content block. + +> You've done a lot of work already, but ... +> +> The filter approach seems much simpler than the other approaches +> for users to understand, since they can just use identical ikiwiki +> markup on rst pages as they would use anywhere else. This is very desirable +> if the wiki allows rst in addition to mdwn, since then users don't have +> to learn two completly different ways of doing wikilinks and directives. +> I also wonder if even those familiar with rst would find entirely natural +> the ways you've found to shoehorn in wikilinks, named wikilinks, and ikiwiki +> directives? +> +> Htmlize in filter avoids these problems. It also leaves open the possibility +> that ikiwiki could become smarter about the rendering chain later, and learn +> to use a better order for rst (ie, htmlize first). If that later happened, +> the htmlize in filter hack could go away. --[[Joey]] + +> (BTW, the [[plugins/txt]] plugin already does html formatting +> in filter, for similar reasons.) --[[Joey]] + +>> Thank you for the comments! Forget the work, it's not so much. +>> I'd rank the :wiki: link addition pretty high, and the other changes way +>> behind that: +>> +>> The :wiki:`Wiki Link` syntax is *very* appropriate as rst syntax +>> since it fits well with other uses of roles (notice that :RFC:`822` +>> inserts a link to RFC822 etc, and that the default role is a *title* role +>> (title of some work); thus very appropriate for medium-specific links like +>> wiki links. 
So I'd rank :wiki: links a worthwhile addition regardless of +>> outcome here, since it's a very rst-like alternative for those who wish to +>> use more rst-like syntax (and documents degrades better outside the wiki as +>> noted). +>> +>>> Unsure about the degredation argument. It will work some of +>>> the time, but ikiwiki's [[ikiwiki/subpage/linkingrules]] +>>> are sufficiently different from normal html relative link +>>> rules that it often won't work. --[[Joey]] +>>> +>>>> With degradation I mean that if you take a file out of the wiki; the +>>>> links degrade to stylized text. If using default role, they degrade to +>>>> :title: which renders italicized text (which I find is exactly +>>>> appropriate). There is no way for them to degrade into links, except of +>>>> course if you reimplement the :wiki: role. You can also respecify +>>>> either the default role (the `wikilink` syntax) or the :wiki: role (the +>>>> :wiki:`wikilink` syntax) to any other markup, for example None. +>>>> --[[ulrik]] +>> +>> The named link syntax (just like the :wiki: role) are inspired from +>> [trac][tracrst] and a good fit, but only if the wiki is committed to +>> using only rst, which I don't think is the case. +>> +>> The rst-customize changes are very useful for custom directive +>> installations (like the sourcecode directive, or shortcut roles I show +>> in the examples page), but there might be a way for the user to inject +>> docutils addons that I'm missing (one very ugly way would be to stick +>> them in sitecustomize.py which affects all Python programs). +>> +>> With the presented changes, I already have a working RestructuredText +>> wiki, but I'm admitting that using .. raw:: html around all directives is +>> very ugly (I use few directives: inline, toggle, meta, tag, map) +>> +>> On filter/htmlize: Well **rst** is clearly antisocial: It can't see HTML, +>> and ikiwiki directives are wrappend in paragraph tags. (For wikilinks +>> this is probably no problem). So the suggestion about `.. ikiwiki:` is +>> partly because it looks good in rst syntax, but also since it would emit +>> a div to wrap around the element instead of a paragraph. +>> +>> I don't know if you mean that rst could be reordered to do htmlize before +>> other phases? rst must be before any preprocess hook to avoid seeing any +>> HTML. +>> +>>> One of my long term goals is to refactor all the code in ikiwiki +>>> that manually runs the various stages of the render pipeline, +>>> into one centralized place. Once that's done, that place can get +>>> smart about what order to run the stages, and use a different +>>> order for rst. --[[Joey]] +>> +>> If I'm thinking right, processing to HTML already in filter means any +>> processing in scan can be reused directly (or skipped if it's legal to +>> emit 'add_link' in filter.) +>> +>> -- [[ulrik]] + +>>> Seems it could be, yes. --[[Joey]] +>>> +>>>> It is not clear how we can work around reST wrapping directives with +>>>> paragraph tags. Also, some escaping of xml characters & <> might +>>>> happen, but I can't imagine right now what breakage can come from that. +>>>> -- [[ulrik]] + +[tracrst]: http://trac.edgewall.org/wiki/WikiRestructuredText + +### Implementation ### + +Preserving indents in the preprocessor are in branch [pproc-indent][ppi] + +(These simple patches come with a warning: _Those are the first lines of +Perl I've ever written!_) + +> This seems like a good idea, since it solves issues for eg, indented +> directives in mdwn as well. 
But, looking at the diff, I see a clear bug: +> +> - return "[[!$command <span class=\"error\">". +> + $result = "[[!$command <span class=\"error\">". +> +> That makes it go on and parse an infinitely nested directive chain, instead +> of immediatly throwing an error. +> +> Also, it seems that the "indent" matching in the regexps may be too broad, +> wouldn't it also match whitespace before a directive that was not at the beginning +> of a line, and treat it as an indent? With some bad luck, that could cause mdwn +> to put the indented output in a pre block. --[[Joey]] +> +>> You are probably right about the bug. I'm not quite sure what the nested +>> directives examples looks like, but I must have overlooked how the +>> recursion counter works; I thought simply changing if to elif the next +>> few lines would solve that. I'm sorry for that! +>> +>> We don't have to change the `$handle` function at all, if it is possible +>> to do the indent substitution all in one line instead of passing it to +>> handle, I don't know if it is possible to turn: +>> +>> $content =~ s{$regex}{$handle->($1, $2, $3, $4, $5)}eg; +>> +>> into +>> +>> $content =~ s{$regex}{s/^/$1/gm{$handle->($2, $3, $4, $5)}}eg; +>> +>> Well, no idea how that would be expressed, but I mean, replace the indent +>> directly in $handle's return value. +>> +>>> Yes, in effect just `indent($1, handle->($2,$,4))` --[[Joey]] +>> +>> The indent-catching regex is wrong in the way you mention, it has been +>> nagigng my mind a bit as well; I think matching start of line + spaces +>> and tabs is the only thing we want. +>> -- [[ulrik]] +>> +>>> Well, seems you want to match the indent at the start of the line containing +>>> the directive, even if the directive does not start the line. That would +>>> be quite hard to make a regexp do, though. --[[Joey]] +>> +>> I wasted a long time getting the simpler `indent($1, handle->($2,$,4))` to +>> work (remember, I don't know perl at all). Somehow `$1` does not arrive, I +>> made a simple testcase that worked, and I conclude something inside $handle +>> results in the value of $1 not arriving as it should! +>> +>> Anyway, instead a very simple incremental patch is in [pproc-indent][ppi] +>> where the indentation regex is `(^[ \t]+|)` instead, which seems to work +>> very well (and the regex is multiline now as well). I'm happy to rebase the +>> changes if you want or you can just squash the four patches 1+3 => 1+1 +>> -- [[ulrik]] + +[ppi]: http://github.com/engla/ikiwiki/commits/pproc-indent + +## Discussion ## + +I guess you (or someone) has been through this before and knows why it +simply won't work. But I hoped there was something original in the above; +and I know there are wiki installations where rST works. --ulrik + +**Individual reStructuredText Issues** + +* We resolve rST links without definition, we don't help resolving defined + relative links, so we don't support specifying link name and target + separately. + + * Resolved by |replacement| links with the wiki:: directive. + +**A first implementation: Resolving unmatched links** + +I have a working minimal implementation letting the rst renderer resolve +undefined native rST links to ikiwiki pages. I have posted it as one patch at: + +Preview commit: http://github.com/engla/ikiwiki/commit/486fd79e520da1d462f00f40e7a90ab07e9c6fdf +Repository: git://github.com/engla/ikiwiki.git + +Design issues of the patch: + +The page is rST-parsed once in 'scan' and once in 'htmlize' (the first to generate backlinks). 
Can the parse output be safely reused? + +> The page content fed to htmlize may be different than that fed to scan, +> as directives can change the content. If you cached the input and output +> at scan time, you could reuse the cached data at htmlize time for inputs +> that are the same -- but that could be a very big cache! --[[Joey]] + +>> I would propose using a simple heuristic: If you see \[[ anywhere on the +>> page, don't cache it. It would be an effective cache for pure-rst wikis +>> (without any ikiwiki directives or wikilinks). +>> However, I think that if the cache does not work for a big load, it should +>> not work at all; small loads are small so they don't matter. --ulrik + +----- + +Another possiblity is using empty url for wikilinks (gitit uses this approach), for example: + + `SomePage <>`_ + +Since it uses *empty* url, I would like to call it *proposal 0* :-) --[weakish] + +[weakish]: http://weakish.pigro.net diff --git a/doc/todo/Restrict_page_viewing.mdwn b/doc/todo/Restrict_page_viewing.mdwn new file mode 100644 index 000000000..9c1889d63 --- /dev/null +++ b/doc/todo/Restrict_page_viewing.mdwn @@ -0,0 +1,39 @@ +I'd like to have some pages of my wiki to be only viewable by some users. + +I could use htaccess for that, but it would force the users to have +2 authentication mecanisms, so I'd prefer to use openID for that too. + +* I'm thinking of adding a "show" parameter to the cgi script, thanks + to a plugin similar to goto. +* When called, it would check the credential using the session stuff + (that I don't understand yet). +* If not enough, it would serve a 403 error of course. +* If enough, it would read the file locally on the server side and + return this as a content. + +Then, I'd have to generate the private page the regular way with ikiwiki, +and prevent apache from serving them with an appropriate and +much more maintainable htaccess file. + +-- [[users/emptty]] + +> While I'm sure a plugin could do this, it adds so much scalability cost +> and is so counter to ikiwiki's design.. Have you considered using the +> [[plugins/httpauth]] plugin to unify around htaccess auth? --[[Joey]] + +>> I'm not speaking of rendering the pages on demand, but to serve them on demand. +>> They would still be compiled the regular way; +>> I'll have another look at [[plugins/httpauth]] but I really like the openID whole idea. +>> --[[emptty]] + +>>> How about +>>> [mod_auth_openid](http://trac.butterfat.net/public/mod_auth_openid), then? +>>> A plugin for ikiwiki to serve its own pages is far afield from ikiwiki's roots, +>>> as Joey pointed out, but might be a neat option to have anyway -- for unifying +>>> authentication across views and edits, for systems not otherwise running +>>> web servers, for systems with web servers you don't have access to, and +>>> doubtless for other purposes. Such a plugin would add quite a bit of flexibility, +>>> and in that sense (IMO, of course) it'd be in the spirit of ikiwiki. --[[schmonz]] + +>>>> Yes, I think this could probably be used in combination with ikiwiki's +>>>> httpauth and openid plugins. --[[Joey]] diff --git a/doc/todo/Separate_OpenIDs_and_usernames.mdwn b/doc/todo/Separate_OpenIDs_and_usernames.mdwn index 2cd52e8c4..a4940220a 100644 --- a/doc/todo/Separate_OpenIDs_and_usernames.mdwn +++ b/doc/todo/Separate_OpenIDs_and_usernames.mdwn @@ -6,6 +6,48 @@ I see this being implemented in one of two possible ways. 
The easiest seems like A slightly more complex next step would be to request sreg from the provider and, if provided, automatically set the identity's username and email address from the provided persona. If username login to accounts with blank passwords is disabled, then you have the best of both worlds. Passwordless signin, human-friendly attribution, automatic setting of preferences. +> Given that openids are a global user identifier, that can look as pretty +> as the user cares to make it look via delegation, I am not a fan of +> having a site-local identifier that layered on top of that. Perhaps +> partly because every site that I have seen that does that has openid +> implemented as a badly-done wart on the side of their regular login +> system. +> +> The openid plugin now attempts to get an email and a username, and stores +> them in the session database for later use (ie, when the user edits a +> page). +> +> I am considering displaying the userid or fullname, if available, +> instead of the munged openid url in recentchanges and comments. +> It would be nice for those nasty [[google_openids|forum/google_openid_broken?]]. +> But, I first have to find a way to encode the name in the VCS commit log, +> while still keeping the openid of the committer in there too. +> Perhaps something like this (for git): --[[Joey]] +> +> Author: Joey Hess <http://joey.kitenet.net/@web> +> +> Only problem with the above is that the openid will still be displayed +> by CIA. Other option is this, which solves that, but at the expense of +> having to munge the username to fit inside the email address, +> and generally seems backwards: --[[Joey]] +> +> Author: http://joey.kitenet.net/ <Joey_Hess@web> +> +> So, what needs to be done: +> +> * Change `rcs_commit` and `rcs_commit_staged` to take a session object, +> instead of just a userid. (For back-compat, if the parameter is +> not an object, it's a userid.) Bump ikiwiki plugin interface version. +> (done) +> * Modify all RCS plugins to include the session username somewhere +> in the commit, and parse it back out in `rcs_recentchanges`. +> (done for git only so far) +> * Modify recentchanges plugin to display the username instead of the +> `openiduser`. +> (done) +> * Modify comment plugin to put the session username in the comment +> template instead of the `openiduser`. (done) + Unfortunately I don't speak Perl, so hopefully someone thinks these suggestions are good enough to code up. I've hacked on openid code in Ruby before, so hopefully these changes aren't all that difficult to implement. Even if you don't get any data via sreg, you're no worse off than where you are now, so I don't think there'd need to be much in the way of error/sanity-checking of returned data. If it's null or not available then no big deal, typing in a username is no sweat. -[[!tag wishlist]] +[[!tag wishlist done]] diff --git a/doc/todo/Set_templates_for_whole_sections_of_the_site.mdwn b/doc/todo/Set_templates_for_whole_sections_of_the_site.mdwn new file mode 100644 index 000000000..b130f4ec5 --- /dev/null +++ b/doc/todo/Set_templates_for_whole_sections_of_the_site.mdwn @@ -0,0 +1,33 @@ +At the moment, IkiWiki allows you to set the template for individual pages using the [[plugins/pagetemplate]] directive/plugin, but not for whole sections of the wiki. + +I've written a new plugin, sectiontemplate, available in the `page_tmpl` branch of my git repository that allows setting the template by pagespec in the config file. 
+ +-- [[Will]] + +> This is an excellent idea and looks ready for merging. +> Before I do though, would it perhaps make sense to merge +> this in with the pagetemplate plugin? They do such a similar thing, +> and unless letting users control the template by editing a page is not +> wanted, I can't see any reason not to combine them. --[[Joey]] + +> One other idea is moving the pagespec from setup file to a directive. +> The edittemplate plugin does that, and it might be nice +> to be consistent with that. To implement it, you'd have to +> make the scan hook store the template pagespecs in `%pagestate`, +> so a bit more complicated. --[[Joey]] + +>> I started with the pagetemplate plugin, which is why they're so similar. I guess they could be combined. +>> I had a brief think about having the specs and templates defined in a directive rather than the config file, but it got tricky. +>> How do you know what needs to be rebuilt when? There is probably a solution, maybe even an obvious one, but I thought that getting something done was more important than polishing it. +>> In the worst case, admins can always use the web interface to change the setup :). + +>> I wanted this to put comments on my new blog, and was more interested in that goal than this subgoal. I've moved most of my web pages to IkiWiki and there is only one small part that is the blog. +>> I wanted to use [[Disqus comments|tips/Adding_Disqus_to_your_wiki/]], but only on the blog pages. (I'm using Disqus rather than IkiWiki comments because I don't want to have to deal with spam, security, etc. I'll happily just let someone else host the comments.) -- [[Will]] + +>>> Yes, handing the rebuild is a good reason not to use directives for +>>> this. +>>> +>>> I do still think combining this with pagetemplate would be good. +>>> --[[Joey]] + +>>>> This is exactly what I was looking for and it took me a while to find it. I very much support the idea to provide this as a regular plugin, be it merged with pagetemplate or stand-alone. Thank you for your work and code! --BenTo diff --git a/doc/todo/Suggested_location_should_be_subpage_if_siblings_exist.mdwn b/doc/todo/Suggested_location_should_be_subpage_if_siblings_exist.mdwn index 0fb14bafe..4489dd5d2 100644 --- a/doc/todo/Suggested_location_should_be_subpage_if_siblings_exist.mdwn +++ b/doc/todo/Suggested_location_should_be_subpage_if_siblings_exist.mdwn @@ -23,4 +23,4 @@ that we're at the root of a (sub-)hierarchy. >> This sounds like WONTFIX to me? --[[smcv]] -[[!tag wishlist]] +[[!tag wishlist done]] diff --git a/doc/todo/Support_XML-RPC-based_blogging.mdwn b/doc/todo/Support_XML-RPC-based_blogging.mdwn index f9685be73..6a0593b17 100644 --- a/doc/todo/Support_XML-RPC-based_blogging.mdwn +++ b/doc/todo/Support_XML-RPC-based_blogging.mdwn @@ -9,6 +9,9 @@ blog names would work. --[[JoshTriplett]] >> I'd love to see support for this and would be happy to contribute towards a bounty (say US$100) :-). [PmWiki](http://www.pmwiki.org/) has a plugin which [implements this](http://www.pmwiki.org/wiki/Cookbook/XMLRPC) in a way which seems fairly sensible as an end user. --[[AdamShand]] +>>> Bump. This would be a nice feature, and with the talent on this project I'm sure it could be done safely, too. 
+ + [[!tag soc]] [[!tag wishlist]] diff --git a/doc/todo/a_navbar_based_on_page_properties.mdwn b/doc/todo/a_navbar_based_on_page_properties.mdwn index 09be409ac..091e0a39b 100644 --- a/doc/todo/a_navbar_based_on_page_properties.mdwn +++ b/doc/todo/a_navbar_based_on_page_properties.mdwn @@ -38,4 +38,11 @@ well. There is a problem though if this navbar were included in a sidebar (the logical place): if a page is updated, the navbar needs to be rebuilt which causes the sidebar to be rebuilt, which causes the whole site to be rebuilt. Unless we can subscribe only to title changes, this will be pretty bad... --[[madduck]] + +> I've just written a plugin for a automatically created menu for use +> [here](http://www.ff-egersdorf-wachendorf.de/). The source is at +> [gitorious](http://gitorious.org/ikiwiki-plugins/automenu). The problem with +> rebuilding remains unsolved but doesn't matter that much for me as I always +> generate the web site myself, ie it's not really a wiki. --[[lnussel]] + [[!tag wishlist]] diff --git a/doc/todo/abbreviation.mdwn b/doc/todo/abbreviation.mdwn index d24166710..f2880091c 100644 --- a/doc/todo/abbreviation.mdwn +++ b/doc/todo/abbreviation.mdwn @@ -2,4 +2,6 @@ We might want some kind of abbreviation and acronym plugin. --[[JoshTriplett]] * Not sure if this is what you mean, but I'd love a way to make works which match existing page names automatically like (eg. if there is a page called "MySQL" then any time the word MySQL is mentioned it should become a link to that page). -- [[AdamShand]] + * The python-markdown-extras package has support for [abbreviations](http://www.freewisdom.org/projects/python-markdown/Abbreviations), with the syntax that you just use the abbreviation in text (e.g. HTML) and then define the abbreviations at the end (like "footnote-style" links). For consistency, it might be good to use the same syntax, which apparently derives from [PHP-markdown-extra](http://michelf.com/projects/php-markdown/extra/#abbr). + [[wishlist]] diff --git a/doc/todo/adjust_commit_message_for_rename__44___remove.mdwn b/doc/todo/adjust_commit_message_for_rename__44___remove.mdwn new file mode 100644 index 000000000..3d0d1aff4 --- /dev/null +++ b/doc/todo/adjust_commit_message_for_rename__44___remove.mdwn @@ -0,0 +1,5 @@ +When you rename or remove pages using the relevant plugins, a commit message is generated automatically by the plugin. + +It would be nice to provide a text field in the remove/rename form, pre-populated with the automatic message, so that a user may customize or append to the message (modulo VCS support) + +-- [[Jon]] diff --git a/doc/todo/allow_displaying_number_of_comments.mdwn b/doc/todo/allow_displaying_number_of_comments.mdwn new file mode 100644 index 000000000..02d55fc9b --- /dev/null +++ b/doc/todo/allow_displaying_number_of_comments.mdwn @@ -0,0 +1,30 @@ +My `numcomments` Git branch adds a `NUMCOMMENTS` `TMPL_VAR`, which is +useful to add to the `forumpage.tmpl` template to emulate (the nice +bits of) a more usual webforum. + +Please review... and pull :) + +-- [[intrigeri]] + +> How is having this variable for showing a count of the comments +> better (or more forum-ish) than the COMMENTSLINK variable which +> includes a count and a link to the comments, and is already displayed +> in inlinepage.tmpl? +> +> `num_comments` will never return undef. +> +> I see no need to add a second pagetemplate hook. +> The existing one can be added to. Probably inside its `if ($shown)` +> block. 
+> +> It may also be a good idea to either combine the calls to `num_comments` +> used for this and for the commentslink, +> or to memoize it. I'm thinking generally memoizing it may be a good idea +> since the comments for a page will typically be counted twice when it's +> inlined. +> --[[Joey]] + +[[patch]] + +>> Well, the COMMENTSLINK variable fits my needs. Sorry for +>> the disturbance. [[done]] --[[intrigeri]] diff --git a/doc/todo/allow_plugins_to_add_sorting_methods.mdwn b/doc/todo/allow_plugins_to_add_sorting_methods.mdwn new file mode 100644 index 000000000..b523cd19f --- /dev/null +++ b/doc/todo/allow_plugins_to_add_sorting_methods.mdwn @@ -0,0 +1,304 @@ +[[!tag patch]] + +The available [[ikiwiki/pagespec/sorting]] methods are currently hard-coded in +IkiWiki.pm, making it difficult to add any extra sorting mechanisms. I've +prepared a branch which adds 'sort' as a hook type and uses it to implement a +new `meta_title` sort type. + +Someone could use this hook to make `\[[!inline sort=title]]` prefer the meta +title over the page name, but for compatibility, I'm not going to (I do wonder +whether it would be worth making sort=name an alias for the current sort=title, +and changing the meaning of sort=title in 4.0, though). + +> What compatability concerns, exactly, are there that prevent making that +> change now? --[[Joey]] + +*[sort-hooks branch now withdrawn in favour of sort-package --s]* + +I briefly tried to turn *all* the current sort types into hook functions, and +have some of them pre-registered, but decided that probably wasn't a good idea. +That earlier version of the branch is also available for comparison: + +*[also withdrawn in favour of sort-package --s]* + +>> I wonder if IkiWiki would benefit from the concept of a "sortspec", like a [[ikiwiki/PageSpec]] but dedicated to sorting lists of pages rather than defining lists of pages? Rather than defining a sort-hook, define a SortSpec class, and enable people to add their own sort methods as functions defined inside that class, similarly to the way they can add their own pagespec definitions. --[[KathrynAndersen]] + +>>> [[!template id=gitbranch branch=smcv/ready/sort-package author="[[Simon_McVittie|smcv]]"]] +>>> I'd be inclined to think that's overkill, but it wasn't very hard to +>>> implement, and in a way is more elegant. I set it up so sort mechanisms +>>> share the `IkiWiki::PageSpec` package, but with a `cmp_` prefix. Gitweb: +>>> <http://git.pseudorandom.co.uk/smcv/ikiwiki.git?a=shortlog;h=refs/heads/sort-package> + +>>>> I agree it seems more elegant, so I have focused on it. +>>>> +>>>> I don't know about reusing `IkiWiki::PageSpec` for this. +>>>> --[[Joey]] + +>>>>> Fair enough, `IkiWiki::SortSpec::cmp_foo` would be just +>>>>> as easy, or `IkiWiki::Sorting::cmp_foo` if you don't like +>>>>> introducing "sort spec" in the API. I took a cue from +>>>>> [[ikiwiki/pagespec/sorting]] being a subpage of +>>>>> [[ikiwiki/pagespec]], and decided that yes, sorting is +>>>>> a bit like a pagespec :-) Which name would you prefer? --s + +>>>>>> `SortSpec` --[[Joey]] + +>>>>>>> [[Done]]. --s + +>>>> I would be inclined to drop the `check_` stuff. --[[Joey]] + +>>>>> It basically exists to support `title_natural`, to avoid +>>>>> firing up the whole import mechanism on every `cmp` +>>>>> (although I suppose that could just be a call to a +>>>>> memoized helper function). 
It also lets sort specs that +>>>>> *must* have a parameter, like +>>>>> [[field|plugins/contrib/field/discussion]], fail early +>>>>> (again, not so valuable). +>>>>> +>>>>>> AFAIK, `use foo` has very low overhead when the module is already +>>>>>> loaded. There could be some evalation overhead in `eval q{use foo}`, +>>>>>> if so it would be worth addressing across the whole codebase. +>>>>>> --[[Joey]] +>>>>>> +>>>>>>> check_cmp_foo now dropped. --s +>>>>> +>>>>> The former function could be achieved at a small +>>>>> compatibility cost by putting `title_natural` in a new +>>>>> `sortnatural` plugin (that fails to load if you don't +>>>>> have `title_natural`), if you'd prefer - that's what would +>>>>> have happened if `title_natural` was written after this +>>>>> code had been merged, I suspect. Would you prefer this? --s + +>>>>>> Yes! (Assuming it does not make sense to support +>>>>>> natural order sort of other keys than the title, at least..) +>>>>>> --[[Joey]] + +>>>>>>> Done. I added some NEWS.Debian for it, too. --s + +>>>> Wouldn't it make sense to have `meta(title)` instead +>>>> of `meta_title`? --[[Joey]] + +>>>>> Yes, you're right. I added parameters to support `field`, +>>>>> and didn't think about making `meta` use them too. +>>>>> However, `title` does need a special case to make it +>>>>> default to the basename instead of the empty string. +>>>>> +>>>>> Another special case for `title` is to use `titlesort` +>>>>> first (the name `titlesort` is derived from Ogg/FLAC +>>>>> tags, which can have `titlesort` and `artistsort`). +>>>>> I could easily extend that to other metas, though; +>>>>> in fact, for e.g. book lists it would be nice for +>>>>> `field(bookauthor)` to behave similarly, so you can +>>>>> display "Douglas Adams" but sort by "Adams, Douglas". +>>>>> +>>>>> `meta_title` is also meant to be a prototype of how +>>>>> `sort=title` could behave in 4.0 or something - sorting +>>>>> by page name (which usually sorts in approximately the +>>>>> same place as the meta-title, but occasionally not), while +>>>>> displaying meta-titles, does look quite odd. --s + +>>>>>> Agreed. --[[Joey]] + +>>>>>>> I've implemented meta(title). meta(author) also has the +>>>>>>> `sortas` special case; meta(updated) and meta(date) +>>>>>>> should also work how you'd expect them to (but they're +>>>>>>> earliest-first, unlike age). --s + +>>>> As I read the regexp in `cmpspec_translate`, the "command" +>>>> is required to have params. They should be optional, +>>>> to match the documentation and because most sort methods +>>>> do not need parameters. --[[Joey]] + +>>>>> No, `$2` is either `\w+\([^\)]*\)` or `[^\s]+` (with the +>>>>> latter causing an error later if it doesn't also match `\w+`). +>>>>> This branch doesn't add any parameterized sort methods, +>>>>> in fact, although I did provide one on +>>>>> [[field's_discussion_page|plugins/contrib/report/discussion]]. --s + +>>>> I wonder if it would make sense to add some combining keywords, so +>>>> a sortspec reads like `sort="age then ascending title"` +>>>> In a way, this reduces the amount of syntax that needs to be learned. +>>>> I like the "then" (and it could allow other operations than +>>>> simple combination, if any others make sense). Not so sure about the +>>>> "ascending", which could be "reverse" instead, but "descending age" and +>>>> "ascending age" both seem useful to be able to explicitly specify. +>>>> --[[Joey]] + +>>>>> Perhaps. 
I do like the simplicity of [[KathrynAndersen]]'s syntax +>>>>> from [[plugins/contrib/report]] (which I copied verbatim, except for +>>>>> turning sort-by-`field` into a parameterized spec). +>>>>> +>>>>> If we're getting into English-like (or at least SQL-like) queries, +>>>>> it might make sense to change the signature of the hook function +>>>>> so it's a function to return a key, e.g. +>>>>> `sub key_age { return -%pagemtime{$_[0]) }`. Then we could sort like +>>>>> this: +>>>>> +>>>>> field(artistsort) or field(artist) or constant(Various Artists) then meta(titlesort) or meta(title) or title +>>>>> +>>>>> with "or" binding more closely than "then". Does this seem valuable? +>>>>> I think the implementation would be somewhat more difficult. and +>>>>> it's probably getting too complicated to be worthwhile, though? +>>>>> (The keys that actually benefit from this could just +>>>>> have smarter cmp functions, I think.) +>>>>> +>>>>> If the hooks return keys rather than cmp results, then we could even +>>>>> have "lowercase" as an adjective used like "ascending"... maybe. +>>>>> However, there are two types of adjective here: "lowercase" +>>>>> really applies to the keys, whereas "ascending" applies to the "cmp" +>>>>> result. Again, I think this is getting too complex, and could just +>>>>> be solved with smarter cmp functions. +>>>>> +>>>>>> I agree. (Also, I think returning keys may make it harder to write +>>>>>> smarter cmp functions.) --[[Joey]] +>>>>> +>>>>> Unfortunately, `sort="ascending mtime"` actually sorts by *descending* +>>>>> timestamp (but`sort=age` is fine, because `age` could be defined as +>>>>> now minus `ctime`). `sort=freshness` isn't right either, because +>>>>> "sort by freshness" seems as though it ought to mean freshest first, +>>>>> but "sort by ascending freshness" means put the least fresh first. If +>>>>> we have ascending and descending keywords which are optional, I don't +>>>>> think we really want different sort types to have different default +>>>>> directions - it seems clearer to have `ascending` always be a no-op, +>>>>> and `descending` always negate. +>>>>> +>>>>>> I think you've convinced me that ascending/descending impose too +>>>>>> much semantics on it, so "-" is better. --[[Joey]] + +>>>>>>> I've kept the semantics from `report` as-is, then: +>>>>>>> e.g. `sort="age -title"`. --s + +>>>>> Perhaps we could borrow from `meta updated` and use `update_age`? +>>>>> `updateage` would perhaps be a more normal IkiWiki style - but that +>>>>> makes me think that updateage is a quantity analagous to tonnage or +>>>>> voltage, with more or less recently updated pages being said to have +>>>>> more or less updateage. I don't know whether that's good or bad :-) +>>>>> +>>>>> I'm sure there's a much better word, but I can't see it. Do you have +>>>>> a better idea? --s + +[Regarding the `meta title=foo sort=bar` special case] + +> I feel it sould be clearer to call that "sortas", since "sort=" is used +> to specify a sort method in other directives. --[[Joey]] +>> Done. --[[smcv]] + +## speed + +I notice the implementation does not use the magic `$a` and `$b` globals. +That nasty perl optimisation is still worthwhile: + + perl -e 'use warnings; use strict; use Benchmark; sub a { $a <=> $b } sub b ($$) { $_[0] <=> $_[1] }; my @list=reverse(1..9999); timethese(10000, {a => sub {my @f=sort a @list}, b => sub {my @f=sort b @list}, c => => sub {my @f=sort { b($a,$b) } @list}})' + Benchmark: timing 10000 iterations of a, b, c... 
+ a: 80 wallclock secs (76.74 usr + 0.05 sys = 76.79 CPU) @ 130.23/s (n=10000) + b: 112 wallclock secs (106.14 usr + 0.20 sys = 106.34 CPU) @ 94.04/s (n=10000) + c: 330 wallclock secs (320.25 usr + 0.17 sys = 320.42 CPU) @ 31.21/s (n=10000) + +Unfortunatly, I think that c is closest to the new implementation. +--[[Joey]] + +> Unfortunately, `$a` isn't always `$main::a` - it's `$Package::a` where +> `Package` is the call site of the sort call. This was a showstopper when +> `sort` was a hook implemented in many packages, but now that it's a +> `SortSpec`, I may be able to fix this by putting a `sort` wrapper in the +> `SortSpec` namespace, so it's like this: +> +> sub sort ($@) +> { +> my $cmp = shift; +> return sort $cmp @_; +> } +> +> which would mean that the comparison used `$IkiWiki::SortSpec::a`. +> --s + +>> I've now done this. On a wiki with many [[plugins/contrib/album]]s +>> (a full rebuild takes half an hour!), I tested a refresh after +>> `touch tags/*.mdwn` (my tag pages contain inlines of the form +>> `tagged(foo)` sorted by date, so they exercise sorting). +>> I also tried removing sorting from `pagespec_match_list` +>> altogether, as an upper bound for how fast we can possibly make it. +>> +>> * `master` at branch point: 63.72user 0.29system +>> * `master` at branch point: 63.91user 0.37system +>> * my branch, with `@_`: 65.28user 0.29system +>> * my branch, with `@_`: 65.21user 0.28system +>> * my branch, with `$a`: 64.09user 0.28system +>> * my branch, with `$a`: 63.83user 0.36system +>> * not sorted at all: 58.99user 0.29system +>> * not sorted at all: 58.92user 0.29system +>> +>> --s + +> I do notice that `pagespec_match_list` performs the sort before the +> filter by pagespec. Is this a deliberate design choice, or +> coincidence? I can see that when `limit` is used, this could be +> used to only run the pagespec match function until `limit` pages +> have been selected, but the cost is that every page in the wiki +> is sorted. Or, it might be useful to do the filtering first, then +> sort the sub-list thus produced, then finally apply the limit? --s + +>> Yes, it was deliberate, pagespec matching can be expensive enough that +>> needing to sort a lot of pages seems likely to be less work. (I don't +>> remember what benchmarking was done though.) --[[Joey]] + +>>> We discussed this on IRC and Joey pointed out that this also affects +>>> dependency calculation, so I'm not going to get into this now... --s + +Joey pointed out on IRC that the `titlesort` feature duplicates all the +meta titles. I did that in order to sort by the unescaped version, but +I've now changed the branch to only store that if it makes a difference. +--s + +## Documentation from sort-package branch + +### advanced sort orders (conditionally added to [[ikiwiki/pagespec/sorting]]) + +* `title_natural` - Orders by title, but numbers in the title are treated + as such, ("1 2 9 10 20" instead of "1 10 2 20 9") +* `meta(title)` - Order according to the `\[[!meta title="foo" sortas="bar"]]` + or `\[[!meta title="foo"]]` [[ikiwiki/directive]], or the page name if no + full title was set. `meta(author)`, `meta(date)`, `meta(updated)`, etc. + also work. + +### Multiple sort orders (added to [[ikiwiki/pagespec/sorting]]) + +In addition, you can combine several sort orders and/or reverse the order of +sorting, with a string like `age -title` (which would sort by age, then by +title in reverse order if two pages have the same age). 
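For instance (an illustrative use, not taken from the branch documentation), a blog archive could inline its newest posts first, breaking ties between posts of the same age by reverse title order:

    \[[!inline pages="posts/* and !*/Discussion" sort="age -title"]]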
+ +### meta sortas parameter (added to [[ikiwiki/directive/meta]]) + +[in title] + +An optional `sort` parameter will be used preferentially when +[[ikiwiki/pagespec/sorting]] by `meta(title)`: + + \[[!meta title="The Beatles" sort="Beatles, The"]] + + \[[!meta title="David Bowie" sort="Bowie, David"]] + +[in author] + + An optional `sortas` parameter will be used preferentially when + [[ikiwiki/pagespec/sorting]] by `meta(author)`: + + \[[!meta author="Joey Hess" sortas="Hess, Joey"]] + +### Sorting plugins (added to [[plugins/write]]) + +Similarly, it's possible to write plugins that add new functions as +[[ikiwiki/pagespec/sorting]] methods. To achieve this, add a function to +the IkiWiki::SortSpec package named `cmp_foo`, which will be used when sorting +by `foo` or `foo(...)` is requested. + +The names of pages to be compared are in the global variables `$a` and `$b` +in the IkiWiki::SortSpec package. The function should return the same thing +as Perl's `cmp` and `<=>` operators: negative if `$a` is less than `$b`, +positive if `$a` is greater, or zero if they are considered equal. It may +also raise an error using `error`, for instance if it needs a parameter but +one isn't provided. + +The function will also be passed one or more parameters. The first is +`undef` if invoked as `foo`, or the parameter `"bar"` if invoked as `foo(bar)`; +it may also be passed additional, named parameters. diff --git a/doc/todo/allow_site-wide_meta_definitions.mdwn b/doc/todo/allow_site-wide_meta_definitions.mdwn index 70ccc2b68..82670250e 100644 --- a/doc/todo/allow_site-wide_meta_definitions.mdwn +++ b/doc/todo/allow_site-wide_meta_definitions.mdwn @@ -5,8 +5,159 @@ I'd like to define [[plugins/meta]] values to apply across all pages site-wide unless the pages define their own: default values for meta definitions essentially. -Here's a patch to achieve this (also in the "defaultmeta" branch of -my github ikiwiki fork): + <snip old patch, see below for latest> + +-- [[Jon]] + +> This doesn't support multiple-argument meta directives like +> `link=x rel=y`, or meta directives with special side-effects like +> `updated`. +> +> The first could be solved (if you care) by a syntax like this: +> +> meta_defaults => [ +> { copyright => "© me" }, +> { link => "about:blank", rel => "silly", }, +> ] +> +> The second could perhaps be solved by invoking `meta::preprocess` from within +> `scan` (which might be a simplification anyway), although this is complicated +> by the fact that some (but not all!) meta headers are idempotent. +> +> --[[smcv]] + +>> Thanks for your comment. I've revised the patch to use the config syntax +>> you suggest. I need to perform some more testing to make sure I've +>> addressed the issues you highlight. +>> +>> I had to patch part of IkiWiki core, the merge routine in Setup, because +>> the use of `possibly_foolish_untaint` was causing the hashrefs at the deep +>> end of the data structure to be converted into strings. The specific change +>> I've made may not be acceptable, though -- I'd appreciate someone providing +>> some feedback on that hunk! + +>>> Well, re that hunk, taint checking is currently disabled, but +>>> if the perl bug that disallows it is fixed and it is turned back on, +>>> the hash values will remain tainted, which will probably lead to +>>> problems. +>>> +>>> I'm also leery of using such a complex data structure in config. +>>> The websetup plugin would be hard pressed to provide a UI for such a +>>> data structure. 
(It lacks even UI for a single hash ref yet, let alone +>>> a list.) +>>> +>>> Also, it seems sorta wrong to have two so very different syntaxes to +>>> represent the same meta data. A user without a lot of experience will +>>> be hard pressed to map from a directive to this in the setup file. +>>> +>>> All of which leads me to think the setup file could just contain +>>> a text that could hold meta directives. Which generalizes really to +>>> a text that contains any directives, and is, perhaps appended to the +>>> top of every page. Which nearly generalizes to the sidebar plugin, +>>> or perhaps something more general than that... +>>> +>>> However, excessive generalization is the root of all evil, so +>>> I'm not necessarily saying that's a good idea. Indeed, my memory +>>> concerns below invalidate this idea pretty well. --[[Joey]] + + diff --git a/IkiWiki/Plugin/meta.pm b/IkiWiki/Plugin/meta.pm + index 6fe9cda..2f8c098 100644 + --- a/IkiWiki/Plugin/meta.pm + +++ b/IkiWiki/Plugin/meta.pm + @@ -13,6 +13,8 @@ sub import { + hook(type => "needsbuild", id => "meta", call => \&needsbuild); + hook(type => "preprocess", id => "meta", call => \&preprocess, scan => 1); + hook(type => "pagetemplate", id => "meta", call => \&pagetemplate); + + hook(type => "scan", id => "meta", call => \&scan) + + if $config{"meta_defaults"}; + } + + sub getsetup () { + @@ -305,6 +307,15 @@ sub match { + } + } + + +sub scan() { + + my %params = @_; + + my $page = $params{page}; + + foreach my $default (@{$config{"meta_defaults"}}) { + + preprocess(%$default, page => $page, + + destpage => $page, preview => 0); + + } + +} + + + package IkiWiki::PageSpec; + + sub match_title ($$;@) { + diff --git a/IkiWiki/Setup.pm b/IkiWiki/Setup.pm + index 8a25ecc..e4d50c9 100644 + --- a/IkiWiki/Setup.pm + +++ b/IkiWiki/Setup.pm + @@ -51,7 +51,13 @@ sub merge ($) { + $config{$c}=$setup{$c}; + } + else { + - $config{$c}=[map { IkiWiki::possibly_foolish_untaint($_) } @{$setup{$c}}] + + $config{$c}=[map { + + if(ref $_ eq 'HASH') { + + $_ + + } else { + + IkiWiki::possibly_foolish_untaint($_) + + } + + } @{$setup{$c}}]; + } + } + elsif (ref $setup{$c} eq 'HASH') { + diff --git a/doc/ikiwiki/directive/meta.mdwn b/doc/ikiwiki/directive/meta.mdwn + index 000f461..8d34ee4 100644 + --- a/doc/ikiwiki/directive/meta.mdwn + +++ b/doc/ikiwiki/directive/meta.mdwn + @@ -12,6 +12,16 @@ also specifies some additional sub-parameters. + The field values are treated as HTML entity-escaped text, so you can include + a quote in the text by writing `"` and so on. + + +You can also define site-wide defaults for meta values by including them + +in your setup file. The key used is `meta_defaults` and the value is a list + +of hashes, one per meta directive. e.g.: + + + + meta_defaults = [ + + { copyright => "Copyright 2007 by Joey Hess" }, + + { license => "GPL v2+" }, + + { link => "somepage", rel => "site entrypoint", }, + + ], + + + Supported fields: + + * title + +-- [[Jon]] + +>> Ok, I've had a bit of a think about this. There are currently 15 supported +>> meta fields. Of these: title, licence, copyright, author, authorurl, +>> and robots might make sense to define globally and override on a per-page +>> basis. +>> +>> Less so, description (due to its impact on map); openid (why would +>> someone want more than one URI to act as an openid endpoint to the same +>> place?); updated. I can almost see why someone might want to set a global +>> updated value. Almost. 
+>> +>> Not useful are permalink, date, stylesheet (you already have a global +>> stylesheet), link, redir, and guid. +>> +>> In other words, the limitations of my first patch that [[smcv]] outlined +>> are only relevant to defined fields that you wouldn't want to specify a +>> global default for anyway. +>> +>>> I generally agree with this. It is *possible* that meta would have a new +>>> field added, that takes parameters and make sense to use globally. +>>> --[[Joey]] +>> +>> Due to this, and the added complexity of the second patch (having to adjust +>> `IkiWiki/Setup.pm`), I think the first patch makes more sense. I've thus +>> reverted to it here. +>> +>> Is this merge-worthy? diff --git a/IkiWiki/Plugin/meta.pm b/IkiWiki/Plugin/meta.pm index b229592..3132257 100644 @@ -56,19 +207,40 @@ my github ikiwiki fork): -- [[Jon]] -> This doesn't support multiple-argument meta directives like -> `link=x rel=y`, or meta directives with special side-effects like -> `updated`. -> -> The first could be solved (if you care) by a syntax like this: -> -> meta_defaults => [ -> { copyright => "© me" }, -> { link => "about:blank", rel => "silly", }, -> ] -> -> The second could perhaps be solved by invoking `meta::preprocess` from within -> `scan` (which might be a simplification anyway), although this is complicated -> by the fact that some (but not all!) meta headers are idempotent. -> -> --[[smcv]] +>>> Merry Christmas/festive season/happy new year folks. I've been away from +>>> ikiwiki for the break, and now I've returned to watching recentchanges. +>>> Hopefully I'll be back in the mix soon, too. In the meantime, Joey, have +>>> you had a chance to look at this yet? -- [[Jon]] + +>>>> Ping :) Hi. [[Joey]], would you consider this patch for the next +>>>> ikiwiki release? -- [[Jon]] + +>>> For this to work with websetup and --dumpsetup, it needs to define the +>>> `meta_*` settings in the getsetup function. +>>>> +>>>> I think this will be problematic with the current implementation of this +>>>> patch. The datatype here is an array of hash references, with each hash +>>>> having a variable (and arbitrary) number of key/value pairs. I can't +>>>> think of an intuitive way of implementing a way of editing such a +>>>> datatype in the web interface, let alone registering the option in +>>>> getsetup. +>>>> +>>>> Perhaps a limited set of defined meta values could be exposed via +>>>> websetup (the obvious ones: author, copyright, license, etc.) -- [[Jon]] +>>> +>>> I also have some concerns about both these patches, since both throw +>>> a lot of redundant data at meta, which then stores it in a very redundant +>>> way. Specifically, meta populates a per-page `%metaheaders` hash +>>> as well as storing per-page metadata in `%pagestate`. So, if you have +>>> a wiki with 10 thousand pages, and you add a 1k site-wide license text, +>>> that will bloat the memory usage of ikiwiki by in excess of 2 +>>> megabytes. It will also cause ikiwiki to write a similar amount more data +>>> to its state file which has to be loaded back in each +>>> run. +>>> +>>> Seems that this could be managed much more efficiently by having +>>> meta special-case the site-wide settings, not store them in these +>>> per-page data structures, and just make them be used if no per-page +>>> metadata of the given type is present. --[[Joey]] +>>>> +>>>> that should be easy enough to do. I will work on a patch. 
-- [[Jon]] diff --git a/doc/todo/anon_push_of_comments.mdwn b/doc/todo/anon_push_of_comments.mdwn new file mode 100644 index 000000000..b472ea13f --- /dev/null +++ b/doc/todo/anon_push_of_comments.mdwn @@ -0,0 +1,14 @@ +It should be possible to use anonymous git push to post comments +(created, say, by a ikiwiki-comment program). Currently, that is not +allowed, because users cannot edit, or create internal page files. +But, comments in allowed locations are an exception to that rule, and +that exception should be communicated somehow to `IkiWiki::Receive`. +--[[Joey]] + +> Complications include: +> +> * Hard to see a way to prevent users from committing a comment that +> claims to be written by someone else. +> * `checkcontent` hooks need to be run, but can't accept a comment +> for later moderation, since it's coming in as part of a commit. +> Best they could do is reject the commit. diff --git a/doc/todo/auto-create_tag_pages_according_to_a_template.mdwn b/doc/todo/auto-create_tag_pages_according_to_a_template.mdwn index f1d33114f..7eb404910 100644 --- a/doc/todo/auto-create_tag_pages_according_to_a_template.mdwn +++ b/doc/todo/auto-create_tag_pages_according_to_a_template.mdwn @@ -4,7 +4,7 @@ Tags are mainly specific to the object to which they’re stuck. However, I ofte Also see: <http://madduck.net/blog/2008.01.06:new-blog/> and <http://users.itk.ppke.hu/~cstamas/code/ikiwiki/autocreatetagpage/> -[[!tag wishlist plugins/tag patch]] +[[!tag wishlist plugins/tag patch patch/core]] I would love to see this as well. -- dato @@ -15,88 +15,9 @@ A new setting is used to enable or disable auto-create tag pages, `tag_autocreat The new tag file is created during the preprocess phase. The new tag file is then complied during the change phase. -_tag.pm from version 3.01_ - - - --- tag.pm 2009-02-06 10:26:03.000000000 -0700 - +++ tag_new.pm 2009-02-06 12:17:19.000000000 -0700 - @@ -14,6 +14,7 @@ - hook(type => "preprocess", id => "tag", call => \&preprocess_tag, scan => 1); - hook(type => "preprocess", id => "taglink", call => \&preprocess_taglink, scan => 1); - hook(type => "pagetemplate", id => "tag", call => \&pagetemplate); - + hook(type => "change", id => "tag", call => \&change); - } - - sub getopt () { - @@ -36,6 +37,36 @@ - safe => 1, - rebuild => 1, - }, - + tag_autocreate => { - + type => "boolean", - + example => 0, - + description => "Auto-create the new tag pages, uses autotagpage.tmpl ", - + safe => 1, - + rebulid => 1, - + }, - +} - + - +my $autocreated_page = 0; - + - +sub gen_tag_page($) { - + my $tag=shift; - + - + my $tag_file=$tag.'.'.$config{default_pageext}; - + return if (-f $config{srcdir}.$tag_file); - + - + my $template=template("autotagpage.tmpl"); - + $template->param(tag => $tag); - + writefile($tag_file, $config{srcdir}, $template->output); - + $autocreated_page = 1; - + - + if ($config{rcs}) { - + IkiWiki::disable_commit_hook(); - + IkiWiki::rcs_add($tag_file); - + IkiWiki::rcs_commit_staged( - + gettext("Automatic tag page generation"), - + undef, undef); - + IkiWiki::enable_commit_hook(); - + } - } - - sub tagpage ($) { - @@ -47,6 +78,10 @@ - $tag=~y#/#/#s; # squash dups - } - - + if (defined $config{tag_autocreate} && $config{tag_autocreate} ) { - + gen_tag_page($tag); - + } - + - return $tag; - } - - @@ -125,4 +160,18 @@ - } - } - - +sub change(@) { - + return unless($autocreated_page); - + $autocreated_page = 0; - + - + # This refresh/saveindex is to complie the autocreated tag pages - + IkiWiki::refresh(); - + IkiWiki::saveindex(); - + - + # This 
refresh/saveindex is to fix the Tags link - + # With out this additional refresh/saveindex the tag link displays ?tag - + IkiWiki::refresh(); - + IkiWiki::saveindex(); - +} - + +*see git history of this page if you want the patch --[[smcv]]* - -This uses a [[template|wikitemplates]] called `autotagpage.tmpl`, here is my template file: +This uses a [[template|templates]] called `autotagpage.tmpl`, here is my template file: \[[!inline pages="link(<TMPL_VAR TAG>)" archive="yes"]] @@ -123,3 +44,208 @@ On the second extra pass, it doesn't notice that it has to update the "?"-link. } is not satisfied for the newly created tag page. I shall put debug msgs into Render.pm to find out better how it works. --Ivan Z. + +--- + +I've made another attempt at fixing this + +The current progress can be found at my [git repository][gitweb] on branch +`autotag`: + + git://git.liegesta.at/git/ikiwiki + +[gitweb]: http://git.liegesta.at/?p=ikiwiki.git;a=shortlog;h=refs/heads/autotag (gitweb for branch autotag) + +It's not entirely finished yet, but already quite usable. Testing and comments +on code quality, implementation details, as well as other patches would be +appreciated. + +Here's what it does right now: + +* enabled by setting `tag_autocreate=1` in the configuration. +* Tag pages will be created in `tagbase` from the template `autotag.tmpl`. +* Will correctly render all links, and dependencies. Well, AFAIK. +* When a tag page is deleted it will automatically recreated from template. (I +consider this a feature, not a bug) +* Requires a rebuild on first use. +* Adds a function `add_autofile()` to the plugin API, to do all this. + +Todo/Bugs: + +* Will still create a page even if there's a page other than `$tag` under +`tagbase` satisfying the tag link. (details? --[[Joey]]) +* Call from `IkiWiki.pm` to `Render.pm`, which adds a module dependency in the +wrong direction. (fixed --[[Joey]] ) +* Add files to RCS. +* Unit tests. +* Proper documentation. (fixed (mostly) --[[Joey]]) + +--[[David_Riebenbauer]] + +> Starting review of this. Some of your commits are to very delicate, +> optimised, and security-sensitive ground, so I have to look at them very +> carefully. --[[Joey]] + +>> First of, sorry that it took me so damn long to answer. I didn't lose +>> interest but it took a while for me to find the time and motivation +>> to address you suggestions. --[[David_Riebenbauer]] + +> * In the refactoring in [f3abeac919c4736429bd3362af6edf51ede8e7fe][], +> you introduced at least 2 bugs, one a possible security hole. +> Now one part of the code tests `if ($file)` and the other +> caller tests `if ($f)`. These two tests both tested `if (! defined $f)` +> before. Notice that the variable needs to be the untainted variable +> for both. Also notice that `if ($f)` fails if `$f` contains `0`, +> which is a very common perl gotcha. +> * Your refactored code changes `-l $_ || -d _` to `-l $file || -d $file`. +> The latter makes one more stat system call; note the use of a +> bare `_` in the first to make perl reuse the stat buffer. +> * (As a matter of style, could you put a space after the commas in your +> perl?) + +>> The first two points should be addressed in +>> [da5d29f95f6e693e8c14be1b896cf25cf4fdb3c0][]. And sure, I can add the +>> spaces. --[[David_Riebenbauer]] + +> I'd like to cherry-pick the above commit, once it's in shape, before +> looking at the rest in detail. So just a few other things that stood out. +> +> * Commit [4af4d26582f0c2b915d7102fb4a604b176385748][] seems unnecessary. 
+
+> `srcfile($file, 1)` already is documented to return undef if the
+> file does not exist. (But without the second parameter, it throws
+> an error.)
+
+>> You're right. I must have been somewhat confused by some other problem I
+>> introduced then. Reverted. --[[David_Riebenbauer]]
+
+> * Commit [f58f3e1bec41ccf9316f37b014ce0b373c8e49e1][] adds a line
+> that is indented by a space, not a tab.
+
+>> Sorry, that one was reverted anyway. --[[David_Riebenbauer]]
+
+> * Commit [f58f3e1bec41ccf9316f37b014ce0b373c8e49e1][] says that auto-added
+> files will be recreated if the user deletes them. That seems bad.
+> `autoindex` goes to some trouble to not recreate deleted files.
+
+>> I reverted the commit and addressed the issue in
+>> [a358d74bef51dae31332ff27e897fe04834571e6][] and
+>> [981400177d68a279f485727be3f013e68f0bf691][].
+>> --[[David_Riebenbauer]]
+
+>>> This doesn't seem to have all the refinements that autoindex has:
+>>>
+>>> * `autoindex` attaches the record of deletions to the `index` page, which
+>>> is (nearly) guaranteed to exist; this one attaches the record of
+>>> deletions to the deleted page's page state. Won't that tend to result
+>>> in losing the record along with the deleted page?
+
+>>>> This is probably one of the harder things to do, 'cause there are (most of the
+>>>> time) several pages that are responsible for the creation of a single tag page.
+>>>> Of course I could attach the info to all of them.
+
+>>>> With current behaviour I think the information in `%pagestate` is kept around
+>>>> regardless of whether the corresponding page exists or not.
+>>>> --[[David_Riebenbauer]]
+
+>>>>> Sorry, I'll try to be clearer: `autoindex` hard-codes that the [[index]] page
+>>>>> of the entire wiki is the one responsible for storing the page state. That
+>>>>> page isn't responsible for the creation of the tag page, it's just an
+>>>>> arbitrary page that's (more or less) guaranteed to exist. --[[smcv]]
+
+>>>>> I don't like that [[plugins/autoindex]] has to do that,
+>>>>> but `%pagestate` values are only stored for pages that exist,
+>>>>> so it was necessary. (Another way to look at this is that
+>>>>> `%pagestate` is not the ideal data structure.) --[[Joey]]
+
+>>>>>> Aha! Having looked at [[plugins/write]] again, it turns out that what this
+>>>>>> feature should really use is `%wikistate`, I think? :-) --[[smcv]]
+
+>>>>>>> Ah, indeed, that came after I wrote autoindex. I've fixed autoindex to
+>>>>>>> use it. --[[Joey]]
+
+>>>>> Ok, now I know what you mean. --[[David_Riebenbauer]]
+
+>>> * `autoindex` forgets that a page was deleted when that page is
+>>> re-created
+
+>>>> Yes, I forgot about that and that is a bug. I'll fix that.
+>>>> --[[David_Riebenbauer]]
+
+>>>>> In my branch, it keeps a list of autofiles that were created,
+>>>>> not deleted. And I think that turns out to be necessary, really.
+>>>>> However, I see no way to clean out that list on deletion and
+>>>>> manual recreation -- it still needs to remember it was once an autofile,
+>>>>> in order to avoid recreating it if it's deleted yet again. --[[Joey]]
+
+>>>>>> Are these really the semantics we want?
It seems strange to me +>>>>>> that this: +>>>>>> +>>>>>> * tag a page as foo +>>>>>> * tags/foo automatically appears +>>>>>> * delete tags/foo +>>>>>> * create tags/foo manually +>>>>>> * delete tags/foo again +>>>>>> * tags/foo isn't automatically created +>>>>>> +>>>>>> isn't the same as this: +>>>>>> +>>>>>> * create tags/foo +>>>>>> * delete tags/foo +>>>>>> * tag a page as foo +>>>>>> * tags/foo automatically appears +>>>>>> +>>>>>> or even this: +>>>>>> +>>>>>> * create tags/foo +>>>>>> * tag a page as foo +>>>>>> * delete tags/foo +>>>>>> * tags/foo automatically appears (?) +>>>>>> +>>>>>> --[[smcv]] + +>>>>>>> I agree that the last of these is not desired. It could be avoided +>>>>>>> by extending the list of autofiles to include those that were not +>>>>>>> created due to the file/page already existing. +>>>>>>> +>>>>>>> Hmm, that would fix the previous scenario too. --[[Joey]] + +>>> * `autoindex` forgets that a page was deleted when it's no longer needed +>>> anyway (this may be harder for `autotag`?) + +>>>> I don't think so. AFAIK ikiwiki can detect whether there are taglinks to a page +>>>> anyway, so it should be quite easy. I'll try to implement that too. +>>>> --[[David_Riebenbauer]] + +>>> It'd probably be an interesting test of the core change to port +>>> `autoindex` to use it? (Adding the file to the RCS would be +>>> necessary to get parity with `autoindex`.) --[[smcv]] + +>>>> Good suggestion. Adding the files to RCS is on my todo list anyway. +>>>> --[[David_Riebenbauer]] + +>>>>> I think it may be better to allow the `add_autofile` caller +>>>>> to specify if it is added to RCS. In my branch, it can do +>>>>> so by just making the callback it registers call `rcs_add`; +>>>>> and I have tag do this. Other plugins might want autofiles +>>>>> that do not get checked in, conceivably. +>>>>> --[[Joey]] + +> Regarding the call from `IkiWiki.pm` to `Render.pm`, wouldn't this be +> quite easy to solve by moving `verify_src_file` to IkiWiki.pm? --[[smcv]] + +>> True. I'll do that. --[[David_Riebenbauer]] +>> Fixed in my branch --[[Joey]] + +[[!template id=gitbranch branch=origin/autotag author="[[Joey]]"]] +I've pushed an autotag branch of my own, which refactors +things a bit and fixes bugs around deletion/recreation. +I've tested it fairly thouroughly. 
--[[Joey]] + +[f3abeac919c4736429bd3362af6edf51ede8e7fe]: http://git.liegesta.at/?p=ikiwiki.git;a=commitdiff;h=f3abeac919c4736429bd3362af6edf51ede8e7fe (commitdiff for f3abeac919c4736429bd3362af6edf51ede8e7fe) +[4af4d26582f0c2b915d7102fb4a604b176385748]: http://git.liegesta.at/?p=ikiwiki.git;a=commitdiff;h=4af4d26582f0c2b915d7102fb4a604b176385748 (commitdiff for 4af4d26582f0c2b915d7102fb4a604b176385748) +[f58f3e1bec41ccf9316f37b014ce0b373c8e49e1]: http://git.liegesta.at/?p=ikiwiki.git;a=commitdiff;h=f58f3e1bec41ccf9316f37b014ce0b373c8e49e1 (commitdiff for f58f3e1bec41ccf9316f37b014ce0b373c8e49e1) +[da5d29f95f6e693e8c14be1b896cf25cf4fdb3c0]: http://git.liegesta.at/?p=ikiwiki.git;a=commitdiff;h=da5d29f95f6e693e8c14be1b896cf25cf4fdb3c0 (commitdiff for da5d29f95f6e693e8c14be1b896cf25cf4fdb3c0) +[a358d74bef51dae31332ff27e897fe04834571e6]: http://git.liegesta.at/?p=ikiwiki.git;a=commitdiff;h=a358d74bef51dae31332ff27e897fe04834571e6 (commitdiff for a358d74bef51dae31332ff27e897fe04834571e6) +[981400177d68a279f485727be3f013e68f0bf691]: http://git.liegesta.at/?p=ikiwiki.git;a=commitdiff;h=981400177d68a279f485727be3f013e68f0bf691 (commitdiff for 981400177d68a279f485727be3f013e68f0bf691) + +[[!tag done]] diff --git a/doc/todo/auto_getctime_on_fresh_build.mdwn b/doc/todo/auto_getctime_on_fresh_build.mdwn new file mode 100644 index 000000000..760c56fa1 --- /dev/null +++ b/doc/todo/auto_getctime_on_fresh_build.mdwn @@ -0,0 +1,13 @@ +[[!tag wishlist]] + +It might be a good idea to enable --gettime when `.ikiwiki` does not +exist. This way a new checkout of a `srcdir` would automatically get +ctimes right. (Running --gettime whenever a rebuild is done would be too +slow.) --[[Joey]] + +Could this be too annoying in some cases, eg, checking out a large wiki +that needs to get set up right away? --[[Joey]] + +> Not for git with the new, optimised --getctime. For other VCS.. well, +> pity they're not as fast as git ;), but it is a one-time expense... +> [[done]] --[[Joey]] diff --git a/doc/todo/auto_publish_expire.mdwn b/doc/todo/auto_publish_expire.mdwn new file mode 100644 index 000000000..7a5a17517 --- /dev/null +++ b/doc/todo/auto_publish_expire.mdwn @@ -0,0 +1,33 @@ +It could be nice to mark some page such that: + +* the page is automatically published on some date (i.e. build, linked, syndicated, inlined/mapped, etc.) +* the page is automatically unpublished at some other date (i.e. removed) + +I know that ikiwiki is a wiki compiler so that something has to refresh the wiki periodically to enforce the rules (a cronjob for instance). It seems to me that the calendar plugin rely on something similar. + +The date for publishing and expiring could be set be using some new directives; an alternative could be to expand the [[plugin/meta]] plugin with [<span/>[!meta date="auto publish date"]] and [<span/>[!meta expires="auto expire date"]]. + +--[[JeanPrivat]] + +> This is a duplicate, and expansion, of +> [[todo/tagging_with_a_publication_date]]. +> There, I suggest using a branch to develop +> prepublication versions of a site, and merge from it +> when the thing is published. +> +> Another approach I've seen used is to keep such pages in a pending/ +> directory, and move them via cron job when their publication time comes. +> But that requires some familiarity with, and access to, cron. 
+> +> On [[todo/tagging_with_a_publication_date]], I also suggested using meta +> date to set a page's date into the future, +> and adding a pagespec that matches only pages with dates in the past, +> which would allow filtering out the unpublished ones. +> Sounds like you are thinking along these lines, but possibly using +> something other than the page's creation or modification date to do it. +> +> I do think the general problem with that approach is that you have to be +> careful to prevent the unpublished pages from leaking out in any +> inlines, maps, etc. --[[Joey]] + +[[!tag wishlist]] diff --git a/doc/todo/auto_rebuild_on_template_change.mdwn b/doc/todo/auto_rebuild_on_template_change.mdwn new file mode 100644 index 000000000..ea990b877 --- /dev/null +++ b/doc/todo/auto_rebuild_on_template_change.mdwn @@ -0,0 +1,78 @@ +If `page.tmpl` is changed, it would be nice if ikiwiki automatically +noticed, and rebuilt all pages. If `inlinepage.tmpl` is changed, a rebuild +of all pages using it in an inline would be stellar. + +This would allow setting: + + templatedir => "$srcdir/templates", + +.. and then the [[templates]] are managed like other wiki files; and +like other wiki files, a change to them automatically updates dependent +pages. + +Originally, it made good sense not to have the templatedir inside the wiki. +Those templates can be used to bypass the htmlscrubber, and you don't want +just anyone to edit them. But the same can be said of `style.css` and +`ikiwiki.js`, which *are* in the wiki. We rely on `allowed_attachments` +being set to secure those to prevent users uploading replacements. And we +assume that users who can directly (non-anon) commit *can* edit them, and +that's ok. + +So, perhaps the easiest way to solve this [[wishlist]] would be to +make templatedir *default* to "$srcdir/templates/, and make ikiwiki +register dependencies on `page.tmpl`, `inlinepage.tmpl`, etc, as they're +used. Although, having every page declare an explicit dep on `page.tmpl` +is perhaps a bit much; might be better to implement a special case for that +one. Also, having the templates be copied to `destdir` is not desirable. +In a sense, these template would be like internal pages, except not wiki +pages, but raw files. + +The risk is that a site might have `allowed_attachments` set to +`templates/*` or `*.tmpl` something like that. I think such a configuration +is the *only* risk, and it's unlikely enough that a NEWS warning should +suffice. + +(This would also help to clear up the tricky disctinction between +wikitemplates and in-wiki templates.) + +Note also that when using templates from "$srcdir/templates/", `no_includes` +needs to be set. Currently this is done by the two plugins that use +such templates, while includes are allowed in `templatedir`. + +Have started working on this. +[[!template id=gitbranch branch=origin/templatemove author="[[Joey]]"]] + +> But would this require that templates be parseable as wiki pages? Because that would be a nuisance. --[[KathrynAndersen]] + +>> It would be better for them not to be rendered separately at all. +>> --[[Joey]] + +>>> I don't follow you. --[[KathrynAndersen]] + +>>>> If they don't render to output files, they clearly don't +>>>> need to be treated as wiki pages. (They need to be treated +>>>> as raw files anyway, because you don't want random users editing them +>>>> in the online editor.) --[[Joey]] + +>>>>> Just to be clear, the raw files would not be copied across to the output +>>>>> directory? 
-- [[Jon]] + +>>>>>> Without modifying ikiwiki, they'd be copied to the output directory as +>>>>>> (e.g.) http://ikiwiki.info/templates/inlinepage.tmpl; to not copy them, +>>>>>> it'd either be necessary to make them be internal pages +>>>>>> (templates/inlinepage._tmpl) or special-case them in some other way. +>>>>>> --[[smcv]] + +>>>>>>> In my branch, I left in support for the templatedir, and also +>>>>>>> /usr/share/ikiwiki/templates. So, users do not have to put their +>>>>>>> custom templates in templates/ in the wiki. If they do, +>>>>>>> the templates are copied to the destdir like other non-wiki page files +>>>>>>> are. The templates are not wiki pages, except those used by a few +>>>>>>> things like the [[plugins/template]] plugin. +>>>>>>> +>>>>>>> That seems acceptable, since users probably don't need to modify +>>>>>>> many templates, so the clutter is small. (Especially when +>>>>>>> compared to the other clutter the basewiki always puts in destdir.) +>>>>>>> This could be revisted later. --[[Joey]] + +[[done]] diff --git a/doc/todo/avatar.mdwn b/doc/todo/avatar.mdwn index b8aa2327f..91f924fa1 100644 --- a/doc/todo/avatar.mdwn +++ b/doc/todo/avatar.mdwn @@ -1,38 +1,19 @@ [[!tag wishlist]] It would be nice if ikiwiki, particularly [[plugins/comments]] -supported user avatar icons. I was considering adding a directive for this, -as designed below. +(but also, ideally, recentchanges) supported user avatar icons. -However, there is no *good* service for mapping openids to avatars -- -openavatar has many issues, including not supporting delegated openids, and -after trying it, I don't trust it to push users toward. -Perhaps instead ikiwiki could get the email address from the openid -provider, though I think the perl openid modules don't support the openid -2.x feature that allows that. - -At the moment, working on this doesn't feel like a good use of my time. ---[[Joey]] - -Hmm.. unless is just always used a single provider (gravatar) and hashed -the openid. Then wavatars could be used to get a unique avatar per openid -at least. --[[Joey]] - ----- - -The directive displays a small avatar image for a user. Pass it the -email address, openid, or wiki username of the user. +Idea is to add a directive that displays a small avatar image for a user. +Pass it a user's the email address, openid, username, or the md5 hash +of their email address: \[[!avatar user@example.com]] \[[!avatar http://joey.kitenet.net/]] \[[!avatar user]] + \[[!avatar hash]] -The avatars are provided by various sites. For email addresses, it uses a -[gravatar](http://gravatar.com/). For openid, -[openavatar](http://www.openvatar.com/) is used. For a wiki username, the -user's email address is looked up and the gravatar for that user is -displayed. (Of course, the user has to have filled in their email address -on their Preferences page for that to work.) +These directives can then be hand-inserted onto pages, or more likely, +included in eg, a comment post via a template. An optional second parameter can be included, containing additional options to pass in the @@ -45,3 +26,40 @@ not have a gravatar, uses a cute auto-generated "wavatar" avatar. The `gravitar_options` setting in the setup file can be used to specify additional options to pass. So for example if you want to use wavatars everywhere, set it to "default=wavatar". + +The avatars are provided by various sites. For email addresses, it uses a +[gravatar](http://gravatar.com/). 
For a wiki username, the +user's email address is looked up and the gravatar for that user is +displayed. (Of course, the user has to have filled in their email address +on their Preferences page for that to work. Also, when the user changes +their email address in Preferences, the gravatar won't change until the +wiki is rebuilt.) + +For openid, openavatar sucked and is now dead. So we need to use an email +address instead, I guess. Problem is that the email address of a given +openid is only known when that user is logged in and making a change. +And we don't want to leak an openid user's email into a page either. +Hmm. Suppose the gravatar hash could be calculated from the email address +and embedded instead of the openid? That would work for comments, +but not if the directive were used elsewhere. + +Or, for openid, could use <http://paulisageek.com/openidavatar>. Which +works fine, but users are not likely to figure out what they need to do to +get an avatar associated with their openid. + +--- + +Alternative, not overdesigned approach: + +Modify comments plugin to have an option to display avatars. + +When posting a comment, fill in the avatarhash field in the template. +The hash is calculated from the user's email address. If the user's email +is not known, skip it. + +End. :P + +--- + +[libravatar](https://launchpad.net/libravatar) is a federated avatar +system. Young but might be the right way to get avatars eventually. diff --git a/doc/todo/avoid_attachement_ui_if_upload_not_allowed.mdwn b/doc/todo/avoid_attachement_ui_if_upload_not_allowed.mdwn new file mode 100644 index 000000000..487915850 --- /dev/null +++ b/doc/todo/avoid_attachement_ui_if_upload_not_allowed.mdwn @@ -0,0 +1,25 @@ +Any way to make it so an edit page doesn't offer the attachment capability +unless it matches a specific user, is an admin, and/or is an allowed page? +(For now, I have it on all pages, and then it prohibits after I submit +based on the allowed_attachments.) + +> To do that, ikiwiki would have to try to match the `allowed_attachments` +> pagespec against a sort of dummy upload to the current page. Then if it +> failed, assume all real uploads would fail. Now consider a pagespec like +> "user(joey) and mimetype(audio/mpeg)" -- it'd be hard to make a dummy +> upload to test this pagespec against. +> +> So, there would need to be some sort of test mode, where terms like +> `mimetype()` always succeed. But then consider a pagespec like +> "user(joey) and !mimetype(video/mpeg)" -- if mimetype succeeds, this +> fails. +> +> So, maybe we can instead just filter out all the pagespec terms aside +> from `user()`, `ip()`, and `admin()`. Transforming that into just +> "user(joey)", which would succeed in the test. +> +> That'd work, I guess. Pulling a pagespec apart, filtering out terms, and +> putting it back together is nontrivial, but doable. +> +> Other approach would be to have a separate pagespec that explicitly +> controlls what pages to show the attachment UI on. 
--[[Joey]] diff --git a/doc/todo/backlinks_result_is_lossy.mdwn b/doc/todo/backlinks_result_is_lossy.mdwn index 11b5fbcae..2a9fc4a0a 100644 --- a/doc/todo/backlinks_result_is_lossy.mdwn +++ b/doc/todo/backlinks_result_is_lossy.mdwn @@ -1,5 +1,4 @@ [[!tag patch patch/core]] -[[!template id=gitbranch branch=smcv/ready/among author="[[smcv]]"]] IkiWiki::backlinks returns a form of $backlinks{$page} that has undergone a lossy transformation (to get it in the form that page templates want), making diff --git a/doc/todo/beef_up_sidebar_to_allow_for_multiple_sidebars.mdwn b/doc/todo/beef_up_sidebar_to_allow_for_multiple_sidebars.mdwn index 8a4b852b7..fdaa09f26 100644 --- a/doc/todo/beef_up_sidebar_to_allow_for_multiple_sidebars.mdwn +++ b/doc/todo/beef_up_sidebar_to_allow_for_multiple_sidebars.mdwn @@ -10,4 +10,116 @@ those contents instead. --[[madduck]] + +> In mine I just copied sidebar out and made some extra "sidebars", but they go elsewhere. Ugly hack, but it works. --[[simonraven]] + +>> Here a simple [[patch]] for multiple sidebars. Not too fancy but better than having multiple copies of the sidebar plugin. --[[jeanprivat]] + +>>> I made a [[git]] branch for it [[!template id=gitbranch branch="privat/multiple_sidebars" author="[[jeanprivat]]"]] --[[jeanprivat]] + +>>>> Ping for [[Joey]]. Do you have any comment? I could improve it if there is things you do not like. I prefer to have such a feature integrated upstream. --[[JeanPrivat]] + +>>>>> The code is fine. +>>>>> +>>>>> I did think about having it examine +>>>>> the `page.tmpl` for parameters with names like `FOO_SIDEBAR` +>>>>> and automatically enable page `foo` as a sidebar in that case, +>>>>> instead of using the setup file to enable. But I'm not sure about +>>>>> that idea.. +>>>>> +>>>>> The full compliment of sidebars would be a header, a footer, +>>>>> a left, and a right sidebar. It would make sense to go ahead +>>>>> and add the parameters to `page.tmpl` so enabling each just works, +>>>>> and add whatever basic CSS makes sense. Although I don't know +>>>>> if I want to try to get a 3 column CSS going, so perhaps leave the +>>>>> left sidebar out of that. 
+ +------------------- + +<pre> +--- /usr/share/perl5/IkiWiki/Plugin/sidebar.pm 2010-02-11 22:53:17.000000000 -0500 ++++ plugins/IkiWiki/Plugin/sidebar.pm 2010-02-27 09:54:12.524412391 -0500 +@@ -19,12 +19,20 @@ + safe => 1, + rebuild => 1, + }, ++ active_sidebars => { ++ type => "string", ++ example => qw(sidebar banner footer), ++ description => "Which sidebars must be activated and processed.", ++ safe => 1, ++ rebuild => 1 ++ }, + } + +-sub sidebar_content ($) { ++sub sidebar_content ($$) { + my $page=shift; ++ my $sidebar=shift; + +- my $sidebar_page=bestlink($page, "sidebar") || return; ++ my $sidebar_page=bestlink($page, $sidebar) || return; + my $sidebar_file=$pagesources{$sidebar_page} || return; + my $sidebar_type=pagetype($sidebar_file); + +@@ -49,11 +57,17 @@ + + my $page=$params{page}; + my $template=$params{template}; +- +- if ($template->query(name => "sidebar")) { +- my $content=sidebar_content($page); +- if (defined $content && length $content) { +- $template->param(sidebar => $content); ++ ++ my @sidebars; ++ if (defined $config{active_sidebars} && length $config{active_sidebars}) { @sidebars = @{$config{active_sidebars}}; } ++ else { @sidebars = qw(sidebar); } ++ ++ foreach my $sidebar (@sidebars) { ++ if ($template->query(name => $sidebar)) { ++ my $content=sidebar_content($page, $sidebar); ++ if (defined $content && length $content) { ++ $template->param($sidebar => $content); ++ } + } + } + } +</pre> + +---------------------------------------- +## Further thoughts about this + +(since the indentation level was getting rather high.) + +What about using pagespecs in the config to map pages and sidebar pages together? Something like this: + +<pre> + sidebar_pagespec => { + "foo/*" => 'sidebars/foo_sidebar', + "bar/* and !bar/*/*' => 'bar/bar_top_sidebar', + "* and !foo/* and !bar/*" => 'sidebars/general_sidebar', + }, +</pre> + +One could do something similar for *pageheader*, *pagefooter* and *rightbar* if desired. + +Another thing which I find compelling - but probably because I am using [[plugins/contrib/field]] - is to be able to treat the included page as if it were *part* of the page it was included into, rather than as an included page. I mean things like \[[!if ...]] would test against the page name of the page it's included into rather than the name of the sidebar/header/footer page. It's even more powerful if one combines this with field/getfield/ftemplate/report, since one could make "generic" headers and footers that could apply to a whole set of pages. + +Header example: +<pre> +#{{$title}} +\[[!ftemplate id="nice_data_table"]] +</pre> + +Footer example: +<pre> +------------ +\[[!report template="footer_trail" trail="trailpage" here_only=1]] +</pre> + +(Yes, I am already doing something like this on my own site. It's like the PmWiki concept of GroupHeader/GroupFooter) + +-- [[KathrynAndersen]] + [[!tag wishlist]] diff --git a/doc/todo/blocking_ip_ranges.mdwn b/doc/todo/blocking_ip_ranges.mdwn index 95026eef1..ac2344ece 100644 --- a/doc/todo/blocking_ip_ranges.mdwn +++ b/doc/todo/blocking_ip_ranges.mdwn @@ -2,3 +2,6 @@ Admins need the ability to block IP ranges. They can already ban users. See [[fileupload]] for a propsal that grew to encompass the potential to do this. 
+ +[[done]] (well, there is no pagespec for IP ranges yet, but we can block +individual IPs) diff --git a/doc/todo/cache_backlinks.mdwn b/doc/todo/cache_backlinks.mdwn new file mode 100644 index 000000000..dc13d464e --- /dev/null +++ b/doc/todo/cache_backlinks.mdwn @@ -0,0 +1,25 @@ +I'm thinking about caching the backlinks between runs. --[[Joey]] + +* It would save some time (spent resolving every single link + on every page, every run). The cached backlinks could be + updated by only updating backlinks from changed pages. + (Saved time is less than 1/10th of a second for docwiki.) + +* It may allow attacking [[bugs/bestlink_change_update_issue]], + since that seems to need a copy of the old backlinks. + Actually, just the next change will probably solve that: + +* It should allow removing the `%oldlink_targets`, `%backlinkchanged`, + and `%linkchangers` calculation code. Instead, just generate + a record of which pages' backlinks have changed when updating + the backlinks, and then rebuild those pages. + +Proposal: + +* Store a page's backlinks in the index, same as everything else. + +* Do *something* to generate or store the `%brokenlinks` data. + This is currently generated when calculating backlinks, and + is only used by the brokenlinks plugin. It's not the right + "shape" to be stored in the index, but could be changed around + to fit. diff --git a/doc/todo/cas_authentication.mdwn b/doc/todo/cas_authentication.mdwn index 8bf7042df..ed8010518 100644 --- a/doc/todo/cas_authentication.mdwn +++ b/doc/todo/cas_authentication.mdwn @@ -21,6 +21,19 @@ follows) ? > license statement at the top. I have a few questions that I'll insert > inline with the patch below. --[[Joey]] +>> I have made some corrections to this patch (my cas plugin) in order to use +>> IkiWiki 3.00 interface and take your comments into account. It should work +>> fine now. +>> +>> You can pull it from my git repo at +>> http://git.boulgour.com/bbb/ikiwiki.git/ and maybe add it to your main +>> repo. +>> +>> I will add GNU GPL copyright license statement as soon as I get some free +>> time. +>> +>> --[[/users/bbb]] + ------------------------------------------------------------------------------ diff --git a/IkiWiki/Plugin/cas.pm b/IkiWiki/Plugin/cas.pm new file mode 100644 diff --git a/doc/todo/cdate_and_mdate_available_for_templates.mdwn b/doc/todo/cdate_and_mdate_available_for_templates.mdwn new file mode 100644 index 000000000..70d8fc8c9 --- /dev/null +++ b/doc/todo/cdate_and_mdate_available_for_templates.mdwn @@ -0,0 +1,15 @@ +[[!tag wishlist]] + +`CDATE_3339`, `CDATE_822`, `MDATE_3339` and `MDATE_822` template variables would be useful for evey page, at least for my templates with Dublin Core metadata. + +I tried to pick the relevant lines of the [[inline|plugins/inline]] plugin and hack it into a custom plugin, but it failed miserably because of my obvious lack of perl litteracy... + +Anyway, I'm sure this is almost nothing... + +* `sub date_822 ($) {}` +* `sub date_3339 ($) {}` +* and something like `$template->param('cdate_822' => date_822($IkiWiki::pagectime{$page}));` + +Anyone can fill the missing lines? + +-- [[nil]] diff --git a/doc/todo/comment_moderation_feed.mdwn b/doc/todo/comment_moderation_feed.mdwn new file mode 100644 index 000000000..267706b1b --- /dev/null +++ b/doc/todo/comment_moderation_feed.mdwn @@ -0,0 +1,9 @@ +There should be a way to generate a feed that is updated whenever a new +comment needs moderation. Otherwise, it can be hard to remember to check +sites, which may rarely get comments. 
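An aside on the `CDATE`/`MDATE` template-variable request a little further up: the missing pieces could look roughly like the sketch below. The plugin name `datetemplates`, the choice of the `pagetemplate` hook, and the strftime formats are illustrative assumptions (untested, and locale handling is ignored), not how any shipped plugin does it.

    #!/usr/bin/perl
    # "datetemplates" is a made-up name for this sketch; untested.
    package IkiWiki::Plugin::datetemplates;

    use warnings;
    use strict;
    use IkiWiki 3.00;
    use POSIX ();

    sub import {
        hook(type => "pagetemplate", id => "datetemplates", call => \&pagetemplate);
    }

    sub date_822 ($) {
        my $time=shift;
        return POSIX::strftime("%a, %d %b %Y %H:%M:%S %z", localtime($time));
    }

    sub date_3339 ($) {
        my $time=shift;
        return POSIX::strftime("%Y-%m-%dT%H:%M:%SZ", gmtime($time));
    }

    sub pagetemplate (@) {
        my %params=@_;
        my $page=$params{page};
        my $template=$params{template};

        # ctime drives the CDATE_* variables, mtime the MDATE_* ones
        my %times=(
            cdate => $IkiWiki::pagectime{$page},
            mdate => $IkiWiki::pagemtime{$page},
        );
        foreach my $prefix (keys %times) {
            next unless defined $times{$prefix};
            $template->param($prefix."_822" => date_822($times{$prefix}))
                if $template->query(name => $prefix."_822");
            $template->param($prefix."_3339" => date_3339($times{$prefix}))
                if $template->query(name => $prefix."_3339");
        }
    }

    1

With that enabled, a custom `page.tmpl` could then refer to `<TMPL_VAR CDATE_3339>` and friends wherever Dublin Core metadata is wanted.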
+ +The feed should not include the comment subject or body, but could mention +the author. It would be especially handy if it was generated statically. +One way would be to generate internal pages corresponding to each comment +that needs moderation; then the feed could be constructed via a usual +inline. diff --git a/doc/todo/conflict_free_comment_merges.mdwn b/doc/todo/conflict_free_comment_merges.mdwn new file mode 100644 index 000000000..e84400c17 --- /dev/null +++ b/doc/todo/conflict_free_comment_merges.mdwn @@ -0,0 +1,23 @@ +Currently, new comments are named with an incrementing ID (comment_N). So +if a wiki has multiple disconnected servers, and comments are made to the +same page on both, merging is guaranteed to result in conflicts. + +I propose avoiding such merge problems by naming a comment with a sha1sum +of its (full) content. Keep the incrementing ID too, so there is an +-ordering. And so duplicate comments are allowed..) +So, "comment_N_SHA1". + +Note: The comment body will need to use meta title in the case where no +title is specified, to retain the current behavior of the default title +being "comment N". + +What do you think [[smcv]]? --[[Joey]] + +> I had to use md5sums, as sha1sum perl module may not be available and I +> didn't want to drag it in. But I think that's ok; this doesn't need to be +> cryptographically secure and even the chances of being able to +> purposefully cause a md5 collision and thus an undesired merge conflict +> are quite low since it modifies the input text and adds a date stamp to +> it. +> +> Anyway, I think it's good, [[done]] --[[Joey]] diff --git a/doc/todo/dependency_types.mdwn b/doc/todo/dependency_types.mdwn new file mode 100644 index 000000000..4db633ead --- /dev/null +++ b/doc/todo/dependency_types.mdwn @@ -0,0 +1,579 @@ +Ikiwiki currently only has one type of dependency between pages +(plus wikilinks special cased in on the side). This has resulted in various +problems, and it's seemed for a long time to me that ikiwiki needs to get +smarter about what types of dependencies are supported. + +### unnecessary work + +The current single dependency type causes the depending page to be rebuilt +whenever a matching dependency is added, removed, or *modified*. But a +great many things don't care about the modification case, and often cause +unnecessary page rebuilds: + +* map only cares if the pages are added or removed. Content change does + not matter (unless show=title is used). +* brokenlinks, orphans, pagecount, ditto (generally) +* inline in archive mode cares about page title, author changing, but + not content. (Ditto for meta with show=title.) +* Causes extra work when solving the [[bugs/transitive_dependencies]] + problem. + +### two types of dependencies needed for [[tracking_bugs_with_dependencies]] + +>> it seems that there are two types of dependency, and ikiwiki +>> currently only handles one of them. The first type is "Rebuild this +>> page when any of these other pages changes" - ikiwiki handles this. +>> The second type is "rebuild this page when set of pages referred to by +>> this pagespec changes" - ikiwiki doesn't seem to handle this. I +>> suspect that named pagespecs would make that second type of dependency +>> more important. I'll try to come up with a good example. -- [[Will]] + +>>> Hrm, I was going to build an example of this with backlinks, but it +>>> looks like that is handled as a special case at the moment (line 458 of +>>> render.pm). I'll see if I can breapk +>>> things another way. 
Fixing this properly would allow removal of that special case. -- [[Will]] + +>>>> I can't quite understand the distinction you're trying to draw +>>>> between the two types of dependencies. Backlinks are a very special +>>>> case though and I'll be suprised if they fit well into pagespecs. +>>>> --[[Joey]] + +>>>>> The issue is that the existential pagespec matching allows you to build things that have similar +>>>>> problems to backlinks. +>>>>> e.g. the following inline: + + \[[!inline pages="define(~done, link(done)) and link(~done)" archive=yes]] + +>>>>> includes any page that links to a page that links to done. Now imagine I add a new link to 'done' on +>>>>> some random page somewhere - a page which some other page links to which didn't previously get included - the set of pages accepted by the pagespec, and hence the set of +>>>>> pages inlined, will change. But, there is no dependency anywhere on the page that I altered, so +>>>>> ikiwiki will not rebuild the page with the inline in it. What is happening is that the page that I altered affects +>>>>> the set of pages matched by the pagespec without itself being matched by the pagespec, and hence included in the dependency list. + +>>>>> To make this work well, I think you need to recognise two types of dependencies for each page (and no +>>>>> special cases for particular types of links, eg backlinks). The first type of dependency says, "The content of +>>>>> this page depends upon the content of these other pages". The `add_depends()` in the shortcuts +>>>>> plugin is of this form: any time the shortcuts page is edited, any page with a shortcut on it +>>>>> is rebuilt. The inline plugin also needs to add dependencies of this form to detect when the inlined +>>>>> content changes. By contrast, the map plugin does not need a dependency of this form, because it +>>>>> doesn't actually care about the content of any pages, just which pages it needs to include (which we'll handle next). + +>>>>> The second type of dependency says, "The content of this page depends upon the exact set of pages matched +>>>>> by this pagespec". The first type of dependency was about the content of some pages, the second type is about +>>>>> which pages get matched by a pagespec. This is the type of dependency tracking that the map plugin needs. +>>>>> If the set of pages matched by map pagespec changes, then the page with the map on it needs to be rebuilt to show a different list of pages. +>>>>> Inline needs this type of dependency as well as the previous type - This type handles a change in which pages +>>>>> are inlined, the previous type handles a change in the content of any of those pages. Shortcut does not need this type of +>>>>> dependency. Most of the places that use `add_depends()` seem to need this type of dependency rather than the first type. + +>>>>>> Note that inline and map currently achieve the second type of dependency by +>>>>>> explicitly calling `add_depends` for each page the displayed. +>>>>>> If any of those pages are removed, the regular pagespec would not +>>>>>> match them -- since they're gone. However, the explicit dependency +>>>>>> on them does cause them to match. It's an ugly corner I'd like to +>>>>>> get rid of. --[[Joey]] + +>>>>> Implementation Details: The first type of dependency can be handled very similarly to the current +>>>>> dependency system. You just need to keep a list of pages that the content depends upon. 
You could +>>>>> keep that list as a pagespec, but if you do this you might want to check that the pagespec doesn't change, +>>>>> possibly by adding a dependency of the second type along with the dependency of the first type. + +>>>>>> An example of the current system not tracking enough data is +>>>>>> described in [[bugs/transitive_dependencies]]. +>>>>>> --[[Joey]] + +>>>>> The second type of dependency is a little more tricky. For each page, we'd need a list of pagespecs that +>>>>> the page depended on, and for each pagespec you'd want to store the list of pages that currently match it. +>>>>> On refresh, you'd need to check each pagespec to see if the set of pages that match it has changed, and if +>>>>> that set has changed, then rebuild the dependent page(s). Oh, and for this second type of dependency, I +>>>>> don't think you can merge pagespecs. If I wanted to know if either "\*" or "link(done)" changes, then just checking +>>>>> to see if the set of pages matched by "\* or link(done)" changes doesn't work. + +>>>>> The current system works because even though you usually want dependencies of the second type, the set of pages +>>>>> referred to by a pagespec can only change if one of those pages itself changes. i.e. A dependency check of the +>>>>> first type will catch a dependency change of the second type with current pagespecs. +>>>>> This doesn't work with backlinks, and it doesn't work with existential matching. Backlinks are currently special-cased. I don't know +>>>>> how to special-case existential matching - I suspect you're better off just getting the dependency tracking right. + +>>>>> I also tried to come up with other possible solutions: e.g. can we find the dependencies for a pagespec? That +>>>>> would be the set of pages where a change on one of those pages could lead to a change in the set of pages matched by the pagespec. +>>>>> For old-style pagespecs without backlinks, the dependency set for a pagespec is the same as the set of pages the pagespec matches. +>>>>> Unfortunately, with existential matching, the set of pages that each +>>>>> pagespec depends upon can quickly become "*", which is not very useful. -- [[Will]] + +### proposal + +I propose the following. --[[Joey]] + +* Add a second type of dependency, call it an "presence dependency". +* `add_depends` defaults to adding a regular ("full") dependency, as + before. (So nothing breaks.) +* `add_depends($page, $spec, presence => 0)` adds an presence dependency. +* `refresh` only looks at added/removed pages when resolving presence + dependencies. + +This seems straightforwardly doable. I'd like [[Will]]'s feedback on it, if +possible. The type types of dependencies I am proposing are not identical +to the two types he talks about above, but I hope are close enough that +they can be used. + +This doesn't deal with the stuff that only depend on the metadata of a +page, as collected in the scan pass, changing. But it does leave a window +open for adding such a dependency type later. + +---- + +I implemented the above in a branch. +[[!template id=gitbranch branch=origin/dependency-types author="[[joey]]"]] + +Then I found some problems: + +* Something simple like pagecount, that seems like it could use a + presence dependency, can have a pagespec that uses metadata, like + `author()` or `copyright()`. +* pagestats, orphans and brokenlinks cannot use presence dependencies + because they need to update when links change. 
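To make the proposal concrete, a pagecount-style directive might ask for a presence-only dependency along these lines. This is only a sketch: the handler shape is assumed, and the `presence => 1` spelling follows the link-dependencies notes further down rather than the `presence => 0` form written above.

    # Sketch only: a directive that merely counts matching pages, so it
    # only needs to know when pages appear or disappear, not when their
    # content changes.
    sub preprocess (@) {
        my %params=@_;
        my $pages=defined $params{pages} ? $params{pages} : "*";

        # presence-only dependency, per the proposal above
        add_depends($params{page}, $pages, presence => 1);

        my @matching=grep {
            pagespec_match($_, $pages, location => $params{page})
        } keys %pagesources;

        return scalar(@matching);
    }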
+ +Now I'm thinking about having a special dependency look at page +metadata, and fire if the metadata changes. And it seems links should +either be included in that, or there should be a way to make a dependency +that fires when a page's links change. (And what about backlinks?) + +It's easy to see when a page's links change, since there is `%oldlinks`. +To see when metadata is changed is harder, since it's stored in the +pagestate by the meta plugin. Also, there are many different types of +metadata, that would need to be matched with the pagespecs somehow. + +Quick alternative: Make add_depends look at the pagespec. Ie, if it +is a simple page name, or a glob, we know a presence dependency +can be valid. If's more complex, convert the dependency from +presence to full. + +There is a lot to dislike about this method. Its parsing of the pagespec, +as currently implemented, does not let plugins add new types of pagespecs +that only care about presence. Its pagespec parsing is also subject to +false negatives (though these should be somewhat rare, and no false +positives). Still, it does work, and it makes things like simple maps and +pagecounts much more efficient. + +---- + +#### Will's first pass feedback. + +If the API is going to be updated, then it would be good to make it forward compatible. +I'd like for the API to be extendible to what is useful for complex pagespecs, even if we +that is a little redundant at the moment. + +My attempt to play with this is in my git repo. [[!template id=gitbranch branch=origin/depends-spec author="[[will]]"]] +That branch is a little out of date, but if you just look at the changes in IkiWiki.pm you'll see the concept I was looking at. +I added an "add_depends_spec()" function that adds a dependency on the pagespec passed to it. If the set of matched pages +changes, then the dependent page is rebuilt. At the moment the implementation uses the same hack used by map and inline - +just add all the pages that currently exist as traditional content dependencies. + +> As I note below, a problem with this approach is that it has to try +> matching the pagespec against every page, redundantly with the work done +> by the plugin. (But there are ways to avoid that redundant matching.) +> --[[Joey]] + +Getting back to commenting on your proposal: + +Just talking about the definition of a "presence dependency" for the moment, and ignoring implementation. Is a +"presence dependency" supposed to cause an update when a page disappears? I assume so. Is a presence dependency +supposed to cause an update when a pages existence hasn't changed, but it no longer matches the pagespec. +(e.g. you use `created_before(test_page)` in a pagespec, and there was a page, `new_page`, that was created +after `test_page`. `new_page` will not match the spec. Now we'll delete and then re-create `test_page`. Now +`new_page` will match the spec, and yet `new_page` itself hasn't changed. Nor has its 'presence' - it was present +before and it is present now. Should this cause a re-build of any page that has a 'presence' dependency on the spec? + +> Yes, a presence dep will trigger when a page is added, or removed. + +> Your example is valid.. but it's also not handled right by normal, +> (content) dependencies, for the same reasons. Still, I think I've +> addressed it with the pagespec influence stuff below. --[[Joey]] + +I think that is another version of the problem you encountered with meta-data. 
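(For reference, the stop-gap `add_depends_spec()` described above, which just depends in the traditional content-dependency way on every page the spec currently matches, amounts to roughly the following; the code in the actual branch will differ.)

    # Rough model of the stop-gap: make the page depend on every
    # existing page the spec currently matches, the same trick map and
    # inline use today.
    sub add_depends_spec ($$) {
        my ($page, $spec)=@_;

        foreach my $match (grep {
                pagespec_match($_, $spec, location => $page)
            } keys %pagesources) {
            add_depends($page, $match);
        }
    }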
+ +In the longer term I was thinking we'd have to introduce a concept of 'internal pagespec dependencies'. Note that I'm +defining 'internal' pagespec dependencies differently to the pagespec dependencies I defined above. Perhaps an example: +If you had a pagespec that was `created_before(test_page)`, then you could list all pages created before `test_page` +with a `map` directive. The map directive would add a pagespec dependency on `created_before(test_page)`. +Internally, there would be a second page-spec parsing function that discovers which pages a given pagespec +depends on. As well as the function `match_created_before()`, we'd have to add a new function `depend_created_before()`. +This new function would return a list of pages, which when any of them change, the output of `match_created_before()` +would change. In this example, it would just return `test_page`. + +These lists of dependent pages could just be concatenated for every `match_...()` function in a pagespec - you can ignore +the boolean formula aspects of the pagespec for this. If a content dependency were added on these pages, then I think +the correct rebuilds would occur. + +In all, this is a surprisingly difficult problem to solve perfectly. Consider the following case: + +PageA.mdwn: + +> [ShavesSelf] + +PageB.mdwn + +> Doesn't shave self. + +ShavedByBob.mdwn: + +> [!include pages="!link(ShavesSelf)"] + +Does ShavedByBob.mdwn include itself? + +(Yeah - in IkiWiki currently links are *not* included by include, but the idea holds. I had a good example a while back, but I can't think of it right now.) + +sigh. + +-- [[Will]] + +> I have also been thinking about some sort of analysis pass over pagespecs +> to determine what metadata, pages, etc they depend on. It is indeed +> tricky to do. More thoughts on influence lists a bit below. --[[Joey]] + +>> The big part of what makes this tricky is that there may be cycles in the +>> dependency graph. This can lead to situations where the result is just not +>> well defined. This is what I was trying to get at above. -- [[Will]] + +>>> Hmm, I'm not seeing cycles be a problem, at least with the current +>>> pagespec terms. --[[Joey]] + +>>>> Oh, they're not with current pagespec terms. But this is really close to extending to handle +>>>> functional pagespecs, etc. And I think I'd like to think about that now. +>>>> +>>>> Having said that, I don't want to hold you up - you seem to be making progress. The best is +>>>> the enemy of the good, etc. etc. +>>>> +>>>> For my part, I'm imagining we have two more constructs in IkiWiki: +>>>> +>>>> * A map directive that actually wikilinks to the pages it links to, and +>>>> * A `match_sharedLink(pageX)` matching function that matches pageY if both pageX and pageY each have links to any same third page, pageZ. +>>>> +>>>> With those two constructs, one page changing might change the set of pages included in a map somewhere, which might then change the set of pages matched by some other pagespec, which might then... +>>>> +>>>> --[[Will]] + +>>>>> I think that should be supported by [[bugs/transitive_dependencies]]. +>>>>> At least in the current implementation, which considers each page +>>>>> that is rendered to be changed, and rebuilds pages that are dependent +>>>>> on it, in a loop. An alternate implementation, which could be faster, +>>>>> is to construct a directed graph and traverse it just once. Sounds +>>>>> like that would probably not support what you want to do. 
+>>>>> --[[Joey]] + +>>>>>> Yes - that's what I'm talking about - I'll add some comments there. -- [[Will]] + +---- + +### Link dependencies + +* `add_depends($page, $spec, links => 1, presence => 1)` + adds a links + presence dependency. +* Use backlinks change code to detect changes to link dependencies too. +* So, brokenlinks can fire whenever any links in any of the + pages it's tracking change, or when pages are added or + removed. +* To determine if a pagespec is valid to be used with a links dependency, + use the same set that are valid for presence dependencies. But also + allow `backlinks()` to be used in it, since that matches pages + that the page links to, which is just what link dependencies are + triggered on. + +[[done]] +---- + +### the removal problem + +So far I have not addressed fixing the removal problem (which Will +discusses above). + +Summary of problem: A has a dependency on a pagespec such as +"bugs/* and !link(done)". B currently matches. Then B is updated, +in a way that makes A's dependency not match it (ie, it links to done). +Now A is not updated, because ikiwiki does not realize that it +depended on B before. + +This was worked around to fix [[bugs/inline_page_not_updated_on_removal]] +by inline and map adding explicit dependencies on each page that appears +on them. Then a change to B triggers the explicit dep. While this works, +it's 1) ugly 2) probably not implemented by all plugins that could +be affected by this problem (ie, linkmap) and 3) is most of the reason why +we grew the complication of `depends_simple`. + +One way to fix this is to include with each dependency, a list of pages +that currently match it. If the list changes, the dependency is triggered. + +Should be doable, but may involve more work than +currently. Consider that a dependency on `bugs/*` currently +is triggered by just checking until *one* page is found to match it. +But to store the list, *every* page would have to be tried against it. +Unless the list can somehow be intelligently updated, looking at only the +changed pages. + +---- + +Found a further complication in presence dependencies. Map now uses +presence dependencies when adding its explicit dependencies on pages. But +this defeats the purpose of the explicit dependencies! Because, now, +when B is changed to not match a pagespec, the A's presence dep does +not fire. + +I didn't think things through when switching it to use presence +dependencies there. But, if I change it to use full dependencies, then all +the work that was done to allow map to use presence dependencies for its +main pagespec is for naught. The map will once again have to update +whenever *any* content of the page changes. + +This points toward the conclusion that explicit dependencies, however they +are added, are not the right solution at all. Some other approach, such as +maintaining the list of pages that match a dependency, and noticing when it +changes, is needed. + +---- + +### pagespec influence lists + +I'm using this term for the concept of a list of pages whose modification +can indirectly influence what pages a pagespec matches. + +> Trying to make a formal definition of this: (Note, I'm using the term sets rather than lists, but they're roughly equivalent) +> +> * Let the *matching set* for a pagespec be the set of existing pages that the pagespec matches. +> * Let the *missing document matching set* be the set of pages that would match the spec if they didn't exist. These pages may or may not currently exist. 
Note that membership of this set depends upon how the `match_()` functions react to non-existant pages. +> * Let the *indirect influence set* for a pagespec be the set of all pages, *p*, whose alteration might: +> * cause the pagespec to include or exclude a page other than *p*, or +> * cause the pagespec to exclude *p*, unless the alteration is the removal of *p* and *p* is in the missing document matching set. +> +> Justification: The 'base dependency mechanism' is to compare changed pages against each pagespec. If the page matches, then rebuild the spec. For this comparison, creation and removal +> of pages are both considered changes. This base mechanism will catch: +> +> * The addition of any page to the matching set through its own modification/creation +> * The removal of any page *that would still match while non-existant* from the matching set through its own removal. (Note: The base mechanism cannot remove a page from the matching set because of that page's own modification (not deletion). If the page should be removed matching set, then it obviously cannot match the spec after the change.) +> * The modification (not deletion) of any page that still matches after the modification. +> +> The base mechanism may therefore not catch: +> +> * The addition or removal of any page from the matching set through the modification/addition/removal of any other page. +> * The removal of any page from the matching set through its own modification/removal if it does not still match after the change. +> +> The indirect influence set then should handle anything that the base mechanism will not catch. +> +> --[[Will]] + +>> I appreciate the formalism! +>> +>> Only existing pages need to be in these sets, because if a page is added +>> in the future, the existing dependency code will always test to see +>> if it matches. So it will be in the maching set (or not) at that point. +>> +>>> Hrm, I agree with you in general, but I think I can come up with nasty counter-examples. What about a pagespec +>>> of "!backlink(bogus)" where the page bogus doesn't exist? In this case, the page 'bogus' needs to be in the influence +>>> set even though it doesn't exist. +>>> +>>>> I think you're right, this is a case that the current code is not +>>>> handling. Actually, I made all the pagespecs return influences +>>>> even if the influence was not present or did not match. But, it +>>>> currently only records influences as dependencies when a pagespec +>>>> successfully matches. Now I'm sure that is wrong, and I've removed +>>>> that false optimisation. I've updated some of the below. --[[Joey]] +>>> +>>> Also, I would really like the formalism to include the whole dependency system, not just any additions to it. That will make +>>> the whole thing much easier to reason about. +>> +>> The problem with your definition of direct influence set seems to be +>> that it doesn't allow `link()` and `title()` to have as an indirect +>> influence, the page that matches. But I'm quite sure we need those. +>> --[[Joey]] + +>>> I see what you mean. Does the revised definition capture this effectively? +>>> The problem with this revised definition is that it still doesn't match your examples below. +>>> My revised definition will include pretty much all currently matching pages to be in the influence list +>>> because deletion of any of them would cause a change in which pages are matched - the removal problem. 
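+>>> (For concreteness, the "remember which pages currently match" idea
+>>> mentioned further up might look roughly like this -- an untested
+>>> sketch, where `%dep_matches`, `all_matching()` and `rebuild()` are
+>>> all made-up stand-ins, not existing ikiwiki API:)
+>>>
+>>>     # when the dependency is recorded
+>>>     $dep_matches{$page}{$spec} = { map { $_ => 1 } all_matching($spec) };
+>>>
+>>>     # at refresh time
+>>>     my %now = map { $_ => 1 } all_matching($spec);
+>>>     my $old = $dep_matches{$page}{$spec};
+>>>     rebuild($page) if join("\0", sort keys %now) ne join("\0", sort keys %$old);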
+>>> -- [[Will]] + +#### Examples + +* The pagespec "created_before(foo)" has an indirect influence list that contains foo. + The removal or (re)creation of foo changes what pages match it. Note that + this is true even if the pagespec currently fails to match. + +>>> This is an annoying example (hence well worth having :) ). I think the +>>> indirect influence list must contain 'foo' and all currently matching +>>> pages. `created_before(foo)` will not match +>>> a deleted page, and so the base mechanism would not cause a rebuild. The +>>> removal problem strikes. -- [[Will]] + +>>>> But `created_before` can in fact match a deleted page. Because the mtime +>>>> of a deleted page is temporarily set to 0 while the base mechanism runs to +>>>> find changes in deleted pages. (I verified this works by experiment, +>>>> also that `created_after` is triggered by a deleted page.) --[[Joey]] + +>>>>> Oh, okie. I looked at the source, saw the `if (exists $IkiWiki::pagectime{$testpage})` and assumed it would fail. +>>>>> Of course, having it succeed doesn't cure all the issues -- just moves them. With `created_before()` succeeding +>>>>> for deleted files, this pagespec will be match any removal in the entire wiki with the base mechanism. Whether this is +>>>>> better or worse than the longer indirect influence list is an empirical question. -- [[Will]] + +* The pagespec "foo" has an empty influence list. This is because a + modification/creation/removal of foo directly changes what the pagespec + matches. + +* The pagespec "*" has an empty influence list, for the same reason. + Avoiding including every page in the wiki into its influence list is + very important! + +>>> So, why don't the above influence lists contain the currently matched pages? +>>> Don't you need this to handle the removal problem? -- [[Will]] + +>>>> The removal problem is slightly confusingly named, since it does not +>>>> affect pages that were matched by a glob and have been removed. Such +>>>> pages can be handled without being influences, because ikiwiki knows +>>>> they have been removed, and so can still match them against the +>>>> pagespec, and see they used to match; and thus knows that the +>>>> dependency has triggered. +>>>> +>>>>> IkiWiki can only see that they used to match if they're in the glob matching set. -- [[Will]] +>>>> +>>>> Maybe the thing to do is consider this an optimisation, where such +>>>> pages are influences, but ikiwiki is able to implicitly find them, +>>>> so they do not need to be explicitly stored. --[[Joey]] + +* The pagespec "title(foo)" has an influence list that contains every page + that currently matches it. A change to any matching page can change its + title, making it not match any more, and so the list is needed due to the + removal problem. A page that does not have a matching title is not an + influence, because modifying the page to change its title directly + changes what the pagespec matches. + +* The pagespec "backlink(index)" has an influence list + that contains index (because a change to index changes the backlinks). + Note that this is true even if the backlink currently fails. + +>>> This is another interesting example. The missing document matching set contains all links on the page index, and so +>>> the influence list only needs to contain 'index' itself. -- [[Will]] + +* The pagespec "link(done)" has an influence list that + contains every page that it matches. 
A change to any matching page can + remove a link and make it not match any more, and so the list is needed + due to the removal problem. + +>> Why doesn't this include every page? If I change a page that doesn't have a link to +>> 'done' to include a link to 'done', then it will now match... or is that considered a +>> 'direct match'? -- [[Will]] + +>>> The regular dependency calculation code will check if every changed +>>> page matches every dependency. So it will notice the link was added. +>>> --[[Joey]] + +#### Low-level Calculation + +One way to calculate a pagespec's influence would be to +expand the SuccessReason and FailReason objects used and returned +by `pagespec_match`. Make the objects be created with an +influence list included, and when the objects are ANDed or ORed +together, combine the influence lists. + +That would have the benefit of allowing just using the existing `match_*` +functions, with minor changes to a few of them to gather influence info. + +But does it work? Let's try some examples: + +Consider "bugs/* and link(done) and backlink(index)". + +Its influence list contains index, and it contains all pages that the whole +pagespec matches. It should, ideally, not contain all pages that link +to done. There are a lot of such pages, and only a subset influence this +pagespec. + +When matching this pagespec against a page, the `link` will put the page +on the list. The `backlink` will put index on the list, and they will be +anded together and combined. If we combine the influences from each +successful match, we get the right result. + +Now consider "bugs/* and link(done) and !backlink(index)". + +It influence list is the same as the previous one, even though a term has +been negated. Because a change to index still influences it, though in a +different way. + +If negation of a SuccessReason preserves the influence list, the right +influence list will be calculated. + +Consider "bugs/* and (link(done) or backlink(index))" +and "bugs/* and (backlink(index) or link(done))' + +Its clear that the influence lists for these are identical. And they +contain index, plus all matching pages. + +When matching the first against page P, the `link` will put P on the list. +The OR needs to be a non-short-circuiting type. (In perl, `or`, not `||` -- +so, `pagespec_translate` will need to be changed to not use `||`.) +Given that, the `backlink` will always be evalulated, and will put index +onto the influence list. If we combine the influences from each +successful match, we get the right result. + +> This is implemented, seems to work ok. --[[Joey]] + +> `or` short-circuits too, but the implementation correctly uses `|`, +> which I assume is what you meant. --[[smcv]] + +>> Er, yeah. --[[Joey]] + +---- + +What about: "!link(done)" + +Specifically, I want to make sure it works now that I've changed +`match_link` to only return a page as an influence if it *does* +link to done. + +So, when matching against page P, that does not link to done, +there are no influences, and the pagespec matches. If P is later +changed to add a link to done, then the dependency resolver will directly +notice that. + +When matching against page P, that does link to done, P +is an influence, and the pagespec does not match. If P is later changed +to not link to done, the influence will do its job. + +Looks good! + +---- + +Here is a case where this approach has some false positives. 
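+(Before that example, here is a rough sketch of what influence-carrying
+reason objects could look like. This is purely illustrative -- the class
+below is a made-up stand-in for `SuccessReason`/`FailReason`, not the
+real implementation:)
+
+    package Reason;    # hypothetical stand-in
+
+    use overload
+        '&'  => \&combine_and,
+        '|'  => \&combine_or,
+        bool => sub { $_[0]->{match} };
+
+    sub new {
+        my ($class, $match, @influences) = @_;
+        return bless {
+            match => $match,
+            influences => { map { $_ => 1 } @influences },
+        }, $class;
+    }
+
+    sub combine_and {
+        my ($x, $y) = @_;
+        # both sides have already been evaluated; an AND keeps the
+        # influences of each side
+        my $r = Reason->new($x->{match} && $y->{match});
+        $r->{influences} = { %{$x->{influences}}, %{$y->{influences}} };
+        return $r;
+    }
+
+    sub combine_or {
+        # reached via the non-short-circuiting '|', so both sides were
+        # evaluated and both influence lists are available to merge
+        my ($x, $y) = @_;
+        my $r = Reason->new($x->{match} || $y->{match});
+        $r->{influences} = { %{$x->{influences}}, %{$y->{influences}} };
+        return $r;
+    }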
+ +"bugs/* and link(patch)" + +This finds as influences all pages that link to patch, even +if they are not under bugs/, and so can never match. + +To fix this, the influence calculation would need to consider boolean +operators. Currently, this turns into roughly: + +`FailReason() & SuccessReason(patch)` + +Let's say that the glob instead returns a HardFailReason, which when +ANDed with another object, blocks their influences. (But when ORed, +combines them.) + +Question: Are all pagespec terms that return reason objects w/o any +influence info, suitable to block influence in this way? + +To be suitable to block, a term should never change from failing to match a +page to successfully matching it, unless that page is directly changed in a +way that influences are not needed for ikiwiki to notice. But, if a term +did not meet these criteria, it would have an influence. QED. + +#### Influence types + +Note that influences can also have types, same as dependency types. +For example, "backlink(foo)" has an influence of foo, of type links. +"created_before(foo)" also is influenced by foo, but it's a presence +type. Etc. + +> This is an interesting concept that I hadn't considered. It might +> allow significant computational savings, but I suspect will be tricky +> to implement. -- [[Will]] + +>> It was actually really easy to implement it, assuming I picked the right +>> dependency types of course. --[[Joey]] diff --git a/doc/todo/description_meta_param_passed_to_templates.mdwn b/doc/todo/description_meta_param_passed_to_templates.mdwn new file mode 100644 index 000000000..712471258 --- /dev/null +++ b/doc/todo/description_meta_param_passed_to_templates.mdwn @@ -0,0 +1,10 @@ +[[!tag wishlist patch]] + +I'd like to use the description parameter from [[meta|/ikiwiki/directive/meta]] directives in custom [[inline|/ikiwiki/directive/inline]] templates. I guess this could be useful to others too. + +The only change required is on [line 266](http://github.com/joeyh/ikiwiki/blob/master/IkiWiki/Plugin/meta.pm#L266) of `meta.pm` + + - foreach my $field (qw{author authorurl permalink}) { + + foreach my $field (qw{author authorurl description permalink}) { + +> Good idea, [[done]]. --[[Joey]] diff --git a/doc/todo/double-click_protection_for_form_buttons.mdwn b/doc/todo/double-click_protection_for_form_buttons.mdwn new file mode 100644 index 000000000..501be4498 --- /dev/null +++ b/doc/todo/double-click_protection_for_form_buttons.mdwn @@ -0,0 +1,5 @@ +A small piece of JS to prevent double-submitting forms would be quite nice. I seem to have developed a habit of doing this and having to resolve a merge conflict for two initial commits. -- [[Jon]] + +> By the time you see that merge conflict, the first commit has +> already successfully happened, so you can just hit cancel +> and throw away the second submit. --[[Joey]] diff --git a/doc/todo/edit_form:_no_fixed_size_for_textarea.mdwn b/doc/todo/edit_form:_no_fixed_size_for_textarea.mdwn index 4c9c2352a..77e46049f 100644 --- a/doc/todo/edit_form:_no_fixed_size_for_textarea.mdwn +++ b/doc/todo/edit_form:_no_fixed_size_for_textarea.mdwn @@ -10,7 +10,7 @@ On longer pages its not very comfortable to edit pages with such a small box. Th > } > > Perhaps you have replaced it with a modified style sheet that does not -> include that? --[[Joey]] [[!tag done]] +> include that? --[[Joey]] >> The screen shot was made with http://ikiwiki.info/ where i didn't change anything. The width is optimally used. The problem is the height. 
@@ -32,3 +32,21 @@ On longer pages its not very comfortable to edit pages with such a small box. Th >>> --[[Joey]] >>>>>> the javascript approach would need to work something like this: you need to know about the "bottom-most" item on the edit page, and get a handle for that object in the DOM. You can then obtain the absolute position height-wise of this element and the absolute position of the bottom of the window to determine the pixel-difference. Then, you set the height of the textarea to (current height in px) + determined-value. This needs to be re-triggered on various resize events, at least for the window and probably for other elements too. I may have a stab at this at some point. -- [[Jon]] + +Google chrome has a completly elegant fix for this problem: All textareas +have a small resize handle in a corner, that can be dragged around. No +nasty javascript needed. IMHO, this is the right solution, and I hope other +browsers emulate it. [[done]] +--[[Joey]] + +Wouldn't it be possible to just implement an integer-valued setting for this, accessible via the "Setup" wiki page? This would require a wiki regen, but such a setting would not be changed frequently I suppose. Also, Mediawiki has this implemented as a per-user setting (two settings, actually, -- number of rows and columns of the edit area); such a per-user setting would be the best possible implementation, but I'm not sure if ikiwiki already supports per-user settings. Please consider implementing this as the current 20 rows is a great PITA for any non-trivial page. + +> I don't think it would need a wiki rebuild, as the textarea is generated dynamically by the CGI when you perform a CGI action, and (as far as I know) is not cooked into any static content. -- [[Jon]] + +>> There is no need for a configuration setting for this -- to change +>> the default height from 20 rows to something else, you can just put +>> something like this in your `local.css`: --[[Joey]] + + #editcontent { + height: 50em; + } diff --git a/doc/todo/edittemplate_should_look_in_templates_directory_by_default.mdwn b/doc/todo/edittemplate_should_look_in_templates_directory_by_default.mdwn new file mode 100644 index 000000000..4bc10e432 --- /dev/null +++ b/doc/todo/edittemplate_should_look_in_templates_directory_by_default.mdwn @@ -0,0 +1,8 @@ +[[plugins/edittemplate]] looks for the specified template relative to the +page the directive appears on. Which can be handy, eg, make a +blog/mytemplate and put the directive on blog, and it will find +"mytemplate". However, it can also be confusing, since other templates +always are looked for in `templates/`. + +I think it should probably fall back to looking for `templates/$foo`. +--[[Joey]] diff --git a/doc/todo/enable-htaccess-files.mdwn b/doc/todo/enable-htaccess-files.mdwn index e302a49ed..3b9721d50 100644 --- a/doc/todo/enable-htaccess-files.mdwn +++ b/doc/todo/enable-htaccess-files.mdwn @@ -12,6 +12,13 @@ qr/(^|\/).svn\//, qr/.arch-ids\//, qr/{arch}\//], wiki_link_regexp => qr/\[\[(?:([^\]\|]+)\|)?([^\s\]#]+)(?:#([^\s\]]+))?\]\]/, +> Note that the above patch is **completely broken**. +> It removes the crucial excludes of all files starting with a dot. +> The negative regexps for htaccess have no effect, so the whole +> thing only "works" because it allows *any* file starting with a dot. +> If you applied this patch to your ikiwiki, you opened a huge security +> hole. 
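+> (For reference, current ikiwiki can whitelist specific dotfiles safely
+> via the `include` setting in the setup file, which leaves the other
+> default excludes in place. For example:)
+>
+>     include => '^\.htaccess$',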
--[[Joey]] + [[!tag patch patch/core]] This lets the site administrator have a `.htaccess` file in their underlay @@ -57,5 +64,17 @@ It should be off by default of course. --Max --- +1 I want `.htaccess` so I can rewrite some old Wordpress URLs to make feeds work again. --[[hendry]] +> Unless you cannot modify apache's configuration, you do not need htaccess +> to do that. Apache's documentation recommends against using htaccess +> unless you're a user who cannot modify the main server configuration. +> --[[Joey]] + --- +1 for various purposes (but sometimes the filename isn't `.htaccess`, so please make it configurable) --[[schmonz]] + +> I've described a workaround for one use case at the [[plugins/rsync]] [[plugins/rsync/discussion]] page. --[[schmonz]] + +--- + +[[done]], you can use the `include` setting to override the default +excludes now. Please use extreme caution when doing so. --[[Joey]] diff --git a/doc/todo/enable_arbitrary_markup_for_directives.mdwn b/doc/todo/enable_arbitrary_markup_for_directives.mdwn new file mode 100644 index 000000000..c1f0f86ed --- /dev/null +++ b/doc/todo/enable_arbitrary_markup_for_directives.mdwn @@ -0,0 +1,47 @@ +One of the good things about [PmWiki](http://www.pmwiki.org) is the ability to treat arbitrary markup as directives. +In ikiwiki, all directives have the same format: + +\[[!name arguments]] + +But with PmWiki, directives can be added to the engine (with the "Markup" hook) with the usual name and function passing, but also with a regexp which has capturing parentheses, and the results of the match are passed to the given function. +Would it be possible to alter the "preprocess" hook to have an optional regex argument which acted in a similar fashion? + +For example, one could then write a plugin which would treat + +Category: Foo, Bar + +as a tag, by using a regex such as /^Category:\s*([\w\s,]+)$/; the result "Foo, Bar" could then be further processed by the hook function. + +This could also make it easier to support more styles of markup, rather than having to do all the processing in "htmlize" and/or "filter". + +-- [[KathrynAndersen]] + +[[!taglink wishlist]] + +> Arbitrary text transformations can already be done via the filter and +> sanitize hooks. That's how the smiley and typography plugins do their +> thing. +> +> AFAICS, the only benefit to having a regexp-based-hook interface is less +> overhead in passing page content into the hooks. But that overhead is a +> small amount of the total render time. +> +> Also, I notice that smiley does such complicated things in its sanitize +> hook (ie, it looks at html context around the smilies) that a simple +> matching regexp would not be sufficient. Furthermore, typography needs to +> pass the page content into the library it uses, which does not expose +> regexps to match on. So ikiwiki's more general filtering interface seems +> to allow both of these to do things that could not be done with the +> PmWiki interface. --[[Joey]] + +>>You have some good points. I was aware of using filter, but it didn't occur to me that one could use sanitize to do processing also, probably because "sanitize" brought to mind removing harmful content rather than doing other alterations. 
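+>> (Indeed, the `Category: Foo, Bar` example above could be handled
+>> entirely in a filter hook. A rough, untested sketch of such a plugin,
+>> with made-up names:)
+>>
+>>     package IkiWiki::Plugin::categoryline;
+>>
+>>     use warnings;
+>>     use strict;
+>>     use IkiWiki 3.00;
+>>
+>>     sub import {
+>>         hook(type => "filter", id => "categoryline", call => \&filter);
+>>     }
+>>
+>>     sub filter (@) {
+>>         my %params = @_;
+>>         my $content = $params{content};
+>>         # rewrite "Category: Foo, Bar" lines into a tag directive,
+>>         # which the preprocess stage then handles as usual
+>>         $content =~ s{^Category:\s*([\w\s,]+)$}
+>>                      {"\[[!tag ".join(" ", split(/\s*,\s*/, $1))."]]"}gem;
+>>         return $content;
+>>     }
+>>
+>>     1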
+>>It has also occurred to me, on further thought, that if one wants one's chosen markup to actually be processed during the "preprocess" stage, that one could do so by converting the chosen markup to directive-style markup during the "filter" stage and then processing the directive during the "preprocess" stage as per usual. Is there a tag for "no longer on the wishlist?". --[[KathrynAndersen]] + +>>> Yeah, sanitize is a misleading name for the relatively few things that +>>> use it this way. +>>> +>>> While you could do a filter to preprocess step, it is a bit +>>> of a long way round, since filter always runs just before +>>> preprocess. +>>> +>>> Anyway, guess this is [[done]] --[[Joey]] diff --git a/doc/todo/finer_control_over___60__object___47____62__s.mdwn b/doc/todo/finer_control_over___60__object___47____62__s.mdwn new file mode 100644 index 000000000..50c4d43bf --- /dev/null +++ b/doc/todo/finer_control_over___60__object___47____62__s.mdwn @@ -0,0 +1,98 @@ +IIUC, the current version of [HTML::Scrubber][] allows for the `object` tags to be either enabled or disabled entirely. However, while `object` can be used to add *code* (which is indeed a potential security hole) to a document, reading [Objects, Images, and Applets in HTML documents][objects-html] reveals that the “dangerous” are not all the `object`s, but rather those having the following attributes: + + classid %URI; #IMPLIED -- identifies an implementation -- + codebase %URI; #IMPLIED -- base URI for classid, data, archive-- + codetype %ContentType; #IMPLIED -- content type for code -- + archive CDATA #IMPLIED -- space-separated list of URIs -- + +It seems that the following attributes are, OTOH, safe: + + declare (declare) #IMPLIED -- declare but don't instantiate flag -- + data %URI; #IMPLIED -- reference to object's data -- + type %ContentType; #IMPLIED -- content type for data -- + standby %Text; #IMPLIED -- message to show while loading -- + height %Length; #IMPLIED -- override height -- + width %Length; #IMPLIED -- override width -- + usemap %URI; #IMPLIED -- use client-side image map -- + name CDATA #IMPLIED -- submit as part of form -- + tabindex NUMBER #IMPLIED -- position in tabbing order -- + +Should the former attributes be *scrubbed* while the latter left intact, the use of the `object` tag would seemingly become safe. + +Note also that allowing `object` (either restricted in such a way or not) automatically solves the [[/todo/svg]] issue. + +For Ikiwiki, it may be nice to be able to restrict [URI's][URI] (as required by the `data` and `usemap` attributes) to, say, relative and `data:` (as per [RFC 2397][]) ones as well, though it requires some more consideration. + +— [[Ivan_Shmakov]], 2010-03-12Z. + +[[wishlist]] + +> SVG can contain embedded javascript. + +>> Indeed. + +>> So, a more general tool (`XML::Scrubber`?) will be necessary to +>> refine both [XHTML][] and SVG. + +>> … And to leave [MathML][] as is (?.) + +>> — [[Ivan_Shmakov]], 2010-03-12Z. + +> The spec that you link to contains +> examples of objects that contain python scripts, Microsoft OLE +> objects, and Java. And then there's flash. I don't think ikiwiki can +> assume all the possibilities are handled securely, particularly WRT XSS +> attacks. +> --[[Joey]] + +>> I've scanned over all the `object` examples in the specification and +>> all of those that hold references to code (as opposed to data) have a +>> distinguishing `classid` attribute. 
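+>> (With [HTML::Scrubber][], the attribute-level filtering suggested
+>> above might be expressed roughly as follows -- untested, and the
+>> exact `rules` syntax should be checked against the module's
+>> documentation:)
+>>
+>>     my $scrubber = HTML::Scrubber->new(
+>>         allow => [ @whatever_else_is_allowed ],   # placeholder
+>>         rules => [
+>>             object => {
+>>                 # keep only the attributes believed to be safe
+>>                 data    => qr{^(?!\w+:)|^data:}i,  # local or data: URIs
+>>                 type    => 1,
+>>                 standby => 1,
+>>                 width   => 1,
+>>                 height  => 1,
+>>                 usemap  => 1,
+>>                 name    => 1,
+>>                 # classid, codebase, codetype and archive are not
+>>                 # listed, so they get dropped
+>>             },
+>>         ],
+>>     );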
+ +>> While I won't assert that it's impossible to reference code with +>> `data` (and, thanks to `text/xhtml+xml` and `image/svg+xml`, it is +>> *not* impossible), throwing away any of the “insecure” +>> attributes listed above together with limiting the possible URI's +>> (i. e., only *local* and certain `data:` ones for `data` and +>> `usemap`) should make `object` almost as harmless as, say, `img`. + +>>> But with local data, one could not embed youtube videos, which surely +>>> is the most obvious use case? + +>>>> Allowing a “remote” object to render on one's page is a + security issue by itself. + Though, of course, having an explicit whitelist of URI's may make + this issue more tolerable. + — [[Ivan_Shmakov]], 2010-03-12Z. + +>>> Note that youtube embedding uses an +>>> object element with no classid. The swf file is provided via an +>>> enclosed param element. --[[Joey]] + +>>>> I've just checked a random video on YouTube and I see that the + `.swf` file is provided via an enclosed `embed` element. Whether + to allow those or not is a different issue. + — [[Ivan_Shmakov]], 2010-03-12Z. + +>> (Though it certainly won't solve the [[SVG_problem|/todo/SVG]] being +>> restricted in such a way.) + +>> Of the remaining issues I could only think of recursive +>> `object` — the one that references its container document. + +>> — [[Ivan_Shmakov]], 2010-03-12Z. + +## See also + +* [Objects, Images, and Applets in HTML documents][objects-html] +* [[plugins/htmlscrubber|/plugins/htmlscrubber]] +* [[todo/svg|/todo/svg]] +* [RFC 2397: The “data” URL scheme. L. Masinter. August 1998.][RFC 2397] +* [Uniform Resource Identifier — the free encyclopedia][URI] + +[HTML::Scrubber]: http://search.cpan.org/~podmaster/HTML-Scrubber-0.08/Scrubber.pm +[MathML]: http://en.wikipedia.org/wiki/MathML +[objects-html]: http://www.w3.org/TR/1999/REC-html401-19991224/struct/objects.html +[RFC 2397]: http://tools.ietf.org/html/rfc2397 +[URI]: http://en.wikipedia.org/wiki/Uniform_Resource_Identifier +[XHTML]: http://en.wikipedia.org/wiki/XHTML diff --git a/doc/todo/generated_po_stuff_not_ignored_by_git.mdwn b/doc/todo/generated_po_stuff_not_ignored_by_git.mdwn index 1d24fd385..29c017c5d 100644 --- a/doc/todo/generated_po_stuff_not_ignored_by_git.mdwn +++ b/doc/todo/generated_po_stuff_not_ignored_by_git.mdwn @@ -1,4 +1,3 @@ -[[!template id=gitbranch branch=smcv/gitignore author="[[smcv]]"]] [[!tag patch]] The recent merge of the po branch didn't come with a .gitignore. diff --git a/doc/todo/generic_insert_links b/doc/todo/generic_insert_links new file mode 100644 index 000000000..050f32ee7 --- /dev/null +++ b/doc/todo/generic_insert_links @@ -0,0 +1,24 @@ +The attachment plugin's Insert Links button currently only knows +how to insert plain wikilinks and img directives (for images). + +[[wishlist]]: Generalize this, so a plugin can cause arbitrary text +to be inserted for a particular file. --[[Joey]] + +Design: + +Add an insertlinks hook. Each plugin using the hook would be called, +and passed the filename of the attachment. If it knows how to handle +the file type, it returns a the text that should be inserted on the page. +If not, it returns undef, and the next plugin is tried. + +This would mean writing plugins in order to handle links for +special kinds of attachments. 
To avoid that for simple stuff, +a fallback plugin could run last and look for a template +named like `templates/embed_$extension`, and insert a directive like: + + \[[!template id=embed_vp8 file=my_movie.vp8]] + +Then to handle a new file type, a user could just make a template +that expands to some relevant html. In the example above, +`templates/embed_vp8` could make a html5 video tag, possibly with some +flash fallback code even. diff --git a/doc/todo/git_attribution/discussion.mdwn b/doc/todo/git_attribution/discussion.mdwn index dfb490bc2..6905d9b4b 100644 --- a/doc/todo/git_attribution/discussion.mdwn +++ b/doc/todo/git_attribution/discussion.mdwn @@ -72,7 +72,7 @@ no determination of uniqueness) > GIT_AUTHOR_EMAIL can also be set. > > There is one thing yet to be solved, and that is how to tell the -> difference between a web commit by 'Joey Hess <joey@kitenet.net>', +> difference between a web commit by 'Joey Hess <joey\@kitenet.net>', > and a git commit by the same. I think we do want to differentiate these, > and the best way to do it seems to be to add a line to the end of the > commit message. Something like: "\n\nWeb-commit: true" @@ -94,5 +94,5 @@ no determination of uniqueness) > * github pushes to twitter ;-) > > So while I tried that way at first, I'm now leaning toward encoding the -> username in the email address. Like "user <user@web>", or -> "joey <http://joey.kitenet.net/@web>". +> username in the email address. Like "user <user\@web>", or +> "joey <http://joey.kitenet.net/\@web>". diff --git a/doc/todo/html.mdwn b/doc/todo/html.mdwn index 44f20c876..4f4542be2 100644 --- a/doc/todo/html.mdwn +++ b/doc/todo/html.mdwn @@ -1,6 +1,6 @@ Create some nice(r) stylesheets. Should be doable w/o touching a single line of code, just -editing the [[wikitemplates]] and/or editing [[style.css]]. +editing the [[templates]] and/or editing [[style.css]]. [[done]] ([[css_market]] ..) diff --git a/doc/todo/htpasswd_mirror_of_the_userdb.mdwn b/doc/todo/htpasswd_mirror_of_the_userdb.mdwn new file mode 100644 index 000000000..e4a411780 --- /dev/null +++ b/doc/todo/htpasswd_mirror_of_the_userdb.mdwn @@ -0,0 +1,29 @@ +[[!tag wishlist]] + +Ikiwiki is static, so access control for viewing the wiki must be +implemented on the web server side. Managing wiki users and access +together, we can currently + +* use [[httpauth|plugins/httpauth/]], but some [[passwordauth|plugins/passwordauth]] functionnality [[is missing|todo/httpauth_feature_parity_with_passwordauth/]]; +* use [[passwordauth|plugins/passwordauth]] plus [[an Apache `mod_perl` authentication mechanism|plugins/passwordauth/discussion/]], but this is Apache-centric and enabling `mod_perl` just for auth seems overkill. + +Moreover, when ikiwiki is just a part of a wider web project, we may want +to use the same userdb for the other parts of this project. + +I think an ikiwiki plugin which would (re)generate an htpasswd version of +the user/passwd base (better, two htpasswd files, one with only the wiki +admins and one with everyone) each time an user is added or modified would +solve this problem: + +* access control can be managed from the web server +* user management is handled by the passwordauth plugin +* htpasswd format is understood by various servers (Apache, lighttpd, nginx, ...) and languages commonly used for web development (perl, python, ruby) +* htpasswd files can be mirrored on other machines when the web site is distributed + +-- [[nil]] + +> I think this is a good idea. 
Although unless the password hashes that +> are stored in the userdb are compatible with htpasswd hashes, +> the htpasswd hashes will need to be stored in the userdb too. Then +> any userdb change can just regenerate the htpasswd file, dumping out +> the right kind of hashes. --[[Joey]] diff --git a/doc/todo/http_bl_support.mdwn b/doc/todo/http_bl_support.mdwn new file mode 100644 index 000000000..f7a46ee6c --- /dev/null +++ b/doc/todo/http_bl_support.mdwn @@ -0,0 +1,67 @@ +[Project Honeypot](http://projecthoneypot.org/) has an HTTP:BL API available to subscribed (it's free, accept donations) people/orgs. There's a basic perl package someone wrote, I'm including a copy here. + +[from here](http://projecthoneypot.org/board/read.php?f=10&i=112&t=112) + +> The [[plugins/blogspam]] service already checks urls against +> the surbl, and has its own IP blacklist. The best way to +> support the HTTP:BL may be to add a plugin +> [there](http://blogspam.repository.steve.org.uk/file/cc858e497cae/server/plugins/). +> --[[Joey]] + +<pre> +package Honeypot; + +use Socket qw/inet_ntoa/; + +my $dns = 'dnsbl.httpbl.org'; +my %types = ( +0 => 'Search Engine', +1 => 'Suspicious', +2 => 'Harvester', +4 => 'Comment Spammer' +); +sub query { +my $key = shift || die 'You need a key for this, you get one at http://www.projecthoneypot.org'; +my $ip = shift || do { +warn 'no IP for request in Honeypot::query().'; +return; +}; + +my @parts = reverse split /\./, $ip; +my $lookup_name = join'.', $key, @parts, $dns; + +my $answer = gethostbyname ($lookup_name); +return unless $answer; +$answer = inet_ntoa($answer); +my(undef, $days, $threat, $type) = split /\./, $answer; +my @types; +while(my($bit, $typename) = each %types) { +push @types, $typename if $bit & $type; +} +return { +days => $days, +threat => $threat, +type => join ',', @types +}; + +} +1; +</pre> + +From the page: + +> The usage is simple: + +> use Honeypot; +> my $key = 'XXXXXXX'; # your key +> my $ip = '....'; the IP you want to check +> my $q = Honeypot::query($key, $ip); + +> use Data::Dumper; +> print Dumper $q; + +Any chance of having this as a plugin? + +I could give it a go, too. Would be fun to try my hand at Perl. --[[simonraven]] + +[[!tag wishlist]] diff --git a/doc/todo/inline_plugin:_specifying_ordered_page_names.mdwn b/doc/todo/inline_plugin:_specifying_ordered_page_names.mdwn index bbde04f83..85bb4ff5a 100644 --- a/doc/todo/inline_plugin:_specifying_ordered_page_names.mdwn +++ b/doc/todo/inline_plugin:_specifying_ordered_page_names.mdwn @@ -1,5 +1,3 @@ -[[!template id=gitbranch branch=smcv/ready/inline-pagenames author="[[smcv]]"]] - A [[!taglink patch]] in my git repository (the inline-pagenames branch) adds the following parameter to the [[ikiwiki/directive/inline]] directive: diff --git a/doc/todo/l10n.mdwn b/doc/todo/l10n.mdwn index bba103494..a539f0424 100644 --- a/doc/todo/l10n.mdwn +++ b/doc/todo/l10n.mdwn @@ -1,3 +1,21 @@ +ikiwiki should be fully internationalized. + +---- + +As to the hardcoded strings in ikiwiki, I've internationalized the program, +and there is a po/ikiwiki.pot in the source that can be translated. +--[[Joey]] + +---- + +> The now merged po plugin handles l10n of wiki pages. The only missing +> piece now is l10n of the templates. +> --[[Joey]] + +---- + +## template i18n + From [[Recai]]: > Here is my initial work on ikiwiki l10n infrastructure (I'm sending it > before finalizing, there may be errors). @@ -48,54 +66,19 @@ Birisi[1], ki muhtemelen bu sizsiniz, <TMPL_VAR WIKINAME>[2] üzerindeki bulundu. 
Parola: <TMPL_VAR USER_PASSWORD> -- ikiwiki [1] Parolayı isteyen kullanıcının ait IP adresi: <TMPL_VAR REMOTE_ADDR>[2] <TMPL_VAR WIKIURL> -> Looks like, tmpl_process3 cannot preserve line breaks in template files. -> For example, it processed the following template: - This could be easily worked around in tmpl_process3, but I wouldn't like to maintain a separate utility. ---- -As to the hardcoded strings in ikiwiki, I've internationalized the program, -and there is a po/ikiwiki.pot in the source that can be translated. ---[[Joey]] - ----- - -Danish l10n of templates and basewiki is available with the following commands: - - git clone http://source.jones.dk/ikiwiki.git newsite - cd newsite - make - -Updates are retrieved with this single command: +Another way to approach this would be to write a small program that outputs +the current set of templates. Now i18n of that program is trivial, +and it can be run once per language to generate localized templates. - make +Then it's just a matter of installing the templates somewhere, and having +them be used when a different language is enabled. -l10n is maintained using po4a for basewiki, smiley and templates - please send me PO files if you -translate to other languagess than the few I know about myself: <dr@jones.dk> - -As upstream ikiwiki is now maintained in GIT too, keeping the master mirror in sync with upstream -could probably be automated even more - but an obstacle seems to be that content is not maintained -separately but as an integral part of upstream source (GIT seems to not support subscribing to -only parts of a repository). - -For example use, here's how to roll out a clone of the [Redpill support site](http://support.redpill.dk/): - - mkdir -p ~/public_cgi/support.redpill.dk - git clone git://source.jones.dk/bin - bin/localikiwikicreatesite -o git://source.redpill.dk/support rpdemo - -(Redpill support is inspired by <http://help.riseup.net> but needs to be reusable for several similarly configured networks) - ---[[JonasSmedegaard]] - -> I don't understand at all why you're using git the way you are. -> -> I think that this needs to be reworked into a patch against the regular -> ikiwiki tree, that adds the po4a stuff needed to generate the pot files for the -> basewiki and template content, as well as the stuff that generates the -> translated versions of those from the po files. -> -> The now merged po plugin also handles some of this. -> --[[Joey]] +It would make sense to make the existing `locale` setting control which +templates are used. But the [[plugins/po]] plugin would probably want to do +something more, and use the actual language the page is written in. +--[[Joey]] diff --git a/doc/todo/language_definition_for_the_meta_plugin.mdwn b/doc/todo/language_definition_for_the_meta_plugin.mdwn index 90bfbef3b..44fcf32bb 100644 --- a/doc/todo/language_definition_for_the_meta_plugin.mdwn +++ b/doc/todo/language_definition_for_the_meta_plugin.mdwn @@ -95,7 +95,24 @@ This may be useful for sites with a few pages in different languages, but no ful >>> Now that po has been merged, this patch should probably also be adapted >>> so that the po plugin forces the meta::lang of every page to what po ->>> thinks it should be. Perhaps [[the_special_po_pagespecs|ikiwiki/pagespec/po]] ->>> should also work with meta-assigned languages? --[[smcv]] +>>> thinks it should be. --[[smcv]] + +>>>> Agreed, users of the po plugin would greatly benefit from it. +>>>> Seems doable. 
--[[intrigeri]] + +>>> Perhaps [[the_special_po_pagespecs|ikiwiki/pagespec/po]] should +>>> also work with meta-assigned languages? --[[smcv]] + +>>>> Yes. But then, these special pagespecs should be moved outside of +>>>> [[plugins/po]], as they could be useful to anyone using the +>>>> currently discussed patch even when not using the po plugin. +>>>> +>>>> We could add these pagespecs to the core and make them use +>>>> a simple language-guessing system based on a new hook. Any plugin +>>>> that implements such a hook could decide whether it should +>>>> overrides the language guessed by another one, and optionally use +>>>> the `first`/`last` options (e.g. the po plugin will want to be +>>>> authoritative on the pages of type po, and will then use +>>>> `last`). --[[intrigeri]] [[!tag wishlist patch plugins/meta translation]] diff --git a/doc/todo/link_plugin_perhaps_too_general__63__.mdwn b/doc/todo/link_plugin_perhaps_too_general__63__.mdwn new file mode 100644 index 000000000..8a5fd50eb --- /dev/null +++ b/doc/todo/link_plugin_perhaps_too_general__63__.mdwn @@ -0,0 +1,25 @@ +[[!tag wishlist blue-sky]] +(This isn't important to me - I don't use MediaWiki or Creole syntax myself - +but just thinking out loud...) + +The [[ikiwiki/wikilink]] syntax IkiWiki uses sometimes conflicts with page +languages' syntax (notably, [[plugins/contrib/MediaWiki]] and [[plugins/Creole]] +want their wikilinks the other way round, like +`\[[plugins/write|how to write a plugin]]`). It would be nice if there was +some way for page language plugins to opt in/out of the normal wiki link +processing - then MediaWiki and Creole could have their own `linkify` hook +that was only active for *their* page types, and used the appropriate +syntax. + +In [[todo/matching_different_kinds_of_links]] I wondered about adding a +`\[[!typedlink to="foo" type="bar"]]` directive. This made me wonder whether +a core `\[[!link]]` directive would be useful; this could be a fallback for +page types where a normal wikilink can't be done for whatever reason, and +could also provide extension points more easily than WikiLinks' special +syntax with extra punctuation, which doesn't really scale? + +Straw-man: + + \[[!link to="ikiwiki/wikilink" desc="WikiLinks"]] + +--[[smcv]] diff --git a/doc/todo/make_link_target_search_all_paths_as_fallback.mdwn b/doc/todo/make_link_target_search_all_paths_as_fallback.mdwn new file mode 100644 index 000000000..f971fd8e0 --- /dev/null +++ b/doc/todo/make_link_target_search_all_paths_as_fallback.mdwn @@ -0,0 +1,27 @@ +[[!tag wishlist]] + +## Idea + +After searching from the most local to the root for a wikilinkable page, descend into the tree of pages looking for a matching page. + +For example, if I link to \[\[Pastrami\]\] from /users/eric, the current behavior is to look for + + * /users/eric/pastrami + * /users/pastrami + * /users/eric/pastrami + +I'd like it to find /sandwiches/pastrami. + +## Issues + +I know this is a tougher problem, especially to scale efficiently. There is also not a clear ordering unless it is the recursive dictionary ordering (ie the order of a breadth-first search with natural ordering). It would probably require some sort of static lookup table keyed by pagename and yielding a path. This could be generated initially by a breadth-first search and then updated incrementally when pages are added/removed/renamed. In some ways a global might not be ideal, since one might argue that the link above should match /users/eric/sandwiches/pastrami before /sandwiches/pastrami. 
I guess you could put all matching paths in the lookup table and then evaluate the ordering at parse-time. + +## Motivation + +Since I often access my documents using a text editor, I find it useful to keep them ordered in a heirarchy, about 3 levels deep with a branching factor of perhaps 10. When linking though, I'd like the wiki to find the document for me, since I am lazy. + +Also, many of my wiki pages comprise the canonical local representation of some unique entity, for example I might have /software/ikiwiki. The nesting, however, is only to aid navigation, and shouldn't be considered as part of resource's name. + +## Alternatives + +If an alias could be specified in the page body (for example, /ikiwiki for /software/ikiwiki) which would then stand in for a regular page when searching, then the navigational convenience of folders could be preserved while selectively flattening the search namespace. diff --git a/doc/todo/mark_edit_as_trivial__44___identify__47__filter_on_trivial_changes.mdwn b/doc/todo/mark_edit_as_trivial__44___identify__47__filter_on_trivial_changes.mdwn new file mode 100644 index 000000000..2b2b0242e --- /dev/null +++ b/doc/todo/mark_edit_as_trivial__44___identify__47__filter_on_trivial_changes.mdwn @@ -0,0 +1,11 @@ +One feature of mediawiki which I quite like is the ability to mark a change as 'minor', or 'trivial'. This can then be used to filter the 'recentchanges' page, to only show substantial edits. + +The utility of this depends entirely on whether the editors use it properly. + +I currently use an inline on the front page of my personal homepage to show the most recent pages (by creation date) within a subsection of my site (a blog). Blog posts are rarely modified much after they are 'created' (or published - I bodge the creation time via meta when I publish a post. It might sit in draft form indefinitely), so this effectively shows only non-trivial changes. + +I would like to have a short list of the most recent modifications to the site on the front page. I therefore want to sort by modified time rather than creation time, but exclude edits that I self-identify as minor. I also only want to take a short number of items, the top 5, and display only their titles (which may be derived from filename, or set via meta again). + +I'm still thinking through how this might be achieved in an ikiwiki-suitable fashion, but I think I need a scheme to identify certain edits as trivial. This would have to work via web edits (easier: could add a check box to the edit form) and plain changes in the VCS (harder: scan for keywords in a commit message? in a VCS-agnostic fashion?) + +[[!tag wishlist]] diff --git a/doc/todo/matching_different_kinds_of_links.mdwn b/doc/todo/matching_different_kinds_of_links.mdwn index 26c5a072b..da3ea49f6 100644 --- a/doc/todo/matching_different_kinds_of_links.mdwn +++ b/doc/todo/matching_different_kinds_of_links.mdwn @@ -36,6 +36,11 @@ Besides pagespecs, the `rel=` attribute could be used for styles. --Ivan Z. > normal links.) Might be better to go ahead and add the variable to > core though. --[[Joey]] +>> I've implemented this with the data structure you suggested, except that +>> I called it `%typedlinks` instead of `%linktype` (it seemed to make more +>> sense that way). I also ported `tag` to it, and added a `tagged_is_strict` +>> config option. See below! --[[smcv]] + I saw somewhere else here some suggestions for the wiki-syntax for specifying the relation name of a link. 
One more suggestion---[the syntax used in Semantic MediaWiki](http://en.wikipedia.org/wiki/Semantic_MediaWiki#Basic_usage), like this: <pre> @@ -45,3 +50,147 @@ I saw somewhere else here some suggestions for the wiki-syntax for specifying th So a part of the effect of [[`\[[!taglink TAG\]\]`|plugins/tag]] could be represented as something like `\[[tag::TAG]]` or (more understandable relation name in what concerns the direction) `\[[tagged::TAG]]`. I don't have any opinion on this syntax (whether it's good or not)...--Ivan Z. + +------- + +>> [[!template id=gitbranch author="[[Simon_McVittie|smcv]]" branch=smcv/ready/link-types]] +>> [[!tag patch]] + +## Documentation for smcv's branch + +### added to [[ikiwiki/pagespec]] + +* "`typedlink(type glob)`" - matches pages that link to a given page (or glob) + with a given link type. Plugins can create links with a specific type: + for instance, the tag plugin creates links of type `tag`. + +### added to [[plugins/tag]] + +If the `tagged_is_strict` config option is set, `tagged()` will only match +tags explicitly set with [[ikiwiki/directive/tag]] or +[[ikiwiki/directive/taglink]]; if not (the default), it will also match +any other [[WikiLinks|ikiwiki/WikiLink]] to the tag page. + +### added to [[plugins/write]] + +#### `%typedlinks` + +The `%typedlinks` hash records links of specific types. Do not modify this +hash directly; call `add_link()`. The keys are page names, and the values +are hash references. In each page's hash reference, the keys are link types +defined by plugins, and the values are hash references with link targets +as keys, and 1 as a dummy value, something like this: + + $typedlinks{"foo"} = { + tag => { short_word => 1, metasyntactic_variable => 1 }, + next_page => { bar => 1 }, + }; + +Ordinary [[WikiLinks|ikiwiki/WikiLink]] appear in `%links`, but not in +`%typedlinks`. + +#### `add_link($$;$)` + + This adds a link to `%links`, ensuring that duplicate links are not + added. Pass it the page that contains the link, and the link text. + +An optional third parameter sets the link type (`undef` produces an ordinary +[[ikiwiki/WikiLink]]). + +## Review + +Some code refers to `oldtypedlinks`, and other to `oldlinktypes`. --[[Joey]] + +> Oops, I'll fix that. That must mean missing test coverage, too :-( +> --s + +>> A test suite for the dependency resolver *would* be nice. --[[Joey]] + +>>> Bug fixed, I think. A test suite for the dependency resolver seems +>>> more ambitious than I want to get into right now, but I added a +>>> unit test for this part of it... --s + +I'm curious what your reasoning was for adding a new variable +rather than using `pagestate`. Was it only because you needed +the `old` version to detect change, or was there other complexity? +--J + +> You seemed to be more in favour of adding it to the core in +> your proposal above, so I assumed that'd be more likely to be +> accepted :-) I don't mind one way or the other - `%typedlinks` +> costs one core variable, but saves one level of hash nesting. If +> you're not sure either, then I think the decision should come down +> to which one is easier to document clearly - I'm still unhappy with +> my docs for `%typedlinks`, so I'll try to write docs for it as +> `pagestate` and see if they work any better. --s + +>> On reflection, I don't think it's any better as a pagestate, and +>> the contents of pagestates (so far) aren't documented for other +>> plugins' consumption, so I'm inclined to leave it as-is, unless +>> you want to veto that. 
Loose rationale: it needs special handling +>> in the core to be a dependency type (I re-used the existing link +>> type), it's API beyond a single plugin, and it's really part of +>> the core parallel to pagestate rather than being tied to a +>> specific plugin. Also, I'd need to special-case it to have +>> ikiwiki not delete it from the index, unless I introduced a +>> dummy typedlinks plugin (or just hook) that did nothing... --s + +I have not convinced myself this is a real problem, but.. +If a page has a typed link, there seems to be no way to tell +if it also has a separate, regular link. `add_link` will add +to `@links` when adding a typed, or untyped link. If only untyped +links were recorded there, one could tell the difference. But then +typed links would not show up at all in eg, a linkmap, +unless it was changed to check for typed links too. +(Or, regular links could be recorded in typedlinks too, +with a empty type. (Bloaty.)) --J + +> I think I like the semantics as-is - I can't think of any +> reason why you'd want to ask the question "does A link to B, +> not counting tags and other typed links?". A typed link is +> still a link, in my mind at least. --s + +>> Me neither, let's not worry about it. --[[Joey]] + +I suspect we could get away without having `tagged_is_strict` +without too much transitional trouble. --[[Joey]] + +> If you think so, I can delete about 5 LoC. I don't particularly +> care either way; [[Jon]] expressed concern about people relying +> on the current semantics, on one of the pages requesting this +> change. --s + +>> Removed in a newer version of the branch. --s + +I might have been wrong to introduce `typedlink(tag foo)`. It's not +very user-friendly, and is more useful as a backend for other plugins +that as a feature in its own right - any plugin introducing a link +type will probably also want to have its own preprocessor directive +to set that link type, and its own pagespec function to match it. +I wonder whether to make a `typedlink` plugin that has the typedlink +pagespec match function and a new `\[[!typedlink to="foo" type="bar"]]` +though... --[[smcv]] + +> I agree, per-type matchers are more friendly and I'm not enamored of the +> multi-parameter pagespec syntax. --[[Joey]] + +>> Removed in a newer version of the branch. I re-introduced it as a +>> plugin in `smcv/typedlink`, but I don't think we really need it. --s + +---- + +I am ready to merge this, but I noticed one problem -- since `match_tagged` +now only matches pages with the tag linktype, a wiki will need to be +rebuilt on upgrade in order to get the linktype of existing tags in it +recorded. So there needs to be a NEWS item about this and +the postinst modified to force the rebuild. + +> Done, although you'll need to plug in an appropriate version number when +> you release it. Is there a distinctive reminder string you grep for +> during releases? I've used `UNRELEASED` for now. --[[smcv]] + +Also, the ready branch adds `typedlink()` to [[ikiwiki/pagespec]], +but you removed that feature as documented above. +--[[Joey]] + +> [[Done]]. --s diff --git a/doc/todo/mercurial.mdwn b/doc/todo/mercurial.mdwn index e71c8106a..de1f148e5 100644 --- a/doc/todo/mercurial.mdwn +++ b/doc/todo/mercurial.mdwn @@ -119,3 +119,11 @@ I have a few notes on mercurial usage after trying it out for a while: >> I think the ideal solution would be to build `$destdir/recentchanges/*` directly from the output of `hg log`. --[[buo]] >>>> That would be 100 times as slow, so I chose not to do that. 
--[[Joey]] + +>>>> Since this is confusing people, allow me to clarify: Ikiwiki's +>>>> recentchanges generation pulls log information directly out of the VCS as +>>>> needed. It caches it in recentchanges/* in the `scrdir`. These cache +>>>> files need not be preserved, should never be checked into VCS, and if +>>>> you want to you can configure your VCSignore file to ignore them, +>>>> just as you can configure it to ignore the `.ikiwiki` directory in the +>>>> `scrdir`. --[[Joey]] diff --git a/doc/todo/mirrorlist_with_per-mirror_usedirs_settings.mdwn b/doc/todo/mirrorlist_with_per-mirror_usedirs_settings.mdwn new file mode 100644 index 000000000..6ca9962ba --- /dev/null +++ b/doc/todo/mirrorlist_with_per-mirror_usedirs_settings.mdwn @@ -0,0 +1,24 @@ +I've got a wiki that is built at two places: + +* a static copy, aimed at being viewed without any web server, using + a web browser's `file:///` urls => usedirs is disabled to get nice + and working links +* an online copy, with usedirs enabled in order to benefit from the + language negotiation using the po plugin + +I need to use mirrorlist on the static copy, so that one can easily +reach the online, possibly updated, pages. But as documented, "pages are +assumed to exist in the same location under the specified url on each +mirror", so the generated urls are wrong. + +My `mirrorlist` branch contains a patch that allows one to configure usedirs +per-mirror. Note: the old configuration format is still supported, so this should +not break existing wikis. + +OT: as a bonus, this branch contains a patch to support {hashes,arrays} of +{hashes,arrays} in `$config`, which I missed a bit when writing the po plugin, +and decided this time it was really needed to implement this feature. + +--[[intrigeri]] + +[[!tag patch]] diff --git a/doc/todo/more_flexible_inline_postform.mdwn b/doc/todo/more_flexible_inline_postform.mdwn index bc8bc0809..414476bd7 100644 --- a/doc/todo/more_flexible_inline_postform.mdwn +++ b/doc/todo/more_flexible_inline_postform.mdwn @@ -16,3 +16,8 @@ logical first step towards doing comment-like things with inlined pages). > Perhaps what we need is a `postform` plugin/directive that inline depends > on (automatically enables); its preprocess method could automatically be > invoked from preprocess_inline when needed. --[[smcv]] + +>> I've been looking at this stuff again. I think you are right, this would +>> be the right approach. The comments plugin could use it similarly, allowing +>> sites which desire it to have an inline comment submission form on all +>> pages with comments enabled. I'm going to take a look. -- [[Jon]] diff --git a/doc/todo/multiple_template_directories.mdwn b/doc/todo/multiple_template_directories.mdwn index c09a9595f..6a474b4f3 100644 --- a/doc/todo/multiple_template_directories.mdwn +++ b/doc/todo/multiple_template_directories.mdwn @@ -11,3 +11,63 @@ ought to do the trick. > global dir when it cannot find a template. For me, this is good enough. > And it is even documented in the man page. Sigh. I guess this could be > considered [[done]]. + +I have a use case for this, a site composed of blogs and wikis, templates divided in three categories : common, blog and wiki. The only solution I found is maintaining hard links, being able to have multiple template dirs would obviously be better. -- Changaco + +> [[plugins/underlay]] used to allow adding extra templatedirs, but Joey +> removed that functionality when he made templates search the wiki's +> own `templates` directory. 
+> +> You can get a 3-level hierarchy like this: +> +> * instance-specific overrides: $srcdir/templates +> * common to the entire site: a directory that is the value of all +> instances' `templatedir` parameters +> * common to every ikiwiki in the world: /usr/share/ikiwiki/templates +> (implicitly searched) +> +> (by "instance" I mean an instance of ikiwiki - a .setup file, basically.) +> +> For a more complex hierarchy you'd need the old [[plugins/underlay]] +> functionality, i.e. you'd need to (ask Joey to) revert the patch that +> removed it. For instance, if anyone has a hierarchy like this, then +> they need the old functionality back in order to split the template +> search path for the things marked `(???)`: +> +> every ikiwiki in the world (/usr/share/ikiwiki/templates) +> \--- your site (???) +> \--- your blogs (???) +> \--- travel blog ($srcdir/templates) +> \--- code blog ($srcdir/templates) +> \--- your wikis (???) +> \--- travel wiki ($srcdir/templates) +> \--- code wiki ($srcdir/templates) +> +> This looks pretty hypothetical to me, though... +> --[[smcv]] + +>> The reason I removed it is because the same functionality of having +>> multiple template directories is still present. Just put them in +>> the templates/ subdirectory of multiple underlay directories instead. +>> --[[Joey]] + +>>>Thanks, I didn't realize this was possible. Problem solved. -- Changaco + +>>>> We can consider this [[done]], then. For reference, the solution +>>>> to the hierarchy I mentioned above would be: +>>>> +>>>> all your sites have $your_underlay as an underlay +>>>> +>>>> the blogs and wikis all have $blog_underlay or $wiki_underlay +>>>> (as appropriate) as a higher priority underlay +>>>> +>>>> every ikiwiki in the world (/usr/share/ikiwiki/templates) +>>>> \--- your site ($your_underlay/templates, or templatedir) +>>>> \--- your blogs ($blog_underlay/templates) +>>>> \--- travel blog ($srcdir/templates) +>>>> \--- code blog ($srcdir/templates) +>>>> \--- your wikis ($wiki_underlay/templates) +>>>> \--- travel wiki ($srcdir/templates) +>>>> \--- code wiki ($srcdir/templates) +>>>> +>>>> --[[smcv]] diff --git a/doc/todo/multiple_templates.mdwn b/doc/todo/multiple_templates.mdwn index 72783c556..30fb8d6ee 100644 --- a/doc/todo/multiple_templates.mdwn +++ b/doc/todo/multiple_templates.mdwn @@ -1,4 +1,4 @@ -> Another useful feature might be to be able to choose a different [[template|wikitemplates]] +> Another useful feature might be to be able to choose a different [[template|templates]] > file for some pages; [[blog]] pages would use a template different from the > home page, even if both are managed in the same repository, etc. diff --git a/doc/todo/openid_user_filtering.mdwn b/doc/todo/openid_user_filtering.mdwn index 8b2d0082e..6a318c4c0 100644 --- a/doc/todo/openid_user_filtering.mdwn +++ b/doc/todo/openid_user_filtering.mdwn @@ -7,3 +7,7 @@ So I suggest an ikiwiki configuration like: users => ["*.webvm.net"], Would only allow edits from openIDs of that form. + +> This kind of thing can be [[done]] now: --[[Joey]] +> +> locked_pages => "* and !user(http://*.webvm.net/)" diff --git a/doc/todo/optimize_simple_dependencies.mdwn b/doc/todo/optimize_simple_dependencies.mdwn new file mode 100644 index 000000000..6f6284303 --- /dev/null +++ b/doc/todo/optimize_simple_dependencies.mdwn @@ -0,0 +1,95 @@ +I'm still trying to optimize ikiwiki for a site using +[[plugins/contrib/album]], and checking which pages depend on which pages +is still taking too long. 
Here's another go at fixing that, using [[Will]]'s +suggestion from [[todo/should_optimise_pagespecs]]: + +> A hash, by itself, is not optimal because +> the dependency list holds two things: page names and page specs. The hash would +> work well for the page names, but you'll still need to iterate through the page specs. +> I was thinking of keeping a list and a hash. You use the list for pagespecs +> and the hash for individual page names. To make this work you need to adjust the +> API so it knows which you're adding. -- [[Will]] + +If you have P pages and refresh after changing C of them, where an average +page has E dependencies on exact page names and D other dependencies, this +branch should drop the complexity of checking dependencies from +O(P * (D+E) * C) to O(C + P*E + P*D*C). Pages that use inline or map have +a large value for E (e.g. one per inlined page) and a small value for D (e.g. +one per inline). + +Benchmarking: + +Test 1: a wiki with about 3500 pages and 3500 photos, and a change that +touches about 350 pages and 350 photos + +Test 2: the docwiki (about 700 objects not excluded by docwiki.setup, mostly +pages), docwiki.setup modified to turn off verbose, and a change that touches +the 98 pages of plugins/*.mdwn + +In both tests I rebuilt the wiki with the target ikiwiki version, then touched +the appropriate pages and refreshed. + +Results of test 1: without this branch it took around 5:45 to rebuild and +around 5:45 again to refresh (so rebuilding 10% of the pages, then deciding +that most of the remaining 90% didn't need to change, took about as long as +rebuilding everything). With this branch it took 5:47 to rebuild and 1:16 +to refresh. + +Results of test 2: rebuilding took 14.11s without, 13.96s with; refreshing +three times took 7.29/7.40/7.37s without, 6.62/6.56/6.63s with. + +(This benchmarking was actually done with my [[plugins/contrib/album]] branch, +since that's what the huge wiki needs; that branch doesn't alter core code +beyond the ready/depends-exact branch point, so the results should be +equally valid.) + +--[[smcv]] + +> Now [[merged|done]] --[[smcv]] + +---- + +> We discussed this on irc; I had some worries that things may have been +> switched to `add_depends_exact` that were not pure page names. My current +> feeling is it's all safe, but who knows. It's easy to miss something. +> Which makes me think this is not a good interface. +> +> Why not, instead, make `add_depends` smart. If it's passed something +> that is clearly a raw page name, it can add it to the exact depends hash. +> Else, add it to the pagespec hash. You can tell if it's a pure page name +> by matching on `$config{wiki_file_regexp}`. + +>> Good thinking. Done in commit 68ce514a 'Auto-detect "simple dependencies"', +>> with a related bugfix in e8b43825 "Force %depends_exact to lower case". +>> +>> Performance impact: Test 2 above takes 0.2s longer to rebuild (probably +>> from all the calls to lc, which are, however, necessary for correctness) +>> and has indistinguishable performance for a refresh. +>> +>> Test 1 took about 6 minutes to rebuild and 1:25 to refresh; those are +>> pessimistic figures, since I've added 90 more photos and 90 more pages +>> (both to the wiki as a whole, and the number touched before refreshing) +>> since testing the previous version of this branch. --[[smcv]] + +> Also I think there may be little optimisation value left in +> 7227c2debfeef94b35f7d81f42900aa01820caa3, since the "regular" dependency +> lists will be much shorter. 
+ +>> You're probably right, but IMO it's not worth reverting it - a set (hash with +>> dummy values) is still the right data structure. --[[smcv]] + +> Sounds like inline pagenames has an already exstant bug WRT +> pages moving, which this should not make worse. Would be good to verify. + +>> If you mean the standard "add a better match for a link-like construct" bug +>> that also affects sidebar, then yes, it does have the bug, but I'm pretty +>> sure this branch doesn't make it any worse. I could solve this at the cost +>> of making pagenames less useful for interactive use, by making it not +>> respect [[ikiwiki/subpage/LinkingRules]], but instead always interpret +>> its paths as relative to the top of the wiki - that's actually all that +>> [[plugins/contrib/album]] needs. --[[smcv]] + +> Re coding, it would be nice if `refresh()` could avoid duplicating +> the debug message, etc in the two cases. --[[Joey]] + +>> Fixed in commit f805d566 "Avoid duplicating debug message..." --[[smcv]] diff --git a/doc/todo/optional_underlaydir_prefix.mdwn b/doc/todo/optional_underlaydir_prefix.mdwn new file mode 100644 index 000000000..06900a904 --- /dev/null +++ b/doc/todo/optional_underlaydir_prefix.mdwn @@ -0,0 +1,46 @@ +For security reasons, symlinks are disabled in IkiWiki. That's fair enough, but that means that some problems, which one could otherwise solve by using a symlink, cannot be solved. The specfic problem in this case is that all underlays are placed at the root of the wiki, when it could be more convenient to place some underlays in specific sub-directories. + +Use-case 1 (to keep things tidy): + +Currently IkiWiki has some javascript files in `underlays/javascript`; that directory is given as one of the underlay directories. Thus, all the javascript files appear in the root of the generated site. But it would be tidier if one could say "put the contents of *this* underlaydir under the `js` directory". + +> Of course, this could be accomplished, if we wanted to, by moving the +> files to `underlays/javascript/js`. --[[Joey]] + +Use-case 2 (a read-only external dir): + +Suppose I want to include a subset of `/usr/local/share/docs` on my wiki, say the docs about `foo`. But I want them to be under the `docs/foo` sub-directory on the generated site. Currently I can't do that. If I give `/usr/local/share/docs/foo` as an underlaydir, then the contents of that will be in the root of the site, rather than under `docs/foo`. And if I give `/usr/local/share/docs` as an underlaydir, then the contents of the `foo` dir will be under `foo`, but it will also include every other thing in `/usr/local/share/docs`. + +Since we can't use symlinks in an underlay dir to link to these directories, then perhaps one could give a specific underlay dir a specific prefix, which defines the sub-directory that the underlay should appear in. + +I'm not sure how this would be implemented, but I guess it could be configured something like this: + + prefixed_underlay => { + 'js' => '/usr/local/share/ikiwiki/javascript', + 'docs/foo' => '/usr/local/share/docs/foo', + } + +> So, let me review why symlinks are an issue. For normal, non-underlay +> pages, users who do not have filesystem access to the server may have +> commit access, and so could commit eg, a symlink to `/etc/passwd` (or +> to `/` !). The guards are there to prevent ikiwiki either exposing the +> symlink target's contents, or potentially overwriting it. +> +> Is this a concern for underlays? 
Most of the time, certainly not; +> the underlay tends to be something only the site admin controls. +> Not all the security checks that are done on the srcdir are done +> on the underlays, either. Most checks done on files in the underlay +> are only done because the same code handles srcdir files. The one +> exception is the test that skips processing symlinks in the underlay dir. +> (But note that the underlay directory can be a symlink to elsewhere, +> which the srcdir, by default, cannot.) +> +> So, one way to approach this is to make ikiwiki follow directory symlinks +> inside the underlay directory. Just a matter of passing `follow => 1` to +> find. (This would still not allow individual files to be symlinks, because +> `readfile` does not allow reading symlinks. But I don't see much need +> for that.) --[[Joey]] + +>> If you think that enabling symlinks in underlay directories wouldn't be a security issue, then I'm all for it! That would be much simpler to implement, I'm sure. --[[KathrynAndersen]] + +[[!taglink wishlist]]
diff --git a/doc/todo/pagespec_to_disable_ikiwiki_directives.mdwn b/doc/todo/pagespec_to_disable_ikiwiki_directives.mdwn new file mode 100644 index 000000000..4211c2d10 --- /dev/null +++ b/doc/todo/pagespec_to_disable_ikiwiki_directives.mdwn @@ -0,0 +1,5 @@ +I would like some pages (identified by pagespec) to not expand ikiwiki directives (wikilinks, or \[[!, or both, or perhaps either). + +I will tag this [[wishlist]]. It's something I might try myself. It's part of my thinking about how to handle [[comments]], as I'm still ruminating on alternatives to [[smcv]]'s approach. (with the greatest of respect to smcv!) (Perhaps my attempt will try to factor out the no-directives-allowed logic from the comments plugin). + + -- [[Jon]]
diff --git a/doc/todo/pagestats_among_a_subset_of_pages.mdwn b/doc/todo/pagestats_among_a_subset_of_pages.mdwn index fd15d6a42..33f9258fd 100644 --- a/doc/todo/pagestats_among_a_subset_of_pages.mdwn +++ b/doc/todo/pagestats_among_a_subset_of_pages.mdwn @@ -1,5 +1,4 @@ [[!tag patch plugins/pagestats]] -[[!template id=gitbranch branch=smcv/ready/among author="[[smcv]]"]] My `among` branch fixes [[todo/backlinks_result_is_lossy]], then uses that to provide pagestats for links from a subset of pages. From the docs included
diff --git a/doc/todo/paste_plugin.mdwn b/doc/todo/paste_plugin.mdwn new file mode 100644 index 000000000..83384a8d7 --- /dev/null +++ b/doc/todo/paste_plugin.mdwn @@ -0,0 +1,36 @@ +It was suggested that using ikiwiki as an alternative to pastebin services +could be useful, especially if you want pastes to not expire and be +cloneable. + +All you really need is a special purpose ikiwiki instance that you commit +to by git. But a web interface for pasting could also be nice. + +There could be a directive that inserts a paste form onto a page. The form +would have a big textarea for pasting into, and might also have a file +upload button (for uploading instead of pasting). It could also copy the +page edit form's dropdown of markup types, which would be especially useful +if using the highlight plugin to support programming languages. The default +should probably be txt, not mdwn, if the txt plugin is enabled. + +(There's a lot of overlap between that and editpage of course .. similar +to the overlap between the comment form and editpage.) + +When posted, the form would just come up with a new, numeric subpage +of the page it appears on, and save the paste there.
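A very rough sketch of how such a directive might start out follows; everything in it (the plugin name, the `do=paste` CGI action, the form fields) is made up for illustration, and a real implementation would also have to deal with htmlscrubber stripping raw forms, add the markup-type dropdown, and register a CGI hook to actually save the paste:

    #!/usr/bin/perl
    # Hypothetical IkiWiki::Plugin::paste sketch: a \[[!paste]] directive that
    # emits a form posting to the CGI, which would then save the submitted
    # text as the next numeric subpage of the page carrying the directive.
    package IkiWiki::Plugin::paste;

    use warnings;
    use strict;
    use IkiWiki 3.00;

    sub import {
        hook(type => "preprocess", id => "paste", call => \&preprocess);
    }

    sub preprocess (@) {
        my %params=@_;
        my $page=$params{page};
        # Raw HTML returned from a preprocess hook gets scrubbed, so a real
        # plugin would more likely add the form via a template or a
        # pagetemplate hook, much as the comments plugin does.
        return '<form method="post" action="'.IkiWiki::cgiurl().'">'.
            '<input type="hidden" name="do" value="paste" />'.
            '<input type="hidden" name="page" value="'.$page.'" />'.
            '<textarea name="editcontent" rows="20" cols="80"></textarea>'.
            '<input type="submit" value="Paste" />'.
            '</form>';
    }

    1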
+ +Another thing that might be useful is a "copy" (or "paste as new") action +on the action bar. This would take an existing paste and copy it into the +paste edit form, for editing and saving under a new id. + +--- + +A sample wiki configuration using this might be: + +* enable highlight and txt +* enable anonok so anyone can paste; lock anonymous users down to only + creating new pastes, not editing other pages +* disable modification of existing pastes (how? disabling editpage would + work, but that would disallow setting up anonymous git push) +* enable comments, so that each paste can be commented on +* enable getsource, so the source to a paste can easily be downloaded +* optionally, enable untrusted git push diff --git a/doc/todo/po:_avoid_rebuilding_to_fix_meta_titles.mdwn b/doc/todo/po:_avoid_rebuilding_to_fix_meta_titles.mdwn new file mode 100644 index 000000000..402f70b79 --- /dev/null +++ b/doc/todo/po:_avoid_rebuilding_to_fix_meta_titles.mdwn @@ -0,0 +1,45 @@ +Re the meta title escaping issue worked around by `change`. + +> I suppose this does not only affect meta, but other things +> at scan time too. Also, handling it only on rebuild feels +> suspicious -- a refresh could involve changes to multiple +> pages and trigger the same problem, I think. Also, exposing +> this rebuild to the user seems really ugly, not confidence inducing. +> +> So I wonder if there's a better way. Such as making po, at scan time, +> re-run the scan hooks, passing them modified content (either converted +> from po to mdwn or with the escaped stuff cheaply de-escaped). (Of +> course the scan hook would need to avoid calling itself!) +> +> (This doesn't need to block the merge, but I hope it can be addressed +> eventually..) +> +> --[[Joey]] +>> +>> I'll think about it soon. +>> +>> --[[intrigeri]] +>> +>>> Did you get a chance to? --[[Joey]] + +>>>> I eventually did, and got rid of the ugly double rebuild of pages +>>>> at build time. This involved adding a `rescan` hook. Rationale +>>>> and details are in my po branch commit messages. I believe this +>>>> new way of handling meta title escaping to be far more robust. +>>>> Moreover this new implementation is more generic, feels more +>>>> logical to me, and probably fixes other similar bugs outside the +>>>> meta plugin scope. Please have a look when you can. +>>>> --[[intrigeri]] + +>>>>> Glad you have tackled this. Looking at +>>>>> 25447bccae0439ea56da7a788482a4807c7c459d, +>>>>> I wonder how this rescan hook is different from a scan hook +>>>>> with `last => 1` ? Ah, it comes *after* the preprocess hook +>>>>> in scan mode. Hmm, I wonder if there's any reason to have +>>>>> the scan hook called before those as it does now. Reordering +>>>>> those 2 lines could avoid adding a new hook. --[[Joey]] + +>>>>>> Sure. I was fearing to break other plugins if I did so, so I +>>>>>> did not dare to. I'll try this. --[[intrigeri]] + +>>>>>>> Done in my po branch, please have a look. --[[intrigeri]] diff --git a/doc/todo/po:_better_documentation.mdwn b/doc/todo/po:_better_documentation.mdwn new file mode 100644 index 000000000..6e9804df4 --- /dev/null +++ b/doc/todo/po:_better_documentation.mdwn @@ -0,0 +1,3 @@ +Maybe write separate documentation for the po plugin, depending on the +people it targets: translators, wiki administrators, hackers. This +plugin may be complex enough to deserve this. 
diff --git a/doc/todo/po:_better_links.mdwn b/doc/todo/po:_better_links.mdwn new file mode 100644 index 000000000..af879a56a --- /dev/null +++ b/doc/todo/po:_better_links.mdwn @@ -0,0 +1,12 @@ +Once the fix to +[[bugs/pagetitle_function_does_not_respect_meta_titles]] from +[[intrigeri]]'s `meta` branch is merged into ikiwiki upstream, the +generated links' text will be optionally based on the page titles set +with the [[meta|plugins/meta]] plugin, and will thus be translatable. +It will also allow displaying the translation status in links to slave +pages. Both were implemented, and reverted in commit +ea753782b222bf4ba2fb4683b6363afdd9055b64, which should be reverted +once [[intrigeri]]'s `meta` branch is merged. + +An integration branch, called `meta-po`, merges [[intrigeri]]'s `po` +and `meta` branches, and thus has this additional features. diff --git a/doc/todo/po:_better_translation_interface.mdwn b/doc/todo/po:_better_translation_interface.mdwn new file mode 100644 index 000000000..e66a77b85 --- /dev/null +++ b/doc/todo/po:_better_translation_interface.mdwn @@ -0,0 +1,2 @@ +Add a message-by-message translation interface to the PO plugin, +with automatic escaping of special chars. diff --git a/doc/todo/po:_remove_po_files_when_disabling_plugin.mdwn b/doc/todo/po:_remove_po_files_when_disabling_plugin.mdwn new file mode 100644 index 000000000..0801f7fcd --- /dev/null +++ b/doc/todo/po:_remove_po_files_when_disabling_plugin.mdwn @@ -0,0 +1,11 @@ +ikiwiki now has a `disable` hook. Should the po plugin remove the po +files from the source repository when it has been disabled? + +> pot files, possibly, but the po files contain work, so no. --[[Joey]] + +>> I tried to implement this in my `po-disable` branch, but AFAIK, the +>> current rcs plugins interface provides no way to tell whether a +>> given file (e.g. a POT file in my case) is under version control; +>> in most cases, it is not, thanks to .gitignore or similar, but we +>> can't be sure. So I just can't decide it is needed to call +>> `rcs_remove` rather than a good old `unlink`. --[[intrigeri]] diff --git a/doc/todo/po:_rethink_pagespecs.mdwn b/doc/todo/po:_rethink_pagespecs.mdwn new file mode 100644 index 000000000..50c10c5db --- /dev/null +++ b/doc/todo/po:_rethink_pagespecs.mdwn @@ -0,0 +1,11 @@ +I was suprised that, when using the map directive, a pagespec of "*" +listed all the translated pages as well as regular pages. That can +make a big difference to an existing wiki when po is turned on, +and seems generally not wanted. +(OTOH, you do want to match translated pages by +default when locking pages.) --[[Joey]] + +> Seems hard to me to sort apart the pagespec whose matching pages +> list must be restricted to pages in the master (or current?) +> language, and the ones that should not. The only solution I can see +> to this surprising behaviour is: documentation. --[[intrigeri]] diff --git a/doc/todo/po:_translation_of_directives.mdwn b/doc/todo/po:_translation_of_directives.mdwn new file mode 100644 index 000000000..89fc93620 --- /dev/null +++ b/doc/todo/po:_translation_of_directives.mdwn @@ -0,0 +1,8 @@ +If a translated page contains a directive, it may expand to some english +text, or text in whatever single language ikiwiki is configured to "speak". + +Maybe there could be a way to switch ikiwiki to speaking another language +when building a non-english page? Then the directives would get translated. + +(We also will need this in order to use translated templates, when they are +available.) 
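To make the idea above concrete, here is a sketch of the kind of helper this would need. It is purely illustrative (the function name is made up), and note that ikiwiki's own `gettext()` caches its Locale::gettext handle, so a real implementation would also have to reset that cache after switching:

    # Illustrative only: run some rendering code with the message locale
    # switched to the target page's language, then switch back.
    use POSIX qw(setlocale LC_MESSAGES);

    sub with_language {
        my ($lang, $code)=@_;  # e.g. ("de", sub { ... render the page ... })

        my $oldlocale=setlocale(LC_MESSAGES);
        setlocale(LC_MESSAGES, "$lang.UTF-8");
        my @ret=eval { $code->() };
        setlocale(LC_MESSAGES, $oldlocale);
        die $@ if $@;
        return @ret;
    }

Directives that emit `gettext("...")` strings would then come out in the slave page's language, provided the corresponding ikiwiki program translations are installed.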
diff --git a/doc/todo/po_needstranslation_pagespec.mdwn b/doc/todo/po_needstranslation_pagespec.mdwn new file mode 100644 index 000000000..45b7377ea --- /dev/null +++ b/doc/todo/po_needstranslation_pagespec.mdwn @@ -0,0 +1,12 @@ +Commit b225fdc44d4b3d in my po branch adds a `needstranslation()` +PageSpec. It makes it easy to list pages that need translation work. +Please review. --[[intrigeri]] + +> Looks good, cherry-picked. The only improvment I can +> think of is that `needstranslation(50)` could match +> only pages less than 50% translated. --[[Joey]] + +>> This improvement has been implemented as 98cc946 in my po branch. +>> --[[intrigeri]] + +[[!tag patch done]] diff --git a/doc/todo/preview_changes_before_git_commit.mdwn b/doc/todo/preview_changes_before_git_commit.mdwn new file mode 100644 index 000000000..187497cf4 --- /dev/null +++ b/doc/todo/preview_changes_before_git_commit.mdwn @@ -0,0 +1,17 @@ +ikiwiki allows to commit changes to the doc wiki over the `git://...` protocol. +It would be nice if there'd be a uniform way to view these changes before `git +push`ing. For the GNU Hurd's web pages, we include a *render_locally* script, +<http://www.gnu.org/software/hurd/render_locally>, with instructions on +<http://www.gnu.org/software/hurd/contributing/web_pages.html>, section +*Preview Changes*. With ikiwiki, one can use `make docwiki`, but that excludes +a set of pages, as per `docwiki.setup`. --[[tschwinge]] + +> `ikiwiki -setup some.setup --render file.mdwn` will build the page and +> dump it to stdout. So, for example: + + ikiwiki -setup docwiki.setup --render doc/todo/preview_changes_before_git_commit.mdwn | w3m -T text/html + +> You have to have a setup file, though it suffices to make up your own +> if you don't have the real one. Using ikiwiki.info's real setup file +> won't actually work since it uses a search plugin that gets unhappy +> if this is not in `/srv/web/ikiwiki.info`. --[[Joey]] diff --git a/doc/todo/rewrite_ikiwiki_in_haskell.mdwn b/doc/todo/rewrite_ikiwiki_in_haskell.mdwn index 204c48cd7..48ed744b1 100644 --- a/doc/todo/rewrite_ikiwiki_in_haskell.mdwn +++ b/doc/todo/rewrite_ikiwiki_in_haskell.mdwn @@ -29,6 +29,7 @@ It's appealing for a lot of reasons, including: edit in html editors currently. - This would be a chance to make WikiLinks with link texts read "the right way round" (ie, vaguely wiki creole compatably). + *[See also [[todo/link_plugin_perhaps_too_general?]] --[[smcv]]]* - The data structures would probably be quite different. - I might want to drop a lot of the command-line flags, either requiring a setup file be used for those things, or leaving the diff --git a/doc/todo/salmon_protocol_for_comment_sharing.mdwn b/doc/todo/salmon_protocol_for_comment_sharing.mdwn new file mode 100644 index 000000000..1e56b0a8b --- /dev/null +++ b/doc/todo/salmon_protocol_for_comment_sharing.mdwn @@ -0,0 +1,21 @@ +The <a href="http://www.salmon-protocol.org/home">Salmon protocol</a> +provides for aggregating comments across sites. If a site that syndicates +a feed receives a comment on an item in that feed, it can re-post the +comment to the original source. + +> Ikiwiki does not allow comments to be posted on items it aggregates. +> So salmon protocol support would only need to handle the comment +> receiving side of the protocol. 
+> +> The current draft protocol document confuses me when it starts talking +> about using OAuth in the abuse prevention section, since their example +> does not show use of OAuth, and it's not at all clear to me where the +> OAuth relationship between aggregator and original source is supposed +> to come from. +> +> Their security model, which goes on to include Webfinger, +> thirdparty validation services, XRD, and Magic Signatures, looks sorta +> like they kept throwing technology, at it, hoping something will stick. :-P +> --[[Joey]] + +[[!tag wishlist]] diff --git a/doc/todo/should_optimise_pagespecs.mdwn b/doc/todo/should_optimise_pagespecs.mdwn index 3ccef62fe..728ab8994 100644 --- a/doc/todo/should_optimise_pagespecs.mdwn +++ b/doc/todo/should_optimise_pagespecs.mdwn @@ -79,8 +79,6 @@ I can think about reducung the size of my wiki source and making it available on > > --[[Joey]] -[[!template id=gitbranch branch=smcv/ready/optimize-depends author="[[smcv]]"]] - >> I've been looking at optimizing ikiwiki for a site using >> [[plugins/contrib/album]] (which produces a lot of pages) and it seems >> that checking which pages depend on which pages does take a significant @@ -90,6 +88,8 @@ I can think about reducung the size of my wiki source and making it available on >> rather than a single pagespec. This does turn out to be faster, although >> not as much as I'd like. --[[smcv]] +>>> [[Merged|done]] --[[smcv]] + >>> I just wanted to note that there is a whole long discussion of dependencies and pagespecs on the [[todo/tracking_bugs_with_dependencies]] page. -- [[Will]] >>>> Yeah, I had a look at that (as the only other mention of `pagespec_merge`). @@ -98,11 +98,216 @@ I can think about reducung the size of my wiki source and making it available on >>>> I haven't actually deleted it), because the "or" operation is now done in >>>> the Perl code, rather than by merging pagespecs and translating. --[[smcv]] -[[!template id=gitbranch branch=smcv/ready/remove-pagespec-merge author="[[smcv]]"]] - >>>>> I've now added a patch to the end of that branch that deletes >>>>> `pagespec_merge` almost entirely (we do need to keep a copy around, in >>>>> ikiwiki-transition, but that copy doesn't have to be optimal or support >>>>> future features like [[tracking_bugs_with_dependencies]]). --[[smcv]] +--- + +Some questions on your optimize-depends branch. --[[Joey]] + +In saveindex it still or'd together the depends list, but the `{depends}` +field seems only useful for backwards compatability (ie, ikiwiki-transition +uses it still), and otherwise just bloats the index. + +> If it's acceptable to declare that downgrading IkiWiki requires a complete +> rebuild, I'm happy with that. I'd prefer to keep the (simple form of the) +> transition done automatically during a load/save cycle, rather than +> requiring ikiwiki-transition to be run; we should probably say in NEWS +> that the performance increase won't fully apply until the next +> rebuild. --[[smcv]] + +>> It is acceptable not to support downgrades. +>> I don't think we need a NEWS file update since any sort of refresh, +>> not just a full rebuild, will cause the indexdb to be loaded and saved, +>> enabling the optimisation. --[[Joey]] + +>>> A refresh will load the current dependencies from `{depends}` and save +>>> them as-is as a one-element `{dependslist}`; only a rebuild will replace +>>> the single complex pagespec with a long list of simpler pagespecs. +>>> --[[smcv]] + +Is an array the right data structure? 
`add_depends` has to loop through the +array to avoid dups, it would be better if a hash were used there. Since +inline (and other plugins) explicitly add all linked pages, each as a +separate item, the list can get rather long, and that single add_depends +loop has suddenly become O(N^2) to the number of pages, which is something +to avoid.. + +> I was also thinking about this (I've been playing with some stuff based on the +> `remove-pagespec-merge` branch). A hash, by itself, is not optimal because +> the dependency list holds two things: page names and page specs. The hash would +> work well for the page names, but you'll still need to iterate through the page specs. +> I was thinking of keeping a list and a hash. You use the list for pagespecs +> and the hash for individual page names. To make this work you need to adjust the +> API so it knows which you're adding. -- [[Will]] + +> I wasn't thinking about a lookup hash, just a dedup hash, FWIW. +> --[[Joey]] + +>> I was under the impression from previous code review that you preferred +>> to represent unordered sets as lists, rather than hashes with dummy +>> values. If I was wrong, great, I'll fix that and it'll probably go +>> a bit faster. --[[smcv]] + +>>> It depends, really. And it'd certianly make sense to benchmark such a +>>> change. --[[Joey]] + +>>>> Benchmarked, below. --[[smcv]] + +Also, since a lot of places are calling add_depends in a loop, it probably +makes sense to just make it accept a list of dependencies to add. It'll be +marginally faster, probably, and should allow for better optimisation +when adding a lot of depends at once. + +> That'd be an API change; perhaps marginally faster, but I don't +> see how it would allow better optimisation if we're de-duplicating +> anyway? --[[smcv]] + +>> Well, I was thinking that it might be sufficient to build a `%seen` +>> hash of dependencies inside `add_depends`, if the places that call +>> it lots were changed to just call it once. Of course the only way to +>> tell is benchmarking. --[[Joey]] + +>>> It doesn't seem that it significantly affects performance either way. +>>> --[[smcv]] + +In Render.pm, we now have a triply nested loop, which is a bit +scary for efficiency. It seems there should be a way to +rework this code so it can use the optimised `pagespec_match_list`, +and/or hoist some of the inner loop calculations (like the `pagename`) +out. + +> I don't think the complexity is any greater than it was: I've just +> moved one level of "loop" out of the generated Perl, to be +> in visible code. I'll see whether some of it can be hoisted, though. +> --[[smcv]] + +>> The call to `pagename` is the only part I can see that's clearly +>> run more often than before. That function is pretty inexpensive, but.. +>> --[[Joey]] + +>>> I don't see anything that can be hoisted without significant refactoring, +>>> actually. Beware that there are two pagename calls in the loop: one for +>>> `$f` (which is the page we might want to rebuild), and one for `$file` +>>> (which is the changed page that it might depend on). Note that I didn't +>>> choose those names! +>>> +>>> The three loops are over source files, their lists of dependency pagespecs, +>>> and files that might have changed. I see the following things we might be +>>> doing redundantly: +>>> +>>> * If `$file` is considered as a potential dependency for more than +>>> one `$f`, we evaluate `pagename($file)` more than once. 
Potential fix: +>>> cache them (this turns out to save about half a second on the docwiki, +>>> see below). +>>> * If several pages depend on the same pagespec, we evaluate whether each +>>> changed page matches that pagespec more than once: however, we do so +>>> with a different location parameter every time, so repeated calls are, +>>> in the general case, the only correct thing to do. Potential fix: +>>> perhaps special-case "page x depends on page y and nothing else" +>>> (i.e. globs that have no wildcards) into a separate hash? I haven't +>>> done anything in this direction. +>>> * Any preparatory work done by pagespec_match (converting the pagespec +>>> into Perl, mostly?) is done in the inner loop; switching to +>>> pagespec_match_list (significant refactoring) saves more than half a +>>> second on the docwiki. +>>> +>>> --[[smcv]] + +Very good catch on img/meta using the wrong dependency; verified in the wild! +(I've cherry-picked those bug fixes.) + +---- + +Benchmarking results: I benchmarked by altering docwiki.setup to switch off +verbose, running "make clean && ./Makefile.PL && make", and timing one rebuild +of the docwiki followed by three refreshes. Before each refresh I used +`touch plugins/*.mdwn` to have something significant to refresh. + +I'm assuming that "user" CPU time is the important thing here (system time was +relatively small in all cases, up to 0.35 seconds per run). + +master at the time of rebasing: 14.20s to rebuild, 10.04/12.07/14.01s to +refresh. I think you can see the bug clearly here - the pagespecs are getting +more complicated every time! + +> I can totally see a bug here, and it's one I didn't think existed. Ie, +> I thought that after the first refresh, the pagespec should stabalize, +> and what it stabalized to was probably unnecessarily long, but not +> growing w/o bounds! +> +> a) Explains why ikiwiki.info has been so slow lately. Well that and some +> other things that overloaded the system. +> b) Suggests to me we will probably want to force a rebuild on upgrade +> when fixing this (via the mechanism in the postinst). +> +> I've investigated why the pagespecs keep growing: When page A changes, +> its old depends are cleared. Then +> page B that inlines A gets rebuilt, and its old depends are also cleared. +> But page B also inlines page C; which means C gets re-rendered. And this +> happens w/o its old depends being cleared, so C's depends are doubled. +> --[[Joey]] + +After the initial optimization: 14.27s to rebuild, 8.26/8.33/8.26 to refresh. +Success! + +Not pre-joining dependencies actually took about ~0.2s more; I don't know why. +I'm worried that duplicates will just build up (again) in less simple cases, +though, so 0.2s is probably a small price to pay for that not happening (it +might well be experimental error, for that matter). + +> It's weird that the suggested optimisations to +> `add_depends` had no effect. So, the commit message to +> b6fcb1cb0ef27e5a63184440675d465fad652acf is actually wrong.. ? --[[Joey]] + +>> I'll try benchmarking again on the non-public wiki where I had the 4% +>> speedup. The docwiki is so small that 4% is hard to measure... --[[smcv]] + +Not saving {depends} to the index, using a hash instead of a list to +de-duplicate, and allowing add_depends to take an arrayref instead of a single +pagespec had no noticable positive or negative effect on this test. + +> I see e4cd168ebedd95585290c97ff42234344bfed46c is still in your branch +> though. 
I don't like using an arrayref, it could just take `($page, @depends)`. +> and I don't see the need to keep it if it doesn't currently help. + +>> I'll drop it. --[[smcv]] + +> Is there any reason to keep 7227c2debfeef94b35f7d81f42900aa01820caa3 +> if it doesn't improve speed? +> --[[Joey]] + +>> I'll try benchmarking on a more complex wiki and see whether it has a +>> positive or negative effect. It does avoid being O(n**2) in number of +>> dependencies. --[[smcv]] + +Memoizing the results of pagename brought the rebuild time down to 14.06s +and the refresh time down to 7.96/7.92/7.92, a significant win. + +> Ok, that seems safe to memoize. (It's a real function and it isn't +> called with a great many inputs.) Why did you chose to memoize it +> explicitly rather than adding it to the memoize list at the top? + +>> It does depend on global variables, so using Memoize seemed like asking for +>> trouble. I suppose what I did is equivalent to Memoize though... --[[smcv]] + +Refactoring to use pagespec_match_list looks more risky from a code churn +point of view; rebuild now takes 14.35s, but refresh is only 7.30/7.29/7.28, +another significant win. + +--[[smcv]] + +> I had mostly convinced myself that +> `pagespec_match_list` would not lead to a speed gain here. My reasoning +> was that you want to stop after finding one match, while `pagespec_match_list` +> checks all pages for matches. So what we're seeing is that +> on a rebuild, `@changed` is all pages, and not short-circuiting leads +> to unnecessary work. OTOH, on refresh, `@changed` is small and I suppose +> `pagespec_match_list`'s other slight efficiencies win out somehow. +> +> Welcome to the "I made ikiwiki twice as fast +> and all I got was this lousy git sha1sum" club BTW :-) --[[Joey]] + [[!tag wishlist patch patch/core]] diff --git a/doc/todo/smarter_sorting.mdwn b/doc/todo/smarter_sorting.mdwn new file mode 100644 index 000000000..901e143a7 --- /dev/null +++ b/doc/todo/smarter_sorting.mdwn @@ -0,0 +1,141 @@ +I benchmarked a build of a large wiki (my home wiki), and it was spending +quite a lot of time sorting; `CORE::sort` was called only 1138 times, but +still flagged as the #1 time sink. (I'm not sure I trust NYTProf fully +about that FWIW, since it also said 27238263 calls to `cmp_age` were +the #3 timesink, and I suspect it may not entirely accurately measure +the overhead of so many short function calls.) + +`pagespec_match_list` currently always sorts *all* pages first, and then +finds the top M that match the pagespec. That's innefficient when M is +small (as for example in a typical blog, where only 20 posts are shown, +out of maybe thousands). + +As [[smcv]] noted, It could be flipped, so the pagespec is applied first, +and then sort the smaller matching set. But, checking pagespecs is likely +more expensive than sorting. (Also, influence calculation complicates +doing that.) + +Another option, when there is a limit on M pages to return, might be to +cull the M top pages without sorting the rest. + +> The patch below implements this. +> +> But, I have not thought enough about influence calculation. +> I need to figure out which pagespec matches influences need to be +> accumulated for in order to determine all possible influences of a +> pagespec are known. +> +> The old code accumulates influences from matching all successful pages +> up to the num cutoff, as well as influences from an arbitrary (sometimes +> zero) number of failed matches. 
New code does not accumulate influences +> from all the top successful matches, only an arbitrary group of +> successes and some failures. +> +> Also, by the time I finished this, it was not measuarably faster than +> the old method. At least not with a few thousand pages; it +> might be worth revisiting this sometime for many more pages? [[done]] +> --[[Joey]] + +<pre> +diff --git a/IkiWiki.pm b/IkiWiki.pm +index 1730e47..bc8b23d 100644 +--- a/IkiWiki.pm ++++ b/IkiWiki.pm +@@ -2122,36 +2122,54 @@ sub pagespec_match_list ($$;@) { + my $num=$params{num}; + delete @params{qw{num deptype reverse sort filter list}}; + +- # when only the top matches will be returned, it's efficient to +- # sort before matching to pagespec, +- if (defined $num && defined $sort) { +- @candidates=IkiWiki::SortSpec::sort_pages( +- $sort, @candidates); +- } +- ++ # Find the first num matches (or all), before sorting. + my @matches; +- my $firstfail; + my $count=0; + my $accum=IkiWiki::SuccessReason->new(); +- foreach my $p (@candidates) { +- my $r=$sub->($p, %params, location => $page); ++ my $i; ++ for ($i=0; $i < @candidates; $i++) { ++ my $r=$sub->($candidates[$i], %params, location => $page); + error(sprintf(gettext("cannot match pages: %s"), $r)) + if $r->isa("IkiWiki::ErrorReason"); + $accum |= $r; + if ($r) { +- push @matches, $p; ++ push @matches, $candidates[$i]; + last if defined $num && ++$count == $num; + } + } + ++ # We have num natches, but they may not be the best. ++ # Efficiently find and add the rest, without sorting the full list of ++ # candidates. ++ if (defined $num && defined $sort) { ++ @matches=IkiWiki::SortSpec::sort_pages($sort, @matches); ++ ++ for ($i++; $i < @candidates; $i++) { ++ # Comparing candidate with lowest match is cheaper, ++ # so it's done before testing against pagespec. ++ if (IkiWiki::SortSpec::cmptwo($candidates[$i], $matches[-1], $sort) < 0 && ++ $sub->($candidates[$i], %params, location => $page) ++ ) { ++ # this could be done less expensively ++ # using a binary search ++ for (my $j=0; $j < @matches; $j++) { ++ if (IkiWiki::SortSpec::cmptwo($candidates[$i], $matches[$j], $sort) < 0) { ++ splice @matches, $j, $#matches-$j+1, $candidates[$i], ++ @matches[$j..$#matches-1]; ++ last; ++ } ++ } ++ } ++ } ++ } ++ + # Add simple dependencies for accumulated influences. +- my $i=$accum->influences; +- foreach my $k (keys %$i) { +- $depends_simple{$page}{lc $k} |= $i->{$k}; ++ my $inf=$accum->influences; ++ foreach my $k (keys %$inf) { ++ $depends_simple{$page}{lc $k} |= $inf->{$k}; + } + +- # when all matches will be returned, it's efficient to +- # sort after matching ++ # Sort if we didn't already. + if (! defined $num && defined $sort) { + return IkiWiki::SortSpec::sort_pages( + $sort, @matches); +@@ -2455,6 +2473,12 @@ sub sort_pages { + sort $f @_ + } + ++sub cmptwo { ++ $a=$_[0]; ++ $b=$_[1]; ++ $_[2]->(); ++} ++ + sub cmp_title { + IkiWiki::pagetitle(IkiWiki::basename($a)) + cmp +</pre> + +This would be bad when M is very large, and particularly, of course, when +there is no limit and all pages are being matched on. (For example, an +archive page shows all pages that match a pagespec specifying a creation +date range.) Well, in this case, it *does* make sense to flip it, limit by +pagespe first, and do a (quick)sort second. (No influence complications, +either.) + +> Flipping when there's no limit implemented, and it knocked 1/3 off +> the rebuild time of my blog's archive pages. 
--[[Joey]] + +Adding these special cases will be more complicated, but I think the best +of both worlds. --[[Joey]] diff --git a/doc/todo/sort_parameter_for_map_plugin_and_directive.mdwn b/doc/todo/sort_parameter_for_map_plugin_and_directive.mdwn new file mode 100644 index 000000000..f6ccaf538 --- /dev/null +++ b/doc/todo/sort_parameter_for_map_plugin_and_directive.mdwn @@ -0,0 +1,7 @@ +## sort= parameter + +Having a `sort=` parameter for the map plugin/directive would be real nice; like `inline`'s parameter, with `age`, `title`, etc. + +I may hack one in from `inline` if it seem within my skill level. + +[[!tag wishlist]] diff --git a/doc/todo/source_link.mdwn b/doc/todo/source_link.mdwn index 4a8285948..cf3e69487 100644 --- a/doc/todo/source_link.mdwn +++ b/doc/todo/source_link.mdwn @@ -11,8 +11,7 @@ All of this code is licensed under the GPLv2+. -- [[Will]] > by loading the index with IkiWiki::loadindex, like [[plugins/goto]] does? > --[[smcv]] -[[!template id=gitbranch branch=smcv/ready/getsource - author="[[Will]]/[[smcv]]"]] +[[done]] >> I've applied the patch below in a git branch, fixed my earlier criticism, >> and also fixed a couple of other issues I noticed: @@ -132,3 +131,5 @@ All of this code is licensed under the GPLv2+. -- [[Will]] } 1 + +[[done]] --[[smcv]] diff --git a/doc/todo/structured_page_data.mdwn b/doc/todo/structured_page_data.mdwn index 72bfd8dea..9f21fab7f 100644 --- a/doc/todo/structured_page_data.mdwn +++ b/doc/todo/structured_page_data.mdwn @@ -1,5 +1,7 @@ This is an idea from [[JoshTriplett]]. --[[Joey]] +* See further discussion at [[forum/an_alternative_approach_to_structured_data]]. + Some uses of ikiwiki, such as for a bug-tracking system (BTS), move a bit away from the wiki end of the spectrum, and toward storing structured data about a page or instead of a page. @@ -251,6 +253,9 @@ in a large number of other cases. > dependencies between bugs from arbitrary links. >> This issue (the need for distinguished kinds of links) has also been brought up in other discussions: [[tracking_bugs_with_dependencies#another_kind_of_links]] (deps vs. links) and [[tag_pagespec_function]] (tags vs. links). --Ivan Z. +>>> And multiple link types are now supported; plugins can set the link +>>> type when registering a link, and pagespec functions can match on them. --[[Joey]] + ---- #!/usr/bin/perl diff --git a/doc/todo/support_link__40__.__41___in_pagespec.mdwn b/doc/todo/support_link__40__.__41___in_pagespec.mdwn new file mode 100644 index 000000000..79809662a --- /dev/null +++ b/doc/todo/support_link__40__.__41___in_pagespec.mdwn @@ -0,0 +1,13 @@ +[[!tag wishlist]] + +It would be nice to have pagespecs support "link(.)" as syntax. +This would match pages that link to the page that invokes the pagespec. +The use case is a blog with tags, and having a page for each tag +which uses !inline to list all posts with the tag. + +Joey said on IRC that "probably changing the derel() function in +IkiWiki.pm is the best way to do it". + +> I implemented this suggestion in the simplest possible way, [[!taglink patch]] available [[here|http://git.oblomov.eu/ikiwiki/patch/f4a52de556436fdee00fd92ca9a3b46e876450fa]]. +> An alternative approach, very similar, would be to make the empty page parameter mean current page (e.g. `link()` would mean pages linking here). The patch would be very similar. +> -- GB diff --git a/doc/todo/svg.mdwn b/doc/todo/svg.mdwn index 0a15af4cd..274ebf3e3 100644 --- a/doc/todo/svg.mdwn +++ b/doc/todo/svg.mdwn @@ -3,6 +3,7 @@ We should support SVG. 
In particular: * We could support rendering SVGs to PNGs when compiling the wiki. Not all browsers support SVG yet. * We could support editing SVGs via the web interface. SVG can contain unsafe content such as scripting, so we would need to whitelist safe markup. + * I am interested in seeing [svg-edit](http://code.google.com/p/svg-edit/) integrated -- [[EricDrechsel]] --[[JoshTriplett]] @@ -56,3 +57,21 @@ in the trunk if other people think it's useful. [htmlscrubber.pm]:http://xbeta.org/gitweb/?p=xbeta/ikiwiki.git;a=blob;f=IkiWiki/Plugin/htmlscrubber.pm;h=3c0ddc8f25bd8cb863634a9d54b40e299e60f7df;hb=fe333c8e5b4a5f374a059596ee698dacd755182d [diff]: http://xbeta.org/gitweb/?p=xbeta/ikiwiki.git;a=blobdiff;f=IkiWiki/Plugin/htmlscrubber.pm;h=3c0ddc8f25bd8cb863634a9d54b40e299e60f7df;hp=3bdaccea119ec0e1b289a0da2f6d90e2219b8d66;hb=fe333c8e5b4a5f374a059596ee698dacd755182d;hpb=be0b4f603f918444b906e42825908ddac78b7073 + +> Unfortuantly these links are broken. --[[Joey]] + +* * * + +Actually, there's a way to embed SVG into MarkDown sources using the [data: URI scheme][rfc2397], [like this](data:image/svg+xml;base64,PD94bWwgdmVyc2lvbj0iMS4wIiBzdGFuZGFsb25lPSJubyI/Pgo8c3ZnIHdpZHRoPSIxOTIiIGhlaWdodD0iMTkyIiB4bWxuczp4bGluaz0iaHR0cDovL3d3dy53My5vcmcvMTk5OS94bGluayIgeG1sbnM9Imh0dHA6Ly93d3cudzMub3JnLzIwMDAvc3ZnIj4KIDwhLS0gQ3JlYXRlZCB3aXRoIFNWRy1lZGl0IC0gaHR0cDovL3N2Zy1lZGl0Lmdvb2dsZWNvZGUuY29tLyAtLT4KIDx0aXRsZT5IZWxsbywgd29ybGQhPC90aXRsZT4KIDxnPgogIDx0aXRsZT5MYXllciAxPC90aXRsZT4KICA8ZyB0cmFuc2Zvcm09InJvdGF0ZSgtNDUsIDk3LjY3MTksIDk3LjY2OCkiIGlkPSJzdmdfNyI+CiAgIDxyZWN0IHN0cm9rZS13aWR0aD0iNSIgc3Ryb2tlPSIjMDAwMDAwIiBmaWxsPSIjRkYwMDAwIiBpZD0ic3ZnXzUiIGhlaWdodD0iNTYuMDAwMDAzIiB3aWR0aD0iMTc1IiB5PSI2OS42Njc5NjkiIHg9IjEwLjE3MTg3NSIvPgogICA8dGV4dCB4bWw6c3BhY2U9InByZXNlcnZlIiB0ZXh0LWFuY2hvcj0ibWlkZGxlIiBmb250LWZhbWlseT0ic2VyaWYiIGZvbnQtc2l6ZT0iMjQiIHN0cm9rZS13aWR0aD0iMCIgc3Ryb2tlPSIjMDAwMDAwIiBmaWxsPSIjZmZmZjAwIiBpZD0ic3ZnXzYiIHk9IjEwNS42NjgiIHg9Ijk5LjY3MTkiPkhlbGxvLCB3b3JsZCE8L3RleHQ+CiAgPC9nPgogPC9nPgo8L3N2Zz4=). +Of course, this way to display an image one needs to click a link, but it may be considered a feature. +— [[Ivan_Shmakov]], 2010-03-12Z. + +[rfc2397]: http://tools.ietf.org/html/rfc2397 + +> You can do the same with img src actually. +> +> If svg markup allows unsafe elements (ie, javascript), +> which it appears to, +> then this is a security hole, and the htmlscrubber +> needs to lock it down more. Darn, now I have to spend my afternoon making +> security releases! --[[Joey]] diff --git a/doc/todo/tagging_with_a_publication_date.mdwn b/doc/todo/tagging_with_a_publication_date.mdwn index 80240ec5a..39fc4e220 100644 --- a/doc/todo/tagging_with_a_publication_date.mdwn +++ b/doc/todo/tagging_with_a_publication_date.mdwn @@ -38,3 +38,34 @@ on vacation". > > > > I no longer have the original wiki for which I wanted this feature, but I can > > see using it on future ones. -- [[DonMarti]] + +>>> FWIW, for the case where one wants to update a site offline, +>>> using an ikiwiki instance on a laptop, and include some deffered +>>> posts in the push, the ad-hoc cron job type approach will be annoying. +>>> +>>> In modern ikiwiki, I guess the way to accomplish this would be to +>>> add a pagespec that matches only pages posted in the present or past. +>>> Then a page can have its post date set to the future, using meta date, +>>> and only show up when its post date rolls around. 
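(A purely illustrative sketch of the pagespec idea above; the `published()` name and the details are assumptions, not existing ikiwiki code:)

    package IkiWiki::PageSpec;

    # Hypothetical match function: a page matches only once its creation
    # time (possibly overridden with meta date) is no longer in the future.
    sub match_published ($$;@) {
        my $page=shift;
        shift;  # takes no parameter

        my $ctime=$IkiWiki::pagectime{$page};
        if (defined $ctime && $ctime <= time) {
            return IkiWiki::SuccessReason->new("$page has been published");
        }
        else {
            return IkiWiki::FailReason->new("$page is dated in the future");
        }
    }

By itself, of course, this does nothing to make ikiwiki notice when a page's date finally rolls around, which is exactly the problem discussed next.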
+>>> +>>> Ikiwiki will need to somehow notice that a pagespec began matching +>>> a page it did not match previously, despite said page not actually +>>> changing. I'm not sure what the best way is. +>>> +>>> * One way could be to +>>> use a needsbuild hook and some stored data about which pagespecs +>>> exclude pages in the future. (But I'm not sure how evaluating the +>>> pagespec could lead to that metadata and hook being set up.) +>>> * Another way would be to use an explicit directive to delay a +>>> page being posted. Then the directive stores the metadata and +>>> sets up the needsbuild hook. +>>> * Another way would be for ikiwiki to remember the last +>>> time it ran. It could then easily find pages that have a post +>>> date after that time, and treat them the same as it treats actually +>>> modified files. Or a plugin could do this via a needsbuild hook, +>>> probably. (Only downside to this is it would probably need to do +>>> a O(n) walk of the list of pages -- but only running an integer +>>> compare per page.) +>>> +>>> You'd still need a cron job to run ikiwiki -refresh every hour, or +>>> whatever, so it can update. --[[Joey]] diff --git a/doc/todo/tmplvars_plugin/discussion.mdwn b/doc/todo/tmplvars_plugin/discussion.mdwn new file mode 100644 index 000000000..93cb9b414 --- /dev/null +++ b/doc/todo/tmplvars_plugin/discussion.mdwn @@ -0,0 +1 @@ +I find this plugin quite usefull. But one thing, I would like to be able to do is set a tmplvar e.g. in a sidebar so that it affects all Pages this sidebar is used in. --martin diff --git a/doc/todo/toc_plugin:_set_a_header_ceiling___40__opposite_of_levels__61____41__.mdwn b/doc/todo/toc_plugin:_set_a_header_ceiling___40__opposite_of_levels__61____41__.mdwn index 547c7a80a..07d2d383c 100644 --- a/doc/todo/toc_plugin:_set_a_header_ceiling___40__opposite_of_levels__61____41__.mdwn +++ b/doc/todo/toc_plugin:_set_a_header_ceiling___40__opposite_of_levels__61____41__.mdwn @@ -1,3 +1,48 @@ It would be nice if the [[plugins/toc]] plugin let you specify a header level "ceiling" above which (or above and including which) the headers would not be incorporated into the toc. Currently, the levels=X parameter lets you tweak how deep it will go for small headers, but I'd like to chop off the h1's (as I use them for my page title) -- [[Jon]] + +> This change to toc.pm should do it. --[[KathrynAndersen]] + +> > The patch looks vaguely OK to me but it's hard to tell without +> > context. It'd be much easier to review if you used unified diff +> > (`diff -u`), which is what `git diff` defaults to - almost all +> > projects prefer to receive changes as unified diffs (or as +> > branches in their chosen VCS, which is [[git]] here). --[[smcv]] + +> > > Done. -- [[KathrynAndersen]] + +> > > > Looks like Joey has now [[merged|done]] this. Thanks! --[[smcv]] + + --- /files/git/other/ikiwiki/IkiWiki/Plugin/toc.pm 2009-11-16 12:44:00.352050178 +1100 + +++ toc.pm 2009-12-26 06:36:06.686512552 +1100 + @@ -53,8 +53,8 @@ + my $page=""; + my $index=""; + my %anchors; + - my $curlevel; + - my $startlevel=0; + + my $startlevel=($params{startlevel} ? $params{startlevel} : 0); + + my $curlevel=$startlevel-1; + my $liststarted=0; + my $indent=sub { "\t" x $curlevel }; + $p->handler(start => sub { + @@ -67,10 +67,16 @@ + + # Take the first header level seen as the topmost level, + # even if there are higher levels seen later on. + + # unless we're given startlevel as a parameter + if (! 
$startlevel) { + $startlevel=$level; + $curlevel=$startlevel-1; + } + + elsif (defined $params{startlevel} + + and $level < $params{startlevel}) + + { + + return; + + } + elsif ($level < $startlevel) { + $level=$startlevel; + } + +[[!tag patch]] diff --git a/doc/todo/tracking_bugs_with_dependencies.mdwn b/doc/todo/tracking_bugs_with_dependencies.mdwn index a198530fc..456dadad0 100644 --- a/doc/todo/tracking_bugs_with_dependencies.mdwn +++ b/doc/todo/tracking_bugs_with_dependencies.mdwn @@ -81,6 +81,9 @@ I like the idea of [[tips/integrated_issue_tracking_with_ikiwiki]], and I do so >> I saw that this issue is targeted at by the work on [[structured page data#another_kind_of_links]]. --Ivan Z. +>>> It's fixed now; links can have a type, such as "tag", or "dependency", +>>> and pagespecs can match links of a given typo. --[[Joey]] + Okie - I've had a quick attempt at this. Initial patch attached. This one doesn't quite work. And there is still a lot of debugging stuff in there. @@ -360,7 +363,10 @@ account all comments above (which doesn't mean it is above reproach :) ). --[[W > --[[Joey]] >> There is one issue that I've been thinking about that I haven't raised anywhere (or checked myself), and that is how this all interacts with page dependencies. ->> Firstly, I'm not sure anymore that the `pagespec_merge` function will continue to work in all cases. +>> +>>> I've moved the discussion of that to [[dependency_types]]. --[[Joey]] +>> +>> I'm not sure anymore that the `pagespec_merge` function will continue to work in all cases. >>> The problem I can see there is that if two pagespecs >>> get merged and both use `~foo` but define it differently, @@ -410,92 +416,10 @@ account all comments above (which doesn't mean it is above reproach :) ). --[[W >>>>> then the last definition (baz) takes precedence. >>>>> In the process of writing this I think I've come up with a way to change this back the way it was, still using closures. -- [[Will]] ->>> Alternatively, my [[remove-pagespec-merge|should_optimise_pagespecs]] ->>> branch solves this, in a Gordian knot sort of way :-) --[[smcv]] - ->> Secondly, it seems that there are two types of dependency, and ikiwiki ->> currently only handles one of them. The first type is "Rebuild this ->> page when any of these other pages changes" - ikiwiki handles this. ->> The second type is "rebuild this page when set of pages referred to by ->> this pagespec changes" - ikiwiki doesn't seem to handle this. I ->> suspect that named pagespecs would make that second type of dependency ->> more important. I'll try to come up with a good example. -- [[Will]] - ->>> Hrm, I was going to build an example of this with backlinks, but it ->>> looks like that is handled as a special case at the moment (line 458 of ->>> render.pm). I'll see if I can breapk ->>> things another way. Fixing this properly would allow removal of that special case. -- [[Will]] - ->>>> I can't quite understand the distinction you're trying to draw ->>>> between the two types of dependencies. Backlinks are a very special ->>>> case though and I'll be suprised if they fit well into pagespecs. ->>>> --[[Joey]] - ->>>>> The issue is that the existential pagespec matching allows you to build things that have similar ->>>>> problems to backlinks. ->>>>> e.g. the following inline: - - \[[!inline pages="define(~done, link(done)) and link(~done)" archive=yes]] - ->>>>> includes any page that links to a page that links to done. 
Now imagine I add a new link to 'done' on ->>>>> some random page somewhere - a page which some other page links to which didn't previously get included - the set of pages accepted by the pagespec, and hence the set of ->>>>> pages inlined, will change. But, there is no dependency anywhere on the page that I altered, so ->>>>> ikiwiki will not rebuild the page with the inline in it. What is happening is that the page that I altered affects ->>>>> the set of pages matched by the pagespec without itself being matched by the pagespec, and hence included in the dependency list. - ->>>>> To make this work well, I think you need to recognise two types of dependencies for each page (and no ->>>>> special cases for particular types of links, eg backlinks). The first type of dependency says, "The content of ->>>>> this page depends upon the content of these other pages". The `add_depends()` in the shortcuts ->>>>> plugin is of this form: any time the shortcuts page is edited, any page with a shortcut on it ->>>>> is rebuilt. The inline plugin also needs to add dependencies of this form to detect when the inlined ->>>>> content changes. By contrast, the map plugin does not need a dependency of this form, because it ->>>>> doesn't actually care about the content of any pages, just which pages it needs to include (which we'll handle next). - ->>>>> The second type of dependency says, "The content of this page depends upon the exact set of pages matched ->>>>> by this pagespec". The first type of dependency was about the content of some pages, the second type is about ->>>>> which pages get matched by a pagespec. This is the type of dependency tracking that the map plugin needs. ->>>>> If the set of pages matched by map pagespec changes, then the page with the map on it needs to be rebuilt to show a different list of pages. ->>>>> Inline needs this type of dependency as well as the previous type - This type handles a change in which pages ->>>>> are inlined, the previous type handles a change in the content of any of those pages. Shortcut does not need this type of ->>>>> dependency. Most of the places that use `add_depends()` seem to need this type of dependency rather than the first type. - ->>>>>> Note that inline and map currently achieve the second type of dependency by ->>>>>> explicitly calling `add_depends` for each page the displayed. ->>>>>> If any of those pages are removed, the regular pagespec would not ->>>>>> match them -- since they're gone. However, the explicit dependency ->>>>>> on them does cause them to match. It's an ugly corner I'd like to ->>>>>> get rid of. --[[Joey]] - ->>>>> Implementation Details: The first type of dependency can be handled very similarly to the current ->>>>> dependency system. You just need to keep a list of pages that the content depends upon. You could ->>>>> keep that list as a pagespec, but if you do this you might want to check that the pagespec doesn't change, ->>>>> possibly by adding a dependency of the second type along with the dependency of the first type. - ->>>>>> An example of the current system not tracking enough data is ->>>>>> where A inlines B which inlines C. A change to C will cause B to ->>>>>> rebuild, but A will not "notice" that B has implicitly changed. ->>>>>> That example suggests it might be fixable without explicitly storing ->>>>>> data, by causing a rebuild of B to be treated as a change to B. ->>>>>> --[[Joey]] - ->>>>> The second type of dependency is a little more tricky. 
For each page, we'd need a list of pagespecs that ->>>>> the page depended on, and for each pagespec you'd want to store the list of pages that currently match it. ->>>>> On refresh, you'd need to check each pagespec to see if the set of pages that match it has changed, and if ->>>>> that set has changed, then rebuild the dependent page(s). Oh, and for this second type of dependency, I ->>>>> don't think you can merge pagespecs. If I wanted to know if either "\*" or "link(done)" changes, then just checking ->>>>> to see if the set of pages matched by "\* or link(done)" changes doesn't work. - ->>>>> The current system works because even though you usually want dependencies of the second type, the set of pages ->>>>> referred to by a pagespec can only change if one of those pages itself changes. i.e. A dependency check of the ->>>>> first type will catch a dependency change of the second type with current pagespecs. ->>>>> This doesn't work with backlinks, and it doesn't work with existential matching. Backlinks are currently special-cased. I don't know ->>>>> how to special-case existential matching - I suspect you're better off just getting the dependency tracking right. - ->>>>> I also tried to come up with other possible solutions: e.g. can we find the dependencies for a pagespec? That ->>>>> would be the set of pages where a change on one of those pages could lead to a change in the set of pages matched by the pagespec. ->>>>> For old-style pagespecs without backlinks, the dependency set for a pagespec is the same as the set of pages the pagespec matches. ->>>>> Unfortunately, with existential matching, the set of pages that each ->>>>> pagespec depends upon can quickly become "*", which is not very useful. -- [[Will]] +>>> My [[remove-pagespec-merge|should_optimise_pagespecs]] branch has now +>>> solved all this by deleting the offending function :-) --[[smcv]] + + Patch updated to use closures rather than inline generated code for named pagespecs. Also includes some new use of ErrorReason where appropriate. -- [[Will]] diff --git a/doc/todo/two-way_convert_of_wikis.mdwn b/doc/todo/two-way_convert_of_wikis.mdwn new file mode 100644 index 000000000..61f02a30b --- /dev/null +++ b/doc/todo/two-way_convert_of_wikis.mdwn @@ -0,0 +1,15 @@ + +[[!tag wishlist]] + +Ok, the vision is this: Some of you will know git-svn. I want something like +git-svn,, but for wikis. I want to be able to do the following: + +1. Convert a moinmoin (or whatever) wiki to a local ikiwiki on my laptop. +2. Edit my local copy (offline). +3. Preview the changes with my local ikiwki installation + browser. +4. Push the changes back to moinmoin (or whatever) wiki. + +I know, I know, ikiwiki wasn't designed for that, but it would be really cool, +and useful and people ask for that kind of thing too. + +--[[David_Riebenbauer]] diff --git a/doc/todo/user-defined_templates_outside_the_wiki.mdwn b/doc/todo/user-defined_templates_outside_the_wiki.mdwn new file mode 100644 index 000000000..1d72aa6a7 --- /dev/null +++ b/doc/todo/user-defined_templates_outside_the_wiki.mdwn @@ -0,0 +1,10 @@ +[[!tag wishlist]] + +The [[plugins/contrib/ftemplate]] plugin looks for templates inside the wiki +source, but also looks in the system templates directory (the one with +`page.tmpl`). This means the wiki admin can provide templates that can be +invoked via `\[[!template]]`, but don't have to "work" as wiki pages in their +own right. I think the normal [[plugins/template]] plugin could benefit from +this functionality. 
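As an illustration (the template name here is invented): if the admin drops a `note.tmpl` containing `<TMPL_VAR text>` into the configured templatedir, pages could then use

    \[[!template id=note text="Some boxed-out remark."]]

without `note.tmpl` itself needing to exist as a page in the wiki.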
+ +[[done]] --[[Joey]] diff --git a/doc/todo/want_to_avoid_ikiwiki_using_http_or_https_in_urls_to_allow_serving_both.mdwn b/doc/todo/want_to_avoid_ikiwiki_using_http_or_https_in_urls_to_allow_serving_both.mdwn index 65b7cd96a..5f17e3ae0 100644 --- a/doc/todo/want_to_avoid_ikiwiki_using_http_or_https_in_urls_to_allow_serving_both.mdwn +++ b/doc/todo/want_to_avoid_ikiwiki_using_http_or_https_in_urls_to_allow_serving_both.mdwn @@ -26,4 +26,6 @@ which seems to do the right thing in page.tmpl, but not for change.tmpl. Where i > The use of an absolute baseurl in change.tmpl is a special case. --[[Joey]] +So I'm facing this same issue. I have a wiki which needs to be accessed on three different URLs(!) and the hard coding of the URL from the setup file is becoming a problem for me. Is there anything I can do here? --[[Perry]] + [[wishlist]] diff --git a/doc/todo/wrapperuser.mdwn b/doc/todo/wrapperuser.mdwn new file mode 100644 index 000000000..4c42b046f --- /dev/null +++ b/doc/todo/wrapperuser.mdwn @@ -0,0 +1,7 @@ +ikiwiki's .setup file can specify wrappergroup, and ikiwiki will set the group +of the wrappers accordingly. Having had people encounter difficulty before +when trying to do the same thing with users (for instance, making all wrappers +6755 ikiwiki:ikiwiki), I think it would help to have "wrapperuser". This could +only actually take effect if building the wrappers as root (not really the best +plan), but ikiwiki could at least warn if wrapperuser does not match the user +the wrapper will end up with. diff --git a/doc/translation.mdwn b/doc/translation.mdwn index 459f47eb5..9d874d98e 100644 --- a/doc/translation.mdwn +++ b/doc/translation.mdwn @@ -23,9 +23,13 @@ essentially three pieces needed for a complete translation: file, and in preprocessor directives. 1. The [[basewiki]] needs to be translated. The - [[plugins/contrib/po]] ikiwiki plugin will allow translating + [[plugins/po]] ikiwiki plugin will allow translating wikis using po files and can be used for this. + There is now a website, [l10n.ikiwiki.info](http://l10n.ikiwiki.info) + that both demos the translated basewiki, and allows easy translation of + it. + To generate the po and pot files for translating the basewiki, get ikiwiki's source, edit the `po/underlay.setup` file, adding your language. Then run 'make -C po underlays`. diff --git a/doc/usage.mdwn b/doc/usage.mdwn index 0c618de5c..840d105d2 100644 --- a/doc/usage.mdwn +++ b/doc/usage.mdwn @@ -32,14 +32,22 @@ These options control the mode that ikiwiki operates in. * --setup setupfile - In setup mode, ikiwiki reads the config file, which is really a perl - program that can call ikiwiki internal functions. - The default action when --setup is specified is to automatically generate wrappers for a wiki based on data in a setup file, and rebuild the wiki. If you only want to build any changed pages, you can use --refresh with --setup. +* --changesetup setupfile + + Reads the setup file, adds any configuration changes specified by other + options, and writes the new configuration back to the setup file. Also + updates any configured wrappers. In this mode, the wiki is not fully + rebuilt, unless you also add --rebuild. + + Example, to enable some plugins: + + ikiwiki --changesetup ~/ikiwiki.setup --plugin goodstuff --plugin calendar + * --dumpsetup setupfile Causes ikiwiki to write to the specified setup file, dumping out @@ -50,6 +58,14 @@ These options control the mode that ikiwiki operates in. 
If used with --setup --refresh, this makes it also update any configured wrappers. +* --clean + + This makes ikiwiki clean up by removing any files it generated in the + `destination` directory, as well as any configured wrappers, and the + `.ikiwiki` state directory. This is mostly useful if you're running + ikiwiki in a Makefile to build documentation and want a corresponding + `clean` target. + * --cgi Enable [[CGI]] mode. In cgi mode ikiwiki runs as a cgi script, and @@ -112,10 +128,11 @@ also be configured using a setup file. * --templatedir dir - Specify the directory that the page [[templates|wikitemplates]] are stored in. + Specify the directory that [[templates|templates]] are stored in. Default is `/usr/share/ikiwiki/templates`, or another location as configured at build time. If the templatedir is changed, missing templates will still - be searched for in the default location as a fallback. + be searched for in the default location as a fallback. Templates can also be + placed in the "templates/" subdirectory of the srcdir. Note that if you choose to copy and modify ikiwiki's templates, you will need to be careful to keep them up to date when upgrading to new versions of @@ -226,6 +243,12 @@ also be configured using a setup file. Specifies a regexp of source files to exclude from processing. May be specified multiple times to add to exclude list. +* --include regexp + + Specifies a regexp of source files that would normally be excluded, + but that you wish to include in processing. + May be specified multiple times to add to include list. + * --adminuser name Specifies a username of a user (or, if openid is enabled, an openid) @@ -249,8 +272,8 @@ also be configured using a setup file. Makes ikiwiki look in the specified directory first, before the regular locations when loading library files and plugins. For example, if you set - libdir to "/home/you/.ikiwiki/", you can install a Foo.pm plugin as - "/home/you/.ikiwiki/IkiWiki/Plugin/Foo.pm". + libdir to "/home/you/.ikiwiki/", you can install a foo.pm plugin as + "/home/you/.ikiwiki/IkiWiki/Plugin/foo.pm". * --discussion, --no-discussion @@ -306,20 +329,22 @@ also be configured using a setup file. intercepted. If you enable this option then you must run at least the CGI portion of ikiwiki over SSL. -* --getctime +* --gettime, --no-gettime - Pull creation time for each new page out of the revision control - system. This rarely used option provides a way to get the real creation - times of items in weblogs, such as when building a wiki from a new - VCS checkout. It is unoptimised and quite slow. It is best used - with --rebuild, to force ikiwiki to get the ctime for all pages. + Extract creation and modification times for each new page from + the revision control's log. This is done automatically when building a + wiki for the first time, so you normally do not need to use this option. * --set var=value + + This allows setting an arbitrary configuration variable, the same as if it - were set via a setup file. Since most options can be configured - using command-line switches, you will rarely need to use this, but it can be - useful for the odd option that lacks a command-line switch. + were set via a setup file. Since most commonly used options can be + configured using command-line switches, you will rarely need to use this. + +* --set-yaml var=value + + This is like --set, but it allows setting configuration variables that + use complex data structures, by passing in a YAML document.
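For example, a list-valued setting such as `add_plugins` could be set with a YAML flow sequence (an illustrative sketch; exact shell quoting may vary):

    ikiwiki --changesetup ~/ikiwiki.setup --set-yaml 'add_plugins=[sidebar, toc]'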
# EXAMPLES @@ -345,6 +370,10 @@ also be configured using a setup file. This controls what C compiler is used to build wrappers. Default is 'cc'. +* CFLAGS + + This can be used to pass options to the C compiler when building wrappers. + # SEE ALSO * [[ikiwiki-mass-rebuild]](8) diff --git a/doc/users/BerndZeimetz.mdwn b/doc/users/BerndZeimetz.mdwn new file mode 100644 index 000000000..cf21dc585 --- /dev/null +++ b/doc/users/BerndZeimetz.mdwn @@ -0,0 +1,8 @@ +See [wiki.debian.org/BerndZeimetz](http://wiki.debian.org/BerndZeimetz) for details. + +<pre> + Bernd Zeimetz Debian GNU/Linux Developer + http://bzed.de http://www.debian.org + GPG Fingerprints: 06C8 C9A2 EAAD E37E 5B2C BE93 067A AD04 C93B FF79 + ECA1 E3F2 8E11 2432 D485 DD95 EB36 171A 6FF9 435F +</pre> diff --git a/doc/users/Christine_Spang.mdwn b/doc/users/Christine_Spang.mdwn new file mode 100644 index 000000000..223e9739d --- /dev/null +++ b/doc/users/Christine_Spang.mdwn @@ -0,0 +1 @@ +Running ikiwiki on her [homepage](http://spang.cc/) and [blog](http://blog.spang.cc/). diff --git a/doc/users/David_Riebenbauer.mdwn b/doc/users/David_Riebenbauer.mdwn new file mode 100644 index 000000000..d7469696e --- /dev/null +++ b/doc/users/David_Riebenbauer.mdwn @@ -0,0 +1,8 @@ +Runs ikiwiki on his [homepage](http://liegesta.at/) and can be reached through +<davrieb@liegesta.at> + +## Branches in his [[git]] repository ## + +* `autotag` +([browse](http://git.liegesta.at/?p=ikiwiki.git;a=shortlog;h=refs/heads/autotag)) +See [[todo/auto-create_tag_pages_according_to_a_template]] diff --git a/doc/users/Edward_Betts.mdwn b/doc/users/Edward_Betts.mdwn index b32927a1c..61d6150ef 100644 --- a/doc/users/Edward_Betts.mdwn +++ b/doc/users/Edward_Betts.mdwn @@ -1,9 +1,4 @@ My watchlist: -[[!inline archive="yes" sort="mtime" atom="yes" pages=" -todo/allow_wiki_syntax_in_commit_messages* -todo/shortcut_with_different_link_text* -todo/structured_page_data* -tips/convert_mediawiki_to_ikiwiki* -"]] +[[!inline archive="yes" sort="mtime" atom="yes" pages="todo/allow_wiki_syntax_in_commit_messages* or todo/shortcut_with_different_link_text* or todo/structured_page_data* or tips/convert_mediawiki_to_ikiwiki*"]] diff --git a/doc/users/Jimmy_Tang.mdwn b/doc/users/Jimmy_Tang.mdwn new file mode 100644 index 000000000..a1402bcae --- /dev/null +++ b/doc/users/Jimmy_Tang.mdwn @@ -0,0 +1 @@ +<http://www.sgenomics.org/~jtang> diff --git a/doc/users/KarlMW/discussion.mdwn b/doc/users/KarlMW/discussion.mdwn index 9117abcab..4a111a3f9 100644 --- a/doc/users/KarlMW/discussion.mdwn +++ b/doc/users/KarlMW/discussion.mdwn @@ -23,3 +23,5 @@ things that need changing then I will probably need help/guidance. >> I suspect that asciidoc can't really be made to play nice to the extent that I would want casual users/abusers to have it as a markup option on a live wiki - it's fine for a personal site where you can look at the output before putting it online, but I think it would be a hideously gaping integrity hole for anything more than that. However, for a personal site (as I am using it), it does seem to have its uses. >> I'll keep an eye on the format_escape plugin, and assuming it is accepted into ikiwiki, will see if I can apply it to asciidoc. --[[KarlMW]] + +Is there any way to enable latexmath rendering? It semes that ikiwiki strips the necessary javascript and/or style sheet information from the HTML page generated by asciidoc. 
--Peter diff --git a/doc/users/KathrynAndersen.mdwn b/doc/users/KathrynAndersen.mdwn new file mode 100644 index 000000000..8e827b0da --- /dev/null +++ b/doc/users/KathrynAndersen.mdwn @@ -0,0 +1,8 @@ +* aka [[rubykat]] +* <http://kerravonsen.dreamwidth.org> +* <http://www.katspace.org> (uses IkiWiki!) +* <http://github.com/rubykat> +* Also an active [PmWiki](http://www.pmwiki.org) user, interested in having the best of both worlds. + +Has written the following plugins: +[[!map pages="!*/Discussion and ((link(users/KathrynAndersen) or link(users/rubykat)) and plugins/*) "]] diff --git a/doc/users/KathrynAndersen/discussion.mdwn b/doc/users/KathrynAndersen/discussion.mdwn new file mode 100644 index 000000000..4f2790c39 --- /dev/null +++ b/doc/users/KathrynAndersen/discussion.mdwn @@ -0,0 +1,20 @@ +Had a look at your site. Sprawling, individualistic, using ikiwiki in lots of +ways. Makes me happy. :) I see that I have let a lot of contrib plugins +pile up. I will try to get to these. I'm particularly interested in +your use of yaml+fields. Encourage you to go ahead with any others you +have not submitted here, like pmap. (Unless it makes more sense to submit +that as a patch to the existing map plugin.) --[[Joey]] + +> Thanks. I would have put more up, but I didn't want to until they were properly documented, and other things have taken a higher priority. + +> I think pmap is probably better as a separate plugin, because it has additional dependencies (HTML::LinkList) which people might not want to have to install. + +>> One approach commonly used in ikiwiki is to make such optional features +>> be enabled by a switch somewhere, and 'eval q{use Foo}` so the module +>> does not have to be loaded unless the feature is used. --[[Joey]] + +>>> Unfortunately, HTML::LinkList isn't an optional feature for pmap; that's what it uses to create the HTML for the map. --[[KathrynAndersen]] + +> The "includepage" plugin I'm not sure whether it is worth releasing or not; it's basically a cut-down version of "inline", because the inline plugin is so complicated and has so many options, I felt more at ease to have something simpler. + +> --[[KathrynAndersen]] diff --git a/doc/users/Mick_Pollard.mdwn b/doc/users/Mick_Pollard.mdwn new file mode 100644 index 000000000..9c558e357 --- /dev/null +++ b/doc/users/Mick_Pollard.mdwn @@ -0,0 +1 @@ +. diff --git a/doc/users/NicolasLimare.mdwn b/doc/users/NicolasLimare.mdwn index 003449d40..56a950f7e 100644 --- a/doc/users/NicolasLimare.mdwn +++ b/doc/users/NicolasLimare.mdwn @@ -1,9 +1 @@ -Nicolas (nil) uses ikiwiki on a site/wiki/blog/something... and feels this approach much more comfortable than the usual web-only ones. - -He didn't touch any perl code before using ikiwiki, ant that was the first opportunity to propose tiny patches. - -Actualy, he would have felt much more comfortable with a python ikiwiki... :) - -Can be reached at nicolas at limare.net - -By the way, I can make translations to french if needed. And maybe to japanese.
\ No newline at end of file +[[!meta redir="nil"]] diff --git a/doc/users/Oblomov.mdwn b/doc/users/Oblomov.mdwn new file mode 100644 index 000000000..be6e666cb --- /dev/null +++ b/doc/users/Oblomov.mdwn @@ -0,0 +1 @@ +Getting started with Ikiwiki, like the git backend a lot, would like to see a dynamic version of it. diff --git a/doc/users/Perry.mdwn b/doc/users/Perry.mdwn new file mode 100644 index 000000000..d10b8621f --- /dev/null +++ b/doc/users/Perry.mdwn @@ -0,0 +1 @@ +Just another IkiWiki user. diff --git a/doc/users/Tim_Lavoie.mdwn b/doc/users/Tim_Lavoie.mdwn new file mode 100644 index 000000000..90df011c6 --- /dev/null +++ b/doc/users/Tim_Lavoie.mdwn @@ -0,0 +1 @@ +Hey... I'm just starting to use ikiwiki, but am happy to find it repeatedly doing the sorts of things in a way which makes sense to me. (e.g. most pages are static, DVCS for file store etc.) diff --git a/doc/users/Will.mdwn b/doc/users/Will.mdwn index 9c46edc69..1956263e0 100644 --- a/doc/users/Will.mdwn +++ b/doc/users/Will.mdwn @@ -2,7 +2,9 @@ I started using Ikiwiki as a way to replace [Trac](http://trac.edgewall.org/) wh Lately I've been using Ikiwiki for other things and seem to be scratching a few itches here and there. :) -I generally use my [[ikiwiki/openid]] login when editing here: <http://www.cse.unsw.edu.au/~willu/> +I generally use my [[ikiwiki/openid]] login when editing here: <http://www.cse.unsw.edu.au/~willu/> or <http://www.google.com/profiles/will.uther>. + +I have a git repository for some of my IkiWiki code: <http://www.cse.unsw.edu.au/~willu/ikiwiki.git>. Generic License Grant ----------------- @@ -11,14 +13,16 @@ Unless otherwise specified, any code that I post to this wiki I release under th ------ +Disabling these as I'm not using them much any more... + ### Open Bugs: -[[!inline pages="link(users/Will) and bugs/* and !bugs/done and !bugs/discussion and !link(patch) and !link(bugs/done) and !bugs/*/*" archive="yes" feeds="no" ]] +\[[!inline pages="link(users/Will) and bugs/* and !bugs/done and !bugs/discussion and !link(patch) and !link(bugs/done) and !bugs/*/*" archive="yes" feeds="no" ]] ### Open ToDos: -[[!inline pages="link(users/Will) and todo/* and !todo/done and !todo/discussion and !link(patch) and !link(todo/done) and !bugs/*/*" archive="yes" feeds="no" ]] +\[[!inline pages="link(users/Will) and todo/* and !todo/done and !todo/discussion and !link(patch) and !link(todo/done) and !bugs/*/*" archive="yes" feeds="no" ]] ### Unapplied Patches: -[[!inline pages="link(users/Will) and (todo/* or bugs/*) and !bugs/done and !bugs/discussion and !todo/done and !todo/discussion and link(patch) and !link(bugs/done) and !link(todo/done) and !bugs/*/*" archive="yes" feeds="no" ]] +\[[!inline pages="link(users/Will) and (todo/* or bugs/*) and !bugs/done and !bugs/discussion and !todo/done and !todo/discussion and link(patch) and !link(bugs/done) and !link(todo/done) and !bugs/*/*" archive="yes" feeds="no" ]] diff --git a/doc/users/blipvert.mdwn b/doc/users/blipvert.mdwn new file mode 100644 index 000000000..7c4a24ba1 --- /dev/null +++ b/doc/users/blipvert.mdwn @@ -0,0 +1 @@ +<http://github.com/blipvert> diff --git a/doc/users/emptty.mdwn b/doc/users/emptty.mdwn new file mode 100644 index 000000000..08ef7d0f3 --- /dev/null +++ b/doc/users/emptty.mdwn @@ -0,0 +1,2 @@ +My professional homepage is [here](http://www.loria.fr/~quinson/). I'm currently (09/09) trying to move it from WML to ikiwiki. +I'm sure I'll have a bunch of ideas, requests and maybe even patches in the process. 
diff --git a/doc/users/ericdrechsel.mdwn b/doc/users/ericdrechsel.mdwn new file mode 100644 index 000000000..2efb7039c --- /dev/null +++ b/doc/users/ericdrechsel.mdwn @@ -0,0 +1 @@ +[My homewiki profile](http://wiki.shared.dre.am/people/eric/) diff --git a/doc/users/harishcm.mdwn b/doc/users/harishcm.mdwn index 47f28c83c..164711022 100644 --- a/doc/users/harishcm.mdwn +++ b/doc/users/harishcm.mdwn @@ -1 +1 @@ -Using ikiwiki for my yet to be publish personal website :) +Using ikiwiki for my yet to be published personal website :) diff --git a/doc/users/ivan_shmakov.mdwn b/doc/users/ivan_shmakov.mdwn new file mode 100644 index 000000000..4123e0fc6 --- /dev/null +++ b/doc/users/ivan_shmakov.mdwn @@ -0,0 +1,50 @@ +… To put it short: an Ikiwiki newbie. + +[Emacs]: http://www.gnu.org/software/emacs/ +[Lynx]: http://lynx.isc.org/ + +## Wikis + +Currently, I run a couple of Ikiwiki instances. Namely: + +* <http://lhc.am-1.org/lhc/> + — to hold random stuff written by me, my colleagues, + students, etc. + +* <http://rsdesne.am-1.org/rsdesne-2010/> + — for some of the materials related to the + “Remote Sensing in Education, Science and National + Economy” (2010-03-29 … 2010-04-10, Altai State + University) program I've recently participated in as + an instructor. + +## Preferences + +I prefer to use [Lynx][] along with [Emacs][] (via +`emacsclient`) to work with the wikis. (Note the “Local +variables” section below.) + +The things I dislike in the wiki engines are: + +* the use of home-brew specialized version control systems + — while there're a lot of much more developed general + purpose ones; + +* oversimplified syntax + — which (to some extent) precludes more sophisticated + forms of automated processing; in particular, this forces one + to reformat the material, once complete, to, say, prepare a + book, or an article, or slides. + +Out of all the wiki engines I'm familiar with, only Ikiwiki is +free of the first of these. I hope that it will support more +elaborate syntaxes eventually. + +---- + + Local variables: + mode: markdown + coding: utf-8 + fill-column: 64 + ispell-dictionary: "american" + End: diff --git a/doc/users/jasonblevins.mdwn b/doc/users/jasonblevins.mdwn index b50e4844a..e4a459e30 100644 --- a/doc/users/jasonblevins.mdwn +++ b/doc/users/jasonblevins.mdwn @@ -1,66 +1,46 @@ [[!meta title="Jason Blevins"]] -I'm currently hosting a private ikiwiki for keeping research notes -which, with some patches and a plugin (below), will -convert inline [[todo/LaTeX]] expressions to [[MathML]]. I'm working towards a -patchset and instructions for others to do the same. - -I've setup a test ikiwiki [here](http://xbeta.org/colab/) where I've -started keeping a few notes on my progress. There is an example of -inline [[todo/SVG]] on the homepage (note that the logo scales along with the -font size). There are a few example mathematical expressions in the -[sandbox](http://xbeta.org/colab/sandbox/). The MathML is generated -automatically from inline LaTeX expressions using an experimental -plugin I'm working on. - -My (also MathML-enabled) homepage: <http://jblevins.org/> (still using -Blosxom...maybe one day I'll convert it to ikiwiki...) 
- -Current ikiwki issues of interest: - - * [[bugs/recentchanges_feed_links]] - * [[bugs/HTML_inlined_into_Atom_not_necessarily_well-formed]] - * [[plugins/toc/discussion]] - * [[todo/BibTeX]] - * [[todo/svg]] - * [[todo/Option_to_make_title_an_h1?]] - * [[bugs/SVG_files_not_recognized_as_images]] +I am a former Ikiwiki user who wrote several plugins and patches +related to MathML, [[SVG|todo/svg]], and [[todo/syntax highlighting]]. +Some related links and notes are archived below. + +Homepage: <http://jblevins.org/> ## Plugins -These plugins are experimental. Use them at your own risk. Read the -perldoc documentation for more details. Patches and suggestions are -welcome. +The following [plugins](http://jblevins.org/projects/ikiwiki/) +are no longer maintained, but please feel free to use, modify, and +redistribute them. Read the corresponding perldoc documentation for +more details. - * [mdwn_itex][] - Works with the [[`mdwn`|plugins/mdwn]] plugin to convert inline [[todo/LaTeX]] - expressions to [[MathML]] using `itex2MML`. + * [mdwn_itex][] - Works with the [[`mdwn`|plugins/mdwn]] plugin to convert + inline [[todo/LaTeX]] expressions to MathML using `itex2MML`. * [h1title][] - If present, use the leading level 1 Markdown header to set the page title and remove it from the page body. * [code][] - Whole file and inline code snippet [[todo/syntax highlighting]] via GNU Source-highlight. The list of supported file extensions is - configurable. There is also some preliminary [documentation][code-doc]. - See the [FortranWiki](http://fortranwiki.org) for examples. + configurable. - * [metamail][] - a plugin for loading metadata from [[email]]-style + * [metamail][] - a plugin for loading metadata from email-style headers at top of a file (e.g., `title: Page Title` or `date: November 2, 2008 11:14 EST`). - * [pandoc][] - [[ikiwiki/Markdown]] page processing via [Pandoc](http://johnmacfarlane.net/pandoc/) (a Haskell library for converting from one markup format to another). [[todo/LaTeX]] and + * [pandoc][] - [[ikiwiki/Markdown]] page processing via + [Pandoc](http://johnmacfarlane.net/pandoc/) (a Haskell library for + converting from one markup format to another). [[todo/LaTeX]] and [[reStructuredText|plugins/rst]] are optional. * [path][] - Provides path-specific template conditionals such as `IS_HOMEPAGE` and `IN_DIR_SUBDIR`. - [mdwn_itex]: http://code.jblevins.org/ikiwiki/plugins.git/plain/mdwn_itex.pm - [h1title]: http://code.jblevins.org/ikiwiki/plugins.git/plain/h1title.pm - [code]: http://code.jblevins.org/ikiwiki/plugins.git/plain/code.pm - [code-doc]: http://code.jblevins.org/ikiwiki/plugins.git/plain/code.text - [metamail]: http://code.jblevins.org/ikiwiki/plugins.git/plain/metamail.pm - [pandoc]: http://code.jblevins.org/ikiwiki/plugins.git/plain/pandoc.pm - [path]: http://code.jblevins.org/ikiwiki/plugins.git/plain/path.pm - + [mdwn_itex]: http://jblevins.org/git/ikiwiki/plugins.git/plain/mdwn_itex.pm + [h1title]: http://jblevins.org/git/ikiwiki/plugins.git/plain/h1title.pm + [code]: http://jblevins.org/projects/ikiwiki/code + [metamail]: http://jblevins.org/git/ikiwiki/plugins.git/plain/metamail.pm + [pandoc]: http://jblevins.org/git/ikiwiki/plugins.git/plain/pandoc.pm + [path]: http://jblevins.org/git/ikiwiki/plugins.git/plain/path.pm ## MathML and SVG support @@ -105,5 +85,5 @@ optimal solution is to force users to preview the page before saving. 
That way if someone introduces invalid XHTML then they can't save the page in the first place (unless they post directly to the right URL). - [template-patch]: http://xbeta.org/gitweb/?p=xbeta/ikiwiki.git;a=blobdiff;f=templates/page.tmpl;h=380ef699fa72223744eb5c1ee655fb79aa6bce5b;hp=9084ba7e11e92a10528b2ab12c9b73cf7b0f40a7;hb=416d5d1b15b94e604442e4e209a30dee4b77b684;hpb=ececf4fb8766a4ff7eff943b3ef600be81a0df49 - [cgi-patch]: http://xbeta.org/gitweb/?p=xbeta/ikiwiki.git;a=commitdiff;h=fa538c375250ab08f396634135f7d79fce2a9d36 + [template-patch]: http://jblevins.org/git/ikiwiki.git/commit/?h=xbeta&id=416d5d1b15b94e604442e4e209a30dee4b77b684 + [cgi-patch]: http://jblevins.org/git/ikiwiki.git/commit/?id=fa538c375250ab08f396634135f7d79fce2a9d36 diff --git a/doc/users/jaywalk.mdwn b/doc/users/jaywalk.mdwn new file mode 100644 index 000000000..31b3e0c4d --- /dev/null +++ b/doc/users/jaywalk.mdwn @@ -0,0 +1,5 @@ +Jonatan Walck. Home page: [jonatan.walck.se](http://jonatan.walck.se) + +## Contact ## +* jonatan at walck dot se +* [I2P-Bote key](http://jonatan.walck.i2p/docs/i2p-bote.txt) [what is this?](http://i2pbote.i2p.to/) diff --git a/doc/users/jeanprivat.mdwn b/doc/users/jeanprivat.mdwn new file mode 100644 index 000000000..4d75a9867 --- /dev/null +++ b/doc/users/jeanprivat.mdwn @@ -0,0 +1 @@ +Jean Privat is <jean@pryen.org>. diff --git a/doc/users/jerojasro.mdwn b/doc/users/jerojasro.mdwn new file mode 100644 index 000000000..e2e620d3f --- /dev/null +++ b/doc/users/jerojasro.mdwn @@ -0,0 +1,3 @@ +Javier Rojas + +I keep a personal [wiki](http://devnull.li/~jerojasro/wiki) and my [blog](http://devnull.li/~jerojasro/blog) in ikiwiki. diff --git a/doc/users/jmtd.mdwn b/doc/users/jmtd.mdwn new file mode 100644 index 000000000..a816baf6f --- /dev/null +++ b/doc/users/jmtd.mdwn @@ -0,0 +1 @@ +[[!meta redir=users/jon]] diff --git a/doc/users/jogo.mdwn b/doc/users/jogo.mdwn new file mode 100644 index 000000000..e8068a10f --- /dev/null +++ b/doc/users/jogo.mdwn @@ -0,0 +1,5 @@ + * An [economic game](http://sef.matabio.net/) in french, which [use](http://sef.matabio.net/wiki/) IkiWiki. + * Some [plugins](http://www.matabio.net/tcgi/hg/IkiPlugins/file/). + * An alternative [base wiki](http://www.matabio.net/tcgi/hg/FrIkiWiki/file/) in french. + +email: `jogo matabio net`. diff --git a/doc/users/jon.mdwn b/doc/users/jon.mdwn index 551d4764a..eb01821ff 100644 --- a/doc/users/jon.mdwn +++ b/doc/users/jon.mdwn @@ -4,24 +4,57 @@ team-documentation management system for a small-sized group of UNIX sysadmins. * my edits should appear either as 'Jon' (if I've used - [[tips/untrusted_git_push]]) or 'jmtd.net' (or once upon a time - 'alcopop.org/me/openid/' or 'jondowland'). + [[tips/untrusted_git_push]]); 'jmtd.net', 'jmtd.livejournal.com', + 'jmtd' if I've forgotten to set my local git config properly, + or once upon a time 'alcopop.org/me/openid/' or 'jondowland'. * My [homepage](http://jmtd.net/) is powered by ikiwiki I gave a talk at the [UK UNIX User's Group](http://www.ukuug.org/) annual -[Linux conference](http://www.ukuug.org/events/linux2008/) about organising -system administrator documentation. Roughly a third of this talk was -discussing IkiWiki in some technical detail and suggesting it as a good piece -of software for this task. +[Linux conference](http://www.ukuug.org/events/linux2008/) in 2008 about +organising system administrator documentation. Roughly a third of this talk +was discussing IkiWiki in some technical detail and suggesting it as a good +piece of software for this task. 
* slides at <http://www.staff.ncl.ac.uk/jon.dowland/unix/docs/>. I am also working on some ikiwiki hacks: +* [[todo/allow site-wide meta definitions]] +* Improving the means by which you can migrate from mediawiki to + IkiWiki. See [[tips/convert mediawiki to ikiwiki]] and the + [[plugins/contrib/mediawiki]] plugin. + +I am mostly interested in ikiwiki usability issues: + + * [[bugs/the login page is unclear when multiple methods exist]] + * [[bugs/backlinks onhover thing can go weird]] + * [[todo/CSS classes for links]] + * [[todo/adjust commit message for rename, remove]] + +The following I have been looking at, but are on the back-burner: + * an alternative approach to [[plugins/comments]] (see [[todo/more flexible inline postform]] for one piece of the puzzle; <http://dev.jmtd.net/comments/> for some investigation into making the post - form more integrated) + form more integrated); possibly also [[todo/pagespec to disable ikiwiki directives]] * a system for [[forum/managing_todo_lists]] (see also [[todo/interactive todo lists]] and <http://dev.jmtd.net/outliner/> for the current WIP). +* a `tag2` plugin, which does the same thing as [[plugins/tag]], but + does not sit on top of [[ikiwiki/wikilink]]s, so does not result in + bugs such as [[bugs/tagged() matching wikilinks]]. Code for this lives + in my github `tag2` branch: <http://github.com/jmtd/ikiwiki> + +Penultimately, the following are merely half-formed thoughts: + + * adding and removing tags to pages via the edit form by ticking and + unticking checkboxes next to a tag name (rather than entering the + directive into the text of the page directly) + * perhaps the same for meta + * I'd like to make profiling ikiwiki in action very easy for newcomers. + Perhaps even a plugin that created a file /profile or similar on build. + +Finally, backlinks (since I have issues with the current backlinks +implementation, see [[bugs/backlinks onhover thing can go weird]]): + +[[!inline pages="link(users/Jon)" archive="yes" feeds="no"]] diff --git a/doc/users/joshtriplett.mdwn b/doc/users/joshtriplett.mdwn index f85c068c3..29178057c 100644 --- a/doc/users/joshtriplett.mdwn +++ b/doc/users/joshtriplett.mdwn @@ -6,9 +6,10 @@ Email: `josh@{joshtriplett.org,freedesktop.org,kernel.org,psas.pdx.edu}`. Proud user of ikiwiki. -Currently working on scripts to convert MoinMoin and TWiki wikis to -ikiwikis backed by a git repository, including full history. -Available from the following repositories, though not well-documented: +Worked on scripts to convert MoinMoin and TWiki wikis to ikiwikis backed by a +git repository, including full history. Used for a couple of wikis, and now no +longer maintained, but potentially still useful. Available from the following +repositories, though not well-documented: git clone git://svcs.cs.pdx.edu/git/wiki2iki/moin2iki git clone git://svcs.cs.pdx.edu/git/wiki2iki/html-wikiconverter diff --git a/doc/users/joshtriplett/discussion.mdwn b/doc/users/joshtriplett/discussion.mdwn new file mode 100644 index 000000000..16e9be057 --- /dev/null +++ b/doc/users/joshtriplett/discussion.mdwn @@ -0,0 +1,66 @@ +Can we please have a very brief HOWTO? + +I have a Moin wiki in /var/www/wiki and want to create an IkIwiki clone of it in /var/www/ikiwiki backed by a git repos in /data/ikiwiki. + +I tried: + + mkdir /var/www/ikiwiki + mkdir /data/ikiwiki + PATH=.:/usr/lib/git-core:$PATH ./moin2iki /data/ikiwiki http://localhost/wiki + +Help please!but this failed. (BTW, I don't usually put . in my PATH). 
The failure appears to be that the converter doesn't actually create an ikiwiki instance, but appears to want to update one: + + fatal: ambiguous argument 'master': unknown revision or path not in the working tree. + Use '--' to separate paths from revisions + fatal: ambiguous argument 'master': unknown revision or path not in the working tree. + Use '--' to separate paths from revisions + fatal: Not a valid object name master + Traceback (most recent call last): + File "/home/peterc/src/moin2iki/git-map", line 125, in <module> + if __name__ == "__main__": sys.exit(main(sys.argv[1:])) + File "/home/peterc/src/moin2iki/git-map", line 117, in main + print git_map_file('commit', new_head) + File "/home/peterc/src/moin2iki/git-map", line 33, in git_map_file + f(inproc.stdout, outproc.stdin, sha, arg) + File "/home/peterc/src/moin2iki/git-map", line 64, in handle_commit + string, tree = lines.pop(0).split() + IndexError: pop from empty list + +OK, so I created one: + + ikiwiki --setup /etc/ikiwiki/auto.setup + ..... +This process created several files and directories in my home directory: + + wiki.git/ + public_html/wiki/ + wiki.setup + .ikiwiki/ + +Following the instructions on the setup page, I did: + mv wiki.git /data/ikiwiki + ( cd /data/ikiwiki; git clone -l wiki.git wiki; ) + mv .ikiwiki /data/ikiwiki/ikiwiki + mv ~/public_html/wiki /var/ikiwiki/ + +then did again + + PATH=.:/usr/lib/git-core:$PATH ./moin2iki /data/ikiwiki/wiki http://www/wiki + +and saw no output, and no change to the filesystem. + +I'm totally confused. It looks as though the script calls moin2git iff the target directory isn't there, but the script fails in interesting ways if it is. + +The other thing I saw was: + + 2009-12-04 09:00:31,542 WARNING MoinMoin.log:139 using logging configuration read from built-in fallback in MoinMoin.log module! + Traceback (most recent call last): + File "./moin2git", line 128, in <module> + if __name__ == '__main__': main(*sys.argv[1:]) + File "./moin2git", line 43, in main + r = request.RequestCLI() + AttributeError: 'module' object has no attribute 'RequestCLI' + +Moin version is 1.8.5 + +Help please! diff --git a/doc/users/nil.mdwn b/doc/users/nil.mdwn new file mode 100644 index 000000000..e1826cec6 --- /dev/null +++ b/doc/users/nil.mdwn @@ -0,0 +1,8 @@ +nil first used ikiwiki on a site/wiki/blog/something... and felt this approach much more comfortable than the usual web-only ones. +Since then, ikiwiki is a kind of swiss army knife when it comes to build anything for the web. + +Can be reached at nicolas at limare.net + +The current big ikiwiki-powered project is <http://www.ipol.im> + +TODO: document "how to split public/edition interfaces" diff --git a/doc/users/rubykat.mdwn b/doc/users/rubykat.mdwn new file mode 100644 index 000000000..f37d13306 --- /dev/null +++ b/doc/users/rubykat.mdwn @@ -0,0 +1 @@ +See [[KathrynAndersen]]. diff --git a/doc/users/schmonz.mdwn b/doc/users/schmonz.mdwn index 752880561..ec282c990 100644 --- a/doc/users/schmonz.mdwn +++ b/doc/users/schmonz.mdwn @@ -1,4 +1,5 @@ -[Amitai Schlair](http://www.columbia.edu/~ays2105/) recently discovered ikiwiki and finds himself using it for all sorts of things. His attempts at contributing: +[Amitai Schlair](http://www.netbsd.org/~schmonz/) recently discovered ikiwiki and finds himself using it for all sorts of things. 
His attempts at contributing: -* [[plugins/contrib/unixauth]] -* [[plugins/contrib/cvs]] +[[!map pages="!*/Discussion and ((link(users/schmonz) and plugins/*) or rcs/cvs)"]] + +I've also written a plugin for [WIND authentication](http://www.columbia.edu/acis/rad/authmethods/wind/), which may or may not be of general utility. diff --git a/doc/users/simonraven.mdwn b/doc/users/simonraven.mdwn index 5fc24711e..13681a674 100644 --- a/doc/users/simonraven.mdwn +++ b/doc/users/simonraven.mdwn @@ -1,8 +1,7 @@ ## personal/site info -New ikiwiki site at my web site, blog, kisikew.org home site, for indigenews, and our indigenous-centric wiki (mostly East Coast/Woodlands area). Mediawiki stuff was imported successfully (as noted on this web site). - +Have several ikiwiki-based sites at my web site, blog, kisikew.org home site, for indigenews, and our indigenous-centric wiki (mostly East Coast/Woodlands area). ## ikiwiki branch at github -Maintain my own branch, partly to learn about VCS, git, ikiwiki, Debian packaging, and Perl. I don't recommend anyone pull from it, as I use third-party plugins included on this site that people may not want in a default installation of ikiwiki. This is why I don't push to Joey's -- so it's nothing personal, I just don't want to mess things up for other people, from my mistakes and stumbles. +Maintain my own branch, partly to learn about VCS, git, ikiwiki, Debian packaging, and Perl. Thinking of removing most 3rd-party plugins (found in contrib/). Have some custom plugins to support dual bottom-of-the-page "sidebars" and an attempt at supporting HTTPBL (see projecthoneypot.org). diff --git a/doc/users/svend.mdwn b/doc/users/svend.mdwn index 69d83584f..712a0d3e7 100644 --- a/doc/users/svend.mdwn +++ b/doc/users/svend.mdwn @@ -1,4 +1,4 @@ [[!meta title="Svend Sorensen"]] -* [website](http://www.ciffer.net/~svend/) -* [blog](http://www.ciffer.net/~svend/blog/) +* [website](http://ciffer.net/~svend/) +* [blog](http://ciffer.net/~svend/blog/) diff --git a/doc/users/tschwinge.mdwn b/doc/users/tschwinge.mdwn index bb5cef6a6..414612aff 100644 --- a/doc/users/tschwinge.mdwn +++ b/doc/users/tschwinge.mdwn @@ -9,3 +9,146 @@ web pages and previous wiki pages to a *[[ikiwiki]]* system; and all that while preserving the previous content's history, which was stored in a CVS repository for the HTML web pages and a TWiki RCS repository for the wiki; see <http://www.gnu.org/software/hurd/colophon.html>. + +# Issues to Work On + +## Stability of Separate Builds + +The goal is that separate builds of the same source files should yield +exactly the same HTML code (of course, except for changes due to differences in +Markdown rendering, for example). + + * Timestamps -- [[forum/ikiwiki__39__s_notion_of_time]], [[forum/How_does_ikiwiki_remember_times__63__]] + + Git sets the current *mtime* when checking out files. The result is that + <http://www.gnu.org/software/hurd/contact_us.html> and + <http://www.bddebian.com:8888/~hurd-web/contact_us/> show different *Last + edited* timestamps. + + This can either be solved by adding a facility to Git to set the + checked-out files' *mtime* according to the *AuthorDate* / *CommitDate* + (which one...), or doing that retroactively with the + <http://www.gnu.org/software/hurd/set_mtimes> script before building, or + with an ikiwiki-internal solution. + + * HTML character entities + + <http://www.gnu.org/software/hurd/purify_html> + +## Tags -- [[bugs/tagged__40____41___matching_wikilinks]] + +Tags should be a separate concept from wikilinks.
+ +### \[[!map]] behavior + +The \[[!map]] on, for example, +<http://www.gnu.org/software/hurd/tag/open_issue_hurd.html>, should not show +the complete hierarchy of pages, but instead just the pages that actually *do* +contain the \[[!tag open_issue_hurd]]. + +> `tagged(open_issue_hurd)` in its pagespec should do that. --[[Joey]] + +>> Well, that's exactly what this page contains: \[[!map +>> pages="tagged(open_issue_hurd) and !open_issues and !*/discussion" +>> show=title]] +>> +>> This is currently rendered as can be seen on +>> <http://www.gnu.org/software/hurd/tag/open_issue_hurd.html>, but I'd imagine +>> it to be rendered by **only** linking to the pages that actually do contain +>> the tag (**only** the outer leaf ones, which are *capturing stdout and +>> stderr*, *ramdisk*, *syncfs*, ...; but **not** to *hurd*, *debugging*, +>> *translator*, *libstore*, *examples*, ...). Otherwise, the way it's being +>> rendered at the moment, it appears to the reader that *hurd*, *debugging*, +>> *translator*, *libstore*, *examples*, ... were all tagged, too, and not only +>> the outer ones. + +## Anchors -- [[ikiwiki/wikilink/discussion]] + +## Default Content for Meta Values -- [[plugins/contrib/default_content_for___42__copyright__42___and___42__license__42__]] + +This will become less relevant, as we're going to add copyright and +licensing headers to every single file. + +## [[bugs/img vs align]] + +## Texinfo -- [[plugins/contrib/texinfo]] + +Not very important. Have to consider external commands / files / security (see +[[plugins/teximg]] source code)? + +## Shortcuts -- [[plugins/shortcut/discussion]] + +## \[[!meta redir]] -- [[todo/__42__forward__42__ing_functionality_for_the_meta_plugin]] + +Implement a checker that makes sure that no pages that use \[[!meta redir]] +redirect to another page (and are thus considered legacy pages for providing +stable URLs, for example) are linked to from other wiki pages. This is useful +w.r.t. backlinks. Alternatively, the backlinks to the \[[!meta redir]]-using +pages could perhaps be passed on to the referred-to page? + +> I found that backlinks was an easy way to find such links to such pages. +> (Although the redirection made it hard to see the backlinks!) --[[Joey]] + +## \[[!meta redir]] -- tell what's going on + +Add functionality so that a text like *this page's content has moved to [new +page]; in a few seconds you'll be redirected thither* is displayed on every +page that uses \[[!meta redir]]. + +## Sendmail -- [[todo/passwordauth:_sendmail_interface]] + +## [[bugs/Broken Parentlinks]] + +## Modifying [[plugins/inline]] for showing only an *appetizer* + +Currently ikiwiki's inline plugin will either show the full page or nothing of +it. Often that's too much. One can manually use the [[plugins/toggle]] plugin +-- see the *News* section on <http://www.gnu.org/software/hurd/>. Adding a new +mode to the inline plugin to only show an *appetizer* ending with *... (read +on)* after a customizable number of characters (or lines) would be another +possibility. The *... (read on)* would then either toggle the full content +being displayed or link to the complete page. + +> You're looking for [[plugins/more]] (or possibly a way to do that automatically, +> I suppose). --[[Joey]] + +## Prefix For the HTML Title + +The title of each page (as in `<html><head><title>`...) should be prefixed with +*GNU Project - GNU Hurd -*. We can either do this directly in `page.tmpl`, or +create a way to modify the `TITLE` template variable suitably.
+ +## [[plugins/inline]] feedfile option + +Not that important. Git commit b67632cdcdd333cf0a88d03c0f7e6e62921f32c3. This +would be nice to have even when *not* using *usedirs*. Might involve issues as +discussed in *N-to-M Mapping of Input and Output Files* on +[[plugins/contrib/texinfo]]. + +## Unverified -- these may be bugs, but have yet to be verified + + * ikiwiki doesn't change its internal database when \[[!meta date]] / + \[[!meta updated]] are added / removed, and thusly these meta values are + not promulgated in RSS / Atom feeds. + + > I would rather see this filed as a bug, but FWIW, the problem + > is probably that meta does not override the mdate_3339 + > template variable used by the atom and rss templates. + > (Meta does store ctime directly in the ikiwiki database, but cannot + > store mtime in \%pagemtime because it would mess up detection of when + > actual file mtimes change.) --[[Joey]] + + * Complicated issue w.r.t. *no text was copied in this page* + ([[plugins/cutpaste]]) in RSS feed (only; not Atom?) under some conditions + (refresh only, but not rebuild?). Perhaps missing to read in / parse some + files? + + * [[plugins/recentchanges]] + + * Creates non-existing links to changes. + + * Invalid *directory link* with `--usedirs`. + + * Doesn't honor `$timeformat`. + + * Does create `recentchangees.*` files even if that is overridden. diff --git a/doc/users/tupyakov_vladimir.mdwn b/doc/users/tupyakov_vladimir.mdwn new file mode 100644 index 000000000..95f85adc2 --- /dev/null +++ b/doc/users/tupyakov_vladimir.mdwn @@ -0,0 +1 @@ +всем привет! diff --git a/doc/users/tychoish.mdwn b/doc/users/tychoish.mdwn index 3439afdb3..75b285e4a 100644 --- a/doc/users/tychoish.mdwn +++ b/doc/users/tychoish.mdwn @@ -1,5 +1,10 @@ -I'm [tycho](http://www.tychoish.com/). +I'm Sam Kleinman ([tychoish](http://www.tychoish.com/)), and I use ikiwiki. -I use ikiwiki *a lot* on my local machine to organize thoughts, stage writing projects, and collect information. It's a great thing for me. Someday I'll probably get around to using it to powersome sort of collaborative fiction project. Someday. +I used to use ikiwiki a bunch, in a purely local setup, for various projects and personal note taking of various kinds. I've since switched to using [orgmode](http://www.orgmode.org) and a number of just regular old git repositories with some textfiles in them, which seems to work just as well for my particular use case. -Talk to you soon. +Currently, I'm using ikiwiki on the [cyborg institute wiki](http://www.cyborginstitute.com/wiki/) and I'm working on helping someone develop a collaborative fiction project at the [critical futures wiki](http://www.criticalfutures.com/wiki/)... and I'm thinking about other projects, but nothing that exists enough to mention here. + +I hang out in #ikiwiki on oftc, and I'm working on getting some of my templates in order to share with you all. Soon. For sure. 
+ +Cheers, +tychoish diff --git a/doc/users/ulrik.mdwn b/doc/users/ulrik.mdwn new file mode 100644 index 000000000..02e1c1cbd --- /dev/null +++ b/doc/users/ulrik.mdwn @@ -0,0 +1,3 @@ +I use ikiwiki for a small personal website at <http://kaizer.se/wiki/> + +I have proposed some patches for ikiwiki: <http://kaizer.se/wiki/hacks/ikiwiki_rst/> (subject to change) diff --git a/doc/users/weakish.mdwn b/doc/users/weakish.mdwn index ccd5665ad..a5f252e75 100644 --- a/doc/users/weakish.mdwn +++ b/doc/users/weakish.mdwn @@ -1 +1,3 @@ email: weakish@gmail.com + +website: <http://weakish.github.com> diff --git a/doc/users/xma.mdwn b/doc/users/xma.mdwn deleted file mode 100644 index 89f2ff74c..000000000 --- a/doc/users/xma.mdwn +++ /dev/null @@ -1,28 +0,0 @@ -[[!meta title="Xavier Maillard"]] -# Xavier Maillard - -I just started using [[ikiwiki]] for my own webspace at http://maillard.mobi/~xma/wiki - -I am learning how to effectively use it. - -Anyway, [[ikiwiki]] is really *awesome* ! - -## More about me - -I am CLI user living in the linux console. More precisely, I live in an [[GNU_Emacs]] frame all day long. My main computer is an EeePC 901 running Slackware GNU/Linux 12.1. I do not have X installed (too lazy) but when in X, I am running an instance of [[CLFSWM]]. - -## Contacting me - -Various channels to contact me: - -- mail: xma@gnu.org -- jabber: xma01@jabber.fr -- mobile: +33 621-964-362 (I only anwser to people I know though) - -Voila. - -## Plans - -I am planning to make a presentation of [[ikiwiki]]to my [local LUG](http://lolica.org) for our next montly meeting. Any help would be greatly appreciated. - -We are discussing to replace our old unmaintained (and unmaintainable) [SPIP](http://spip.net) website with a wiki. This is why I would like using ikiwiki ;) diff --git a/doc/w3mmode.mdwn b/doc/w3mmode.mdwn index 3afee5c9b..04e37ba04 100644 --- a/doc/w3mmode.mdwn +++ b/doc/w3mmode.mdwn @@ -1,5 +1,5 @@ It's possible to use all of ikiwiki's web features (page editing, etc) in -the `w3m` web browser without using a web server. `w3m` supports local CGI +the [`w3m`](http://w3m.sourceforge.net/) web browser without using a web server. `w3m` supports local CGI scripts, and ikiwiki can be set up to run that way. This requires some special configuration: diff --git a/doc/wikitemplates.mdwn b/doc/wikitemplates.mdwn deleted file mode 100644 index 6c0480cea..000000000 --- a/doc/wikitemplates.mdwn +++ /dev/null @@ -1,50 +0,0 @@ -ikiwiki uses the HTML::Template module as its template engine. This -supports things like conditionals and loops in templates and is pretty easy -to learn. - -The aim is to keep almost all html out of ikiwiki and in the templates. - -It ships with some basic templates which can be customised. These are -located in /usr/share/ikiwiki/templates by default. - -* `page.tmpl` - Used for displaying all regular wiki pages. -* `misc.tmpl` - Generic template used for any page that doesn't - have a custom template. -* `editpage.tmpl` - Create/edit page. -* `change.tmpl` - Used to create a page describing a change made to the wiki. -* `passwordmail.tmpl` - Not a html template, this is used to - generate a mail with an url the user can use to reset their password. -* `rsspage.tmpl` - Used for generating rss feeds for [[blogs|blog]]. -* `rssitem.tmpl` - Used for generating individual items on rss feeds. -* `atompage.tmpl` - Used for generating atom feeds for blogs. -* `atomitem.tmpl` - Used for generating individual items on atom feeds. 
-* `inlinepage.tmpl` - Used for adding a page inline in a blog - page. -* `archivepage.tmpl` - Used for listing a page in a blog archive page. -* `microblog.tmpl` - Used for showing a microblogging post inline. -* `blogpost.tmpl` - Used for a form to add a post to a blog (and a rss/atom links) -* `feedlink.tmpl` - Used to add rss/atom links if blogpost.tmpl is not used. -* `aggregatepost.tmpl` - Used by the [[plugins/aggregate]] plugin to create - a page for a post. -* `searchform.tmpl` - Used by the [[plugins/search]] plugin to add a search - form to wiki pages. -* `searchquery.tmpl` - This is an omega template, used by the - [[plugins/search]] plugin. -* `comment.tmpl` - This template is used to display a comment - by the [[plugins/comments]] plugin. -* `editcomment.tmpl` - This template is the comment post form for the - [[plugins/comments]] plugin. -* `commentmoderation.tmpl` - This template is used to produce the comment - moderation form. -* `recentchanges.tmpl` - This template is used for listing a change - on the RecentChanges page. - -The [[plugins/pagetemplate]] plugin can allow individual pages to use a -different template than `page.tmpl`. - -The [[plugins/template]] plugin also uses templates, though those -[[templates]] are stored in the wiki and inserted into pages. - -The [[plugins/edittemplate]] plugin is used to make new pages default to -containing text from a template, which can be filled as out the page is -edited. diff --git a/doc/wikitemplates/discussion.mdwn b/doc/wikitemplates/discussion.mdwn deleted file mode 100644 index f97444e5f..000000000 --- a/doc/wikitemplates/discussion.mdwn +++ /dev/null @@ -1,46 +0,0 @@ -## Place for local templates -Where does one put any locally modified templates for an individual ikiwiki? --Ivan Z. - -> You can put them whereever you like; the `templatedir` controls -> where ikiwiki looks for them. --[[Joey]] - -Thank you for your response! My question arose out of my intention to make -custom templates for a wiki--specifically suited for the kind of content -it will have--so, that would mean I would want to distribute them through -git together with other content of the wiki. So, for this case the -separation of conceptually ONE thing (the content, the templates, and the -config option which orders to use these templates) into THREE separate -files/repos (the main content repo, the repo with templates, and the config -file) is not convenient: instead of distributing a single repo, I have to -tell people to take three things if they want to replicate this wiki. How -would you solve this inconvenience? Perhaps, a default location of the -templates *inside* the source repo would do?--Ivan Z. - -> I would avoid putting the templates in a subdirectory of the ikiwiki srcdir. -> (I'd also avoid putting the ikiwiki setup file there.) -> While it's safe to do either in some cases, there are configurations where -> it's unsafe. For example, a malicious user could use attachment handling to -> replace those files with their own, bad versions. -> -> So, two ideas for where to put the templatedir and ikiwiki setup. - -> * The easiest option is to put your wiki content in a subdirectory -> ("wiki", say) and point `srcdir` at that. -> then you can have another subdirectory for the wikitemplates, -> and put the setup file at the top. -> * Another option if using git would be to have a separate branch, -> in the same git repository, that holds wikitemplates and the setup file. 
-> Then you check out the repository once to make the `srcdir` available, -> and have a second checkout, of the other branch, to make the other stuff -> available. -> -> Note that with either of these methods, you have to watch out if -> giving other direct commit access to the repository. They could -> still edit the setup file and templates, so only trusted users should -> be given access. (It is, however, perfectly safe to let people edit -> the wiki via the web, and is even safe to configure -> [[tips/untrusted_git_push]] to such a repository.) --[[Joey]] - -Thanks, that's a nice and simple idea: to have a subdirectory! I'll try it. --Ivan Z. - -A [[!taglink wish|wishlist]]: the ikiwiki program could be improved so that it follows the same logic as git in looking for its config: it could ascend directories until it finds an `.ikiwiki/` directory with `.ikiwiki/setup` and then uses that configuration. Now I'm tired to always type `ikiwiki --setup path/to/the/setup --refresh` when working in my working clone of the sources; I'd like to simply type `ikiwiki` instead, and let it find the setup file. The default location to look for templates could also be made to be a sibling of the setup file: `.ikiwiki/templates/`. --Ivan Z. |