Diffstat (limited to 'doc')
768 files changed, 18811 insertions, 2209 deletions
diff --git a/doc/banned_users/discussion.mdwn b/doc/banned_users/discussion.mdwn index ec9dfc07b..ca873e64c 100644 --- a/doc/banned_users/discussion.mdwn +++ b/doc/banned_users/discussion.mdwn @@ -24,4 +24,8 @@ whether OpenID users are stored in .ikiwiki/userdb file. --[[Paweł|ptecza]] >>>> BTW, have you had a sleep this night? ;) Here, in Poland, the time is >>>> 1:30 PM (in your timezone is 7:30 AM), but we're just discussing for ->>>> about 3 hours... --[[Paweł|ptecza]]
\ No newline at end of file +>>>> about 3 hours... --[[Paweł|ptecza]] + +---- + +I would be quite interested in inheriting a banned users list from ikiwiki. More generally, the ikiwiki community might benefit from sharing ban lists amongst each other. Some way to achieve that as part of the existing work/data flows (git pulls etc.) would be interesting. That might require defining banned users in other places than just the setup file, though. -- [[Jon]] diff --git a/doc/branches.mdwn b/doc/branches.mdwn new file mode 100644 index 000000000..5149d79f9 --- /dev/null +++ b/doc/branches.mdwn @@ -0,0 +1,25 @@ +In order to refer to a branch in one of the [[git]] repositories, for +example when submitting a [[patch]], you can use the +[[templates/gitbranch]] template. For example: + + \[[!template id=gitbranch branch=yourrepo/amazingbranch author="\[[yourname]]"]] + +Branches that have been [[reviewed]] and need work will not be listed +here. + +Branches referred to in open [[bugs]] and [[todo]]: + +[[!inline pages="(todo/* or bugs/*) and link(/branches) and !link(bugs/done) +and !link(todo/done) and !*/*/*" show=0 archive=yes]] + +Long-lived branches in the main git repository: + +* `debian-stable` is used for updates to the old version included in + Debian's stable release, and `debian-testing` is used for updates to + Debian's testing release. (These and similar branches will be rebased.) +* `ignore` gets various branches merged to it that [[Joey]] wishes to ignore + when looking at everyone's unmerged changes. +* `pristine-tar` contains deltas that + [pristine-tar](http://kitenet.net/~joey/code/pristine-tar) + can use to recreate released tarballs of ikiwiki + diff --git a/doc/bugs.mdwn b/doc/bugs.mdwn index f634b6e78..f16a4f8e1 100644 --- a/doc/bugs.mdwn +++ b/doc/bugs.mdwn @@ -3,6 +3,10 @@ elsewhere. Link items to [[bugs/done]] when done. Also see the [Debian bugs](http://bugs.debian.org/ikiwiki). +There are [[!pagecount pages="bugs/* and !bugs/done and !bugs/discussion and +!link(patch) and !link(bugs/done) and !bugs/*/*" +feedpages="created_after(bugs/no_commit_mails_for_new_pages)"]] "open" bugs: + [[!inline pages="bugs/* and !bugs/done and !bugs/discussion and !link(patch) and !link(bugs/done) and !bugs/*/*" feedpages="created_after(bugs/no_commit_mails_for_new_pages)" diff --git a/doc/bugs/2.45_Compilation_error.mdwn b/doc/bugs/2.45_Compilation_error.mdwn index c69c2fc25..63147b656 100644 --- a/doc/bugs/2.45_Compilation_error.mdwn +++ b/doc/bugs/2.45_Compilation_error.mdwn @@ -189,3 +189,10 @@ Would you suggest I try rebuilding perl without this patch? Debian has a huge pe it's not straightforward for me to see if they do something similar to Arch. > I think Debian has a similar patch. + +--- + +[[done]] -- apparently this was a problem due to a distribution's +customisation to perl, or something. Seems to late now to track down what, +unfortunatly. And ikiwiki's Makefile no longer uses the "-libdir" switch +that seemed to trigger the bug. --[[Joey]] diff --git a/doc/bugs/404_plugin_and_lighttpd.mdwn b/doc/bugs/404_plugin_and_lighttpd.mdwn new file mode 100644 index 000000000..8508d0dcd --- /dev/null +++ b/doc/bugs/404_plugin_and_lighttpd.mdwn @@ -0,0 +1,45 @@ +Lighttpd apparently sets REDIRECT_STATUS=200 for the server.error-handler-404 page. This breaks the [[plugins/404]] plugin which checks this variable for 404 before processing the URI. It also doesn't seem to set REDIRECT_URL. + +> For what it's worth, the first half is <http://redmine.lighttpd.net/issues/1828>. 
+> One workaround would be to make this script your 404 handler: +> +> #!/bin/sh +> REDIRECT_STATUS=404; export REDIRECT_STATUS +> REDIRECT_URL="$SERVER_NAME$REQUEST_URI"; export REDIRECT_URL +> exec /path/to/your/ikiwiki.cgi "$@" +> +> --[[smcv]] + +I was able to fix my server to check the REQUEST_URI for ikiwiki.cgi and to continue processing if it was not found, passing $ENV{SEVER_NAME} . $ENV{REQUEST_URI} as the first parameter to cgi_page_from_404. However, my perl is terrible and I just made it work rather than figuring out exactly what to do to get it to work on both lighttpd and apache. + +This is with lighttpd 1.4.19 on Debian. + +> /cgi-bin/ikiwiki.cgi?do=goto also provides redirection in the same way, +> if that's any help? You might need to set the lighttpd 404 handler to +> that, then compose REDIRECT_URL from other variables if necessary. +> +> I originally wrote the plugin for Apache; [[weakish]] contributed the +> lighttpd docs and might know more about how to make it work there. +> --[[smcv]] + +>> As I said, I got it working for me, but somebody who knows perl should probably look at it with the aim of making it work for everyone. +>> I considered having lighttpd construct a proper url for the 404 redirect itself, but I don't know if it can do something like that or not. +>> For what it's worth, here's the change I made to the module: + + sub cgi ($) { + my $cgi=shift; + if ($ENV{REQUEST_URI} !~ /ikiwiki\.cgi/) { + my $page = cgi_page_from_404( + Encode::decode_utf8($ENV{SERVER_NAME} . $ENV{REQUEST_URI}), + $config{url}, $config{usedirs}); + IkiWiki::Plugin::goto::cgi_goto($cgi, $page); + } + + # if (exists $ENV{REDIRECT_STATUS} && + # $ENV{REDIRECT_STATUS} eq '404') { + # my $page = cgi_page_from_404( + # Encode::decode_utf8($ENV{REDIRECT_URL}), + # $config{url}, $config{usedirs}); + # IkiWiki::Plugin::goto::cgi_goto($cgi, $page); + # } + } diff --git a/doc/bugs/Building_a_sidebar_does_not_regenerate_the_subpages.mdwn b/doc/bugs/Building_a_sidebar_does_not_regenerate_the_subpages.mdwn index 596719a8b..419292930 100644 --- a/doc/bugs/Building_a_sidebar_does_not_regenerate_the_subpages.mdwn +++ b/doc/bugs/Building_a_sidebar_does_not_regenerate_the_subpages.mdwn @@ -6,4 +6,3 @@ If sandbox/page.mdwn has been generated and sandbox/sidebar.mdwn is created, the # adding a new sidebar page. So adding such a page # currently requires a wiki rebuild. add_depends($page, $sidebar_page); - diff --git a/doc/bugs/Cannot_inline_pages_with_apostrophes_in_title.mdwn b/doc/bugs/Cannot_inline_pages_with_apostrophes_in_title.mdwn index 7daf52f2a..3e1fe823e 100644 --- a/doc/bugs/Cannot_inline_pages_with_apostrophes_in_title.mdwn +++ b/doc/bugs/Cannot_inline_pages_with_apostrophes_in_title.mdwn @@ -3,3 +3,5 @@ page produces nothing. It looks like the inline plugin is failing to do the translation from apostrophe to `_39_` that other parts of the system do, so although one can make wikilinks to such pages and have them detected as existing (for instance, by the conditional plugin), inline looks in the wrong place and doesn't see the page. 
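As a reference point for the escaping being discussed, the rule is easy to reproduce outside ikiwiki. The following is only a minimal standalone sketch that mimics the default `wiki_file_chars` set (`-[:alnum:]+/.:_`); it is not a call into ikiwiki's own code:

    #!/usr/bin/perl
    # Standalone sketch of ikiwiki-style page-name escaping: characters
    # outside the allowed set become __<ord>__, and spaces become _.
    use strict;
    use warnings;

    my $chars = "-[:alnum:]+/.:_";    # default wiki_file_chars
    for my $title (@ARGV) {
        (my $page = $title) =~ s/([^$chars])/$1 eq ' ' ? '_' : '__'.ord($1).'__'/eg;
        print "$title -> $page\n";
    }

Run on a title such as `don't panic` it prints `don__39__t_panic`.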
> I can't reproduce that (btw, an apostrophe would be `__39__`) --[[Joey]] + +[[done]] diff --git a/doc/bugs/Checksum_errors_on_the_pristine-tar_branch.mdwn b/doc/bugs/Checksum_errors_on_the_pristine-tar_branch.mdwn new file mode 100644 index 000000000..471dad98c --- /dev/null +++ b/doc/bugs/Checksum_errors_on_the_pristine-tar_branch.mdwn @@ -0,0 +1,112 @@ +I'm in the process of installing ikiwiki on my home page (hooray), and wants to have the newest stable version available. I suppose that's the one on the `pristine-tar` branch. + +> You can check out the latest released version with: +> +> git tag # outputs a list of tags +> git checkout 3.20110124 # or use the latest one, if different +> +> If you're using git already, there's no need to use pristine-tar, +> unless you particularly want a tarball for some reason. +> +> Downloading the tarball from Debian is the other recommended way to +> [[download]] the source code. --[[smcv]] + +>> Thanks for your responses, smcv. I'll use that method and install the newest version when I'm more familiar with the way ikiwiki works. For now I'm using version 3.20100122 installed with apt-get. Works great so far, but I'm looking forward to the new install. -- [[sunny256]] 2011-02-22 19:30+0100 + +But I'm unable to recreate the newest `.tar` file, in fact there's errors in all these `.tar.gz` files on that branch: + +* `ikiwiki_2.20.tar.gz` +* `ikiwiki_2.30.tar.gz` +* `ikiwiki_2.31.1.tar.gz` +* `ikiwiki_2.46.tar.gz` +* `ikiwiki_2.47.tar.gz` +* `ikiwiki_2.48.tar.gz` +* `ikiwiki_2.49.tar.gz` +* `ikiwiki_2.50.tar.gz` +* `ikiwiki_2.51.tar.gz` +* `ikiwiki_2.62.1.tar.gz` +* `ikiwiki_2.62.tar.gz` +* `ikiwiki_3.20101129.tar.gz` +* `ikiwiki_3.20101201.tar.gz` +* `ikiwiki_3.20101231.tar.gz` +* `ikiwiki_3.20110105.tar.gz` +* `ikiwiki_3.20110122.tar.gz` +* `ikiwiki_3.20110123.tar.gz` +* `ikiwiki_3.20110124.tar.gz` + +The operation fails on these files with a "Checksum validation failed" error from `xdelta`(1). The `pristine-tar`(1) version is 1.00, installed with `apt-get` on Ubuntu 10.04.2 LTS. Is this version too old, or are there some errors on this branch? + +> I get similar errors on Debian unstable, but not on all of the same versions; +> for instance, my `ikiwiki_3.20110124.tar.gz` is OK. In some cases, xdelta +> complains, but the tarball is produced successfully. However, I do see actual +> failures for 2.62 and 2.62.1, for instance. --[[smcv]] + +> Yes, on Debian unstable I got failures on only old ones, but not in +> contiguous blocks: --[[Joey]] +> +> ikiwiki_2.20.tar.gz +> ikiwiki_2.30.tar.gz +> ikiwiki_2.31.1.tar.gz +> ikiwiki_2.46.tar.gz +> ikiwiki_2.47.tar.gz +> ikiwiki_2.48.tar.gz +> ikiwiki_2.49.tar.gz +> ikiwiki_2.50.tar.gz +> ikiwiki_2.51.tar.gz +> ikiwiki_2.62.1.tar.gz +> ikiwiki_2.62.tar.gz +> +> Probably what would help debug this problem is if someone can +> reproduce with one or more of the other ones that do **not** fail +> for me, pass `-dk` to pristine-tar, and send me a copy of its temp directory +> (joey@kitenet.net), and the versions of pristine-tar, tar, gzip. +> Then I can compare the good and bad recreated +> tarballs and identify the difference. Or pass them to the tar developers, +> who have helped before. +> +> The only cause that I can think of is that perhaps tar's output +> has changed compared with the version used to create those. The +> only tar output change I know of involved filenames that were +> exactly 100 bytes long -- and pristine-tar 1.11 works around that +> when run with tar 1.25-2 on Debian. 
FWIW, I am only seeing +> this in ikiwiki's pristine-tar info, not other packages'. +> (Checked all of debhelper's and alien's and etckeeper's +> and pristine-tar's tarballs.) --[[Joey]] +> +>> It looks as though I only get the same failures as you, so that's no help +>> (reassuring, though, since we're presumably both running recent Debian). +>> sunny256's failure cases might just result from the older tar and pristine-tar +>> on Ubuntu 10.04? --[[smcv]] + +>>> Yes, I can reproduce the same failures sunny256 saw using Debian oldstable. Once I +>>> upgrade pristine-tar and tar, it goes away, so I think it is the 100 +>>> byte filename bug affecting those. +>>> +>>> As to the ones we all see fail, I dunno what it is, but probably +>>> has to do with some kind of historical issue in the versions of +>>> pristine-tar/tar used to create them. We may never know what went wrong +>>> there. --[[Joey]] [[done]] + +A complete output of the "pristine-tar checkout" of all files is stored on <https://gist.github.com/836720> . + +For now, I'll download the `.tar.gz` from <http://packages.debian.org/unstable/source/ikiwiki>, or maybe install `ikiwiki_3.20110124_all.deb`. Would you recommend using that `.deb` file on Ubuntu 10.04.2 LTS, or is it Debian-specific? -- [[sunny256]] 2011-02-21 08:42+0100 + +> The .deb from Debian unstable is likely to work on Ubuntu; I've +> generally been able to compile snapshots on Debian unstable and +> install them onto Debian lenny (older than that Ubuntu release) +> without modification. If in doubt, build it from source. --[[smcv]] + +> > The .deb file `ikiwiki_3.20110124_all.deb` from Debian unstable seems to +> > work great. I'm now the happy user of the newest stable version, yay. There +> > were some errors or warnings, though. This is the first one: + +> > > `You are overwriting a locally defined method (finished) with an accessor +> > > at /usr/lib/perl5/Moose/Meta/Attribute.pm line 570` + +> > Along with loads of other suspicious stuff. Have posted the whole output at +> > <https://gist.github.com/842789>. I'll dig around a bit in the source to +> > see if there's something I need to worry about. It looks good so far. +> > -- [[sunny256]] <small>2011-02-24 20:27Z</small> + +> > > Looks like a bug in [[!cpan Net::Amazon::S3::Client::Bucket]] or in something +> > > it uses, rather than in ikiwiki itself. --[[smcv]] diff --git a/doc/bugs/Comments_dissapeared.mdwn b/doc/bugs/Comments_dissapeared.mdwn new file mode 100644 index 000000000..787f18c98 --- /dev/null +++ b/doc/bugs/Comments_dissapeared.mdwn @@ -0,0 +1,69 @@ +Although I have comments enabled and I have been using them successfully for ages now, I've come to notice that they have stopped working in the last week or two. + +I am running version 3.20100312 with the following configuration: + +<http://static.natalian.org/2010-03-27/natalian.txt> + +In my (HTML5 modified page.tmpl) it doesn't seem to enter the "TMPL_IF COMMENTS" block anymore. I tried the stock page.tmpl and they didn't seem to work either, so the variable name hasn't changed has it? + +Any other ideas? With thanks, + + comments_pagespec => 'archives/* and !*/Discussion', + +> Your setup file only allows comments to pages under archives. That +> seems unlikely to be right, so I guess it is causing your problem. +> --[[Joey]] + +That's the only place where I want comments. <http://natalian.org/archives/> +Has the pagespec changed? Is it `archives/*/*` or something like that? + +It worked just fine with this configuration. 
I swear I have not modified it. :) -- [[Kai Hendry]] + +> No changes that I can think of. 'archives/*' will match *all* pages under +> archives. Anyway, I can see in your site's rss feed that comments are +> enabled for posts, since they have comments tags there. And +> in fact I see comments on eg +> <http://natalian.org/archives/2010/03/25/BBC_News_complaints/>. +> +> So I suspect you have simply not rebuilt your wiki after making some +> change that fixed the comments, and so only newer pages are getting them. +> --[[Joey]] + +I have tried rebuilding on my squeeze system and still comments don't appear. Any clues how to debug this? +<http://natalian.org/comments/> + +I was worried is was due to a time skew problem I was experiencing on my VPS in the last month, though the time is right now and still comments do not appear on blog posts like <http://natalian.org/archives/2010/03/25/BBC_News_complaints/> + +# Debugging templates + +`sudo apt-get install libhtml-template-compiled-perl` + + hendry@webconverger templates$ cat test-template.perl + #!/usr/bin/perl + use HTML::Template::Compiled; + local $HTML::Template::Compiled::DEBUG = 1; + my $htc = HTML::Template::Compiled->new( + filename => "$ARGV[0]", + ); + eval { + print $htc->output; + }; + if ($@) { + # reports as text + my $msg = $htc->debug_code; + # reports as a html table + my $msg_html = $htc->debug_code('html'); + } + hendry@webconverger templates$ ./test-template.perl page.tmpl + Missing closing tag for 'IF' atend of page.tmpl line 159 + + +I think the problem was before that it was `<TMPL_IF COMMENTS>` and now it is `<TMPL_IF NAME="COMMENTS">` ? + + + +# Solved + +A merge with the templates in master with my [html5](http://git.webconverger.org/?p=ikiwiki;a=shortlog;h=refs/heads/html5) branch looks like it has solved the problem. Also see [[bugs/html5_support]]. + +[[bugs/done]] diff --git a/doc/bugs/Error:_Your_login_session_has_expired._.mdwn b/doc/bugs/Error:_Your_login_session_has_expired._.mdwn index 046d6e10d..b993cd8e7 100644 --- a/doc/bugs/Error:_Your_login_session_has_expired._.mdwn +++ b/doc/bugs/Error:_Your_login_session_has_expired._.mdwn @@ -41,4 +41,6 @@ Whilst trying to edit http://hugh.vm.bytemark.co.uk/ikiwiki.cgi via OpenID. Any Thanks for you excellent analysis. The bug was due to old pre-3.0 **templates** laying about. After deleting them, ikiwiki defaults to its own templates. Clever. :-) +Great, this saved me big time! It is a google 1st hit. I had the same with accidentally using old templates. Thanks! --[[cstamas]] + [[bugs/done]] diff --git a/doc/bugs/Error:_no_text_was_copied_in_this_page_--_missing_page_dependencies.mdwn b/doc/bugs/Error:_no_text_was_copied_in_this_page_--_missing_page_dependencies.mdwn new file mode 100644 index 000000000..0082eed4d --- /dev/null +++ b/doc/bugs/Error:_no_text_was_copied_in_this_page_--_missing_page_dependencies.mdwn @@ -0,0 +1,46 @@ +That one has bitten me for some time; here is the minimal testcase. There is +also an equivalent (I suppose) problem when using another plugin, but I hope +it's enough to track it down for this one. + + $ tar -xj < [bug-dep_order.tar.bz2](http://schwinge.homeip.net/~thomas/tmp/bug-dep_order.tar.bz2) + $ cd bug-dep_order/ + $ ./render_locally + [...] + $ find "$PWD".rendered/ -print0 | xargs -0 grep 'no text was copied' + $ [no output] + $ touch news/2010-07-31.mdwn + $ ./render_locally + refreshing wiki.. 
+ scanning news/2010-07-31.mdwn + building news/2010-07-31.mdwn + building news.mdwn, which depends on news/2010-07-31 + building index.mdwn, which depends on news/2010-07-31 + done + $ find "$PWD".rendered/ -print0 | xargs -0 grep 'no text was copied' + /home/thomas/tmp/hurd-web/bug-dep_order.rendered/news.html:<p>[[!paste <span class="error">Error: no text was copied in this page</span>]]</p> + /home/thomas/tmp/hurd-web/bug-dep_order.rendered/news.html:<p>[[!paste <span class="error">Error: no text was copied in this page</span>]]</p> + +This error shows up only for *news.html*, but not in *news/2010-07-31* or for +the aggregation in *index.html* or its RSS and atom files. + +--[[tschwinge]] + +> So the cutpaste plugin, in order to support pastes +> that come before the corresponding cut in the page, +> relies on the scan hook being called for the page +> before it is preprocessed. +> +> In the case of an inline, this doesn't happen, if +> the page in question has not changed. +> +> Really though it's not just inline, it's potentially anything +> that preprocesses content. None of those things guarantee that +> scan gets re-run on it first. +> +> I think cutpaste is going beyond the intended use of scan hooks, +> which is to gather link information, not do arbitrary data collection. +> Requiring scan be run repeatedly could be a lot more work. +> +> Using `%pagestate` to store the cut content when scanning would be +> one way to fix this bug. It would mean storing potentially big chunks +> of page content in the indexdb. [[done]] --[[Joey]] diff --git a/doc/bugs/Exception:_Unknown_function___96__this__39___.mdwn b/doc/bugs/Exception:_Unknown_function___96__this__39___.mdwn new file mode 100644 index 000000000..189ba740f --- /dev/null +++ b/doc/bugs/Exception:_Unknown_function___96__this__39___.mdwn @@ -0,0 +1,70 @@ +I'm very excited to try out ikiwiki, since it should fit my purposes extremely well, but I'm having trouble with the search plugin. I'm pretty sure that right after I installed ikiwiki and needed dependencies, the search plugin was working fine. However, now when I try to use search, I get "Exception: Unknown function `this'" error on a blank page. I'm not sure how I should go about debugging this issue - my server's (I use Lighttpd 1.4.22) error log has no mention of the exception and there's nothing in /var/log/syslog either. + +What might be causing this exception and how I might go about debugging exceptions? + +> Appears to be coming from your xapian omega cgi binary. If you +> run `strings /usr/lib/cgi-bin/omega/omega` you can see it has +> "Exception: " in it, and I have found some similar (but not identical) +> error messages from xapian in a web search. +> +> I don´t know what to suggest, other than upgrade/downgrade/reinstall +> xapian-omega, and contacting the xapian developers for debugging. +> You could try rebuilding your wiki in case it is somehow +> caused by a problem with the xapian database. Failing everything, you +> could switch to [[google_search_plugin|plugins/google]]. --[[Joey]] + +>> Thanks, Joey. With your help I was able to figure out what was wrong. It's a fun little bug (or feature): the title of my wiki had string `$this` in title and that's what was causing the omega binary to choke. My wiki's title was inserted without escaping into the query template used by omega. Omega treated `$this` in the title as a function name and threw an exception because no such function was defined. 
To avoid this behavior, I used an html entity in the title, so `$this` became `$this`. I don't think that the wiki title should be inserted into the template without escaping - it can produce an error that's not trivial to debug. If users want to modify the html in the title, they should be editing respective templates, not typing html in the wiki title input. What do you think? --[[dkobozev]] + +>>> Sounds like a bug in omega, and one that probably would affect other +>>> users of omega too. Ikiwiki could work around it by pre-escaping +>>> data before passing it to xapian. I have not quite managed to reproduce it though; +>>> tried setting a page title to '$this' and 'foo $this'. +>>> That's with version 1.0.18 of omega. +>>> --[[Joey]] + +>>>> I tried it with both omega 1.0.13 and omega 1.0.18 and the issue is present in both. If I view the contents of {$srcdir}/.ikiwiki/xapian/templates/query, I can see that the wiki title is inserted verbatim and there are calls to `$setmap`, `$set` and `$def` etc in the template. --[[dkobozev]] + +>>>>> I don't see how that's relevant. It would help if you showed me +>>>>> exactly something that could be inserted into a page to cause the +>>>>> problem. --[[Joey]] + +>>>>>> Correct me if I'm wrong: ikiwiki generates an Omega template from its own templates, such as searchquery.tmpl and puts it into {$srcdir}/.ikiwiki/xapian/templates/query. Omega has its own template syntax, where function names are prefixed with dollar signs (`$`). So, when I call my wiki `$foobar`, ikiwiki generates an Omega template that looks like this snippet: + + <div id="container"> + <div class="pageheader"> + <div class="header"> + <span> + <a href="http://example.com">$foobar</ a>/search + </span> + </div> + </div> <!-- .pageheader --> + + <div id="content"> + $setmap{prefix,title,S} + $setmap{prefix,link,XLINK} + $set{thousand,$.}$set{decimal,.}$setmap{BN,,Any Country,uk,England,fr,France} + ${ + $def{PREV, + $if{$ne{$topdoc,0},<INPUT TYPE=image NAME="<" ALT="<" + SRC="/images/xapian-omega/prev.png" BORDER=0 HEIGHT=30 WIDTH=30>, + <IMG ALT="" SRC="/images/xapian-omega/prevoff.png" HEIGHT=30 WIDTH=30>} + +>>>>>> So `$foobar` clashes with Omega's template tags. Does this help? + +>>>>>>> Ahh. I had somehow gotten it into my head that you were talking +>>>>>>> about the title of a single page, not of the whole wiki. But +>>>>>>> you were clear all along it was the wiki title. Sorry for +>>>>>>> misunderstanding. I've put in a complete fix for this problem. +>>>>>>> if this was in [[bugs]], I'd close it. :) --[[Joey]] + +>>>>>>>> Rather than escaping `$` as an HTML entity, it would be more natural +>>>>>>>> to escape it as `$$` (since you are escaping it for Omega, not for +>>>>>>>> the web browser. +>>>>>>>> +>>>>>>>> Also if ikiwiki can put arbitrary text inside the parameters of an +>>>>>>>> OmegaScript command, you should also escape `{`, `}` and `,` as +>>>>>>>> `$(`, `$)` and `$.`. It's only necessary to do so inside the +>>>>>>>> parameters of a command, but it will work and be easier to escape +>>>>>>>> them in any substituted text. 
--OllyBetts + +[[done]] diff --git a/doc/bugs/External_link:_underscore_conversion.mdwn b/doc/bugs/External_link:_underscore_conversion.mdwn new file mode 100644 index 000000000..6ea421d84 --- /dev/null +++ b/doc/bugs/External_link:_underscore_conversion.mdwn @@ -0,0 +1,25 @@ +Hi, + +found one strange thing here: + +If i enter a link like this + + [#Wikipedia:Mollison]: <http://www.tagari.com/bills_journal> + +the underscore appears like this (i inserted a space in the undercore-string to make it 'visible'): + + <a href="http://www.tagari.com/billsb14a7b8059d9c05 5954c92674ce60032journal">http://www.tagari.com/billsb14a7b8059d9c05 5954c92674ce60032journal</a> + +Am i doing something wrong? + +Thanks for your support and best wishes, +Tobias. + +> I believe you're hitting some kind of Markdown-processing but (so not +> strictly Ikiwiki related). Could you provide a minimal page source +> exhibiting the problem, and mention the exact nature of the processor +> you use? (Markdown, MultiMarkdown, pandoc, ...) --GB + +> Insertation of weird hashes into some output is a [known bug](http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=380212) in the old +> perl markdown. This is one of the main reasons why use of Text::Markdown +> instead is recommended. --[[Joey]] [[done]] diff --git a/doc/bugs/External_links_with_Creole.mdwn b/doc/bugs/External_links_with_Creole.mdwn new file mode 100644 index 000000000..3d800b04e --- /dev/null +++ b/doc/bugs/External_links_with_Creole.mdwn @@ -0,0 +1,3 @@ +When using Creole for markup, creating an external link appears to be impossible. Neither \[[Outside URL|http://example.com]] nor <<http://example.com>> nor \[Outside URL]\(http://example.com) work. The first gets rendered as a broken WikiLink, the second get eaten and the last is not parsed in anyway so you end up with that exact text in your page. + +I'd have made this as a Creole page as a practical demonstration, but that doesn't seem possible here. Here's a page with an example: <https://www.icanttype.org//demo/CreoleExternalLinks> diff --git a/doc/bugs/Git:_changed_behavior_w.r.t._timestamps.mdwn b/doc/bugs/Git:_changed_behavior_w.r.t._timestamps.mdwn new file mode 100644 index 000000000..164e62075 --- /dev/null +++ b/doc/bugs/Git:_changed_behavior_w.r.t._timestamps.mdwn @@ -0,0 +1,214 @@ +After some months, I just updated my local ikiwiki sources, and rebuilt +the Hurd web pages, <http://git.savannah.gnu.org/cgit/hurd/web.git/>. + +I was confused, having switched to the new automatic (thanks!) --gettime +mechanism, why on some pages the timestamps had changed compared to my +previous use of --getctime and setting files' mtimes (using a script) +according to the last Git commit. For example: + +community/weblogs/ArneBab/2008-08-02-gnu_hurd_and_x.html + +old: + + Last edited <span class="date">2008-09-11 18:11:53 UTC</span> + <!-- Created <span class="date">2008-09-11 17:47:08 UTC</span> --> + +new: + + Last edited <span class="date">2008-09-11 18:12:22 UTC</span> + <!-- Created <span class="date">2008-09-11 17:47:50 UTC</span> --> + + +I had a look at what git.pm is doing, and began to manually replay / +investigate: + + $ git log --pretty=fuller --name-only --relative -- community/weblogs/ArneBab/2008-08-02-gnu_hurd_and_x.mdwn + commit 8f1b97bfe45b2f173e3a7d55dee226a9e289a695 + Author: arnebab <arne_bab@web.de> + AuthorDate: Thu Sep 11 20:11:53 2008 +0200 + Commit: arnebab <arne_bab@web.de> + CommitDate: Thu Sep 11 20:11:53 2008 +0200 + + Added a link to the X.org guide in this wiki. 
+ + community/weblogs/ArneBab/2008-08-02-gnu_hurd_and_x.mdwn + + commit 3ef8b7d80d80572c436c4c60c71879bc74409816 + Author: arnebab <arne_bab@web.de> + AuthorDate: Thu Sep 11 19:47:08 2008 +0200 + Commit: arnebab <arne_bab@web.de> + CommitDate: Thu Sep 11 19:47:08 2008 +0200 + + Minor update on the enty trying to get X working -> 'watch this place for updates' + + community/weblogs/ArneBab/2008-08-02-gnu_hurd_and_x.mdwn + +OK, these are my old dates. + + $ git log --pretty=format:%ci --name-only --relative -- community/weblogs/ArneBab/2008-08-02-gnu_hurd_and_x.mdwn + 2008-09-11 20:11:53 +0200 + community/weblogs/ArneBab/2008-08-02-gnu_hurd_and_x.mdwn + + 2008-09-11 19:47:08 +0200 + community/weblogs/ArneBab/2008-08-02-gnu_hurd_and_x.mdwn + + $ git log --pretty=format:%ct --name-only --relative -- community/weblogs/ArneBab/2008-08-02-gnu_hurd_and_x.mdwn + 1221156713 + community/weblogs/ArneBab/2008-08-02-gnu_hurd_and_x.mdwn + + 1221155228 + community/weblogs/ArneBab/2008-08-02-gnu_hurd_and_x.mdwn + + $ date -d @1221156713 + Thu Sep 11 18:11:53 UTC 2008 + $ date -d @1221155228 + Thu Sep 11 17:47:08 UTC 2008 + +That's all consistent. + + +But: + + $ perl -le 'use Storable; my $index=Storable::retrieve("indexdb"); use Data::Dumper; print Dumper $index' + [...] + 'community/weblogs/ArneBab/2008-08-02-gnu_hurd_and_x.mdwn' => { + 'ctime' => '1221155270', + 'dest' => [ + 'community/weblogs/ArneBab/2008-08-02-gnu_hurd_and_x.html' + ], + 'typedlinks' => { + 'tag' => {} + }, + 'mtime' => 1221156742, + 'depends_simple' => { + 'sidebar' => 1 + }, + 'links' => [ + 'community/weblogs/ArneBab/2008-08-02-gnu_hurd_and_x/discussion', + 'Hurd/DebianXorg' + ], + 'state' => { + [...] + + $ date -d @1221156742 + Thu Sep 11 18:12:22 UTC 2008 + $ date -d @1221155270 + Thu Sep 11 17:47:50 UTC 2008 + +That's different, and it matches what the new ikiwiki writes into the +HTML file. + + +Back to Git again, this time without specifying the file: + + $ git log --pretty=format:%ct --name-only --relative + [...] + 1221255713 + 1221255655 + unsorted/PortingIssues.mdwn + + 1221156742 [Thu Sep 11 18:12:22 UTC 2008] + 1221156713 [Thu Sep 11 18:11:53 UTC 2008] + community/weblogs/ArneBab/2008-08-02-gnu_hurd_and_x.mdwn + + 1221156267 + 1221156235 + index.mdwn + + 1221156122 + 1221156091 + index.mdwn + + 1221155942 + 1221155910 + index.mdwn + + 1221155270 [Thu Sep 11 17:47:50 UTC 2008] + 1221155228 [Thu Sep 11 17:47:08 UTC 2008] + community/weblogs/ArneBab/2008-08-02-gnu_hurd_and_x.mdwn + + 1221154986 + community/gsoc.mdwn + community/gsoc/project_ideas.mdwn + + 1221147244 + whatsnew.html + [...] + +Aha! + +... and some more detail: + + $ git log --pretty=fuller --name-only --relative + [...] 
+ commit e4e89e1683012c879012522105a3471a00714613 + Author: Samuel Thibault <samuel.thibault@ens-lyon.org> + AuthorDate: Fri Sep 12 23:40:55 2008 +0200 + Commit: Samuel Thibault <samuel.thibault@ens-lyon.org> + CommitDate: Fri Sep 12 23:40:55 2008 +0200 + + MSG_NOSIGNAL and IPV6_PKTINFO got fixed + + unsorted/PortingIssues.mdwn + + commit c389fae98dff86527be62f895ff7272e4ab1932c + Merge: 0339e3e 8f1b97b + Author: GNU Hurd wiki engine <web-hurd@gnu.org> + AuthorDate: Thu Sep 11 18:12:22 2008 +0000 + Commit: GNU Hurd wiki engine <web-hurd@gnu.org> + CommitDate: Thu Sep 11 18:12:22 2008 +0000 + + Merge branch 'master' of wiki@192.168.10.50:wiki + + commit 8f1b97bfe45b2f173e3a7d55dee226a9e289a695 + Author: arnebab <arne_bab@web.de> + AuthorDate: Thu Sep 11 20:11:53 2008 +0200 + Commit: arnebab <arne_bab@web.de> + CommitDate: Thu Sep 11 20:11:53 2008 +0200 + + Added a link to the X.org guide in this wiki. + + community/weblogs/ArneBab/2008-08-02-gnu_hurd_and_x.mdwn + [...] + +So, merges are involved there. + +What (the new) ikiwiki code does, is use the timestamp when the merge was +done instead of the timestamp when the commit was done. Is this +intentional? Otherwise I could supply a patch. + +--[[tschwinge]] + +> In order to be nice and fast, the git backend runs git log once +> and records data for all files. Rather than looking at the log for a +> given file. So amoung other things, it does not follow renames. +> +> AFAICS, git log only shows merges modifying files if it was a conflicted +> merge. As the file is then actually modified to resolve the merge +> I think it makes sense to count the merge as the last modification in +> that case. --[[Joey]] + +>> That'd be reasonable, but `git log` will also show merges that are not +>> conflicting (as in my case). + +>>> Actually when displaying a merge, `git log --stat` only lists files that +>>> were actually modified in a new way as part of the merge resolution. +>>> Ie, if the merge resolution only joins together some of the parent +>>> hunks, the file is not listed as having been modified. +>>> +>>> So, no, ikiwiki's use of git log will not show files modified in +>>> non-conflicting merges. +>>> --[[Joey]] + +>> Yet, I'm not totally disagreeing with your choice. With this `git +>> log` invocation, you're not able to tell from its output whether a +>> conflict was resolved or not. + +>> Also, it's a bit like the *should we use the **author timestamp** or +>> **commit timestamp*** discussion. Your code will always use the +>> latest timestamp. + +>> I guess I'll get my head wrapped around that, and it's fine, so this is +>> [[done]]. + +>> --[[tschwinge]] diff --git a/doc/bugs/HTML_for_parentlinks_makes_theming_hard.mdwn b/doc/bugs/HTML_for_parentlinks_makes_theming_hard.mdwn index 0cbef403d..bc934d109 100644 --- a/doc/bugs/HTML_for_parentlinks_makes_theming_hard.mdwn +++ b/doc/bugs/HTML_for_parentlinks_makes_theming_hard.mdwn @@ -39,7 +39,7 @@ I understand the logic behind doing this (on the front page it is the title as w I'll just modify the templates for my own site but I thought I'd report it as a bug in the hopes that it will be useful to others. Cheers, -Adam. +[[AdamShand]]. ---- > I just noticed that it's also different on the comments, preferences and edit pages. I'll come up with a diff and see what you guys think. -- Adam. 
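Returning to the timestamp discussion a few paragraphs up: the single-pass `git log` strategy described there can be illustrated in a few lines. This is only a sketch of the approach (parse `--pretty=format:%ct --name-only` output, newest commit wins per file), not ikiwiki's actual `git.pm` code:

    #!/usr/bin/perl
    # Sketch: collect a last-changed (commit) time for every file from one
    # `git log` pass; an all-digit line is treated as a %ct timestamp.
    use strict;
    use warnings;

    my %mtime;
    open(my $log, '-|', qw(git log --pretty=format:%ct --name-only --relative))
        or die "cannot run git log: $!";
    my $ct;
    while (my $line = <$log>) {
        chomp $line;
        next unless length $line;
        if ($line =~ /^\d+$/) {
            $ct = $line;    # commit timestamp for the file names that follow
        }
        elsif (!exists $mtime{$line}) {
            $mtime{$line} = $ct;    # output is newest-first, so first seen wins
        }
    }
    close $log;
    printf "%d %s\n", $mtime{$_}, $_ for sort keys %mtime;

Run from the top of a working tree it prints one `timestamp filename` pair per tracked file that appears in the log.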
diff --git a/doc/bugs/Highlight_extension_uses_hard_coded_paths.mdwn b/doc/bugs/Highlight_extension_uses_hard_coded_paths.mdwn new file mode 100644 index 000000000..275661fb8 --- /dev/null +++ b/doc/bugs/Highlight_extension_uses_hard_coded_paths.mdwn @@ -0,0 +1,3 @@ +The [[plugins/highlight]] plugin hard codes some paths up the top of the plugin. This means that you need to edit the ikiwiki source if you have highlight installed in a non-standard location (e.g. if you have done a user-level install of the highlight package). + +> configurable now, [[done]] --[[Joey]] diff --git a/doc/bugs/Links_to_missing_pages_should_always_be_styled.mdwn b/doc/bugs/Links_to_missing_pages_should_always_be_styled.mdwn new file mode 100644 index 000000000..73213209a --- /dev/null +++ b/doc/bugs/Links_to_missing_pages_should_always_be_styled.mdwn @@ -0,0 +1,5 @@ +When the CGI URL is not defined, links to missing pages appear as plain, unstyled text. I think the 'createlink' span should always wrap this text, even when the actual question mark linking to the CGI for the create action is missing. This ensures consistent styling regardless of whether the CGI is available or not (and is thus useful for example when the same wiki has clones with the CGI link and clones without). + +A proposed patch is available [on my ikiwiki clone](http://git.oblomov.eu/ikiwiki/patch/290d1b498f00f63e6d41218ddb76d87e68ed5081) + +[[!tag patch cgi done]] diff --git a/doc/bugs/More_permission_checking.mdwn b/doc/bugs/More_permission_checking.mdwn new file mode 100644 index 000000000..6cd6cb0ec --- /dev/null +++ b/doc/bugs/More_permission_checking.mdwn @@ -0,0 +1,17 @@ +I'm often confused about permissions and I wish ikiwiki could stamp it's foot down and ensure all the permissions are correctly (canonically?) setup. + +I keep ending up having to `sudo chown -R :www-data` and `sudo chmod -R g+w` on srcdir, destdir. I'm never quite sure what is the best practice for the srcdirs' `/srv/git/` is. Currently everything looks like `hendry:www-data` with ug+rw. + +I think I've triggered these problems by (not thinking and) running `ikiwiki --rebuild --setup /home/hendry/.ikiwiki/mywiki.setup` as my user. + +I don't know if there can be some lookup with `/etc/ikiwiki/wikilist`. Though shouldn't everything be under the `www-data` group in reality? + +Also when I use `sudo ikiwiki -setup /etc/ikiwiki/auto.setup`, I think I create a ton of problems for myself since everything is created as the root user, right? And `/etc/ikiwiki/wikilist` doesn't seem to have the latest created wiki added. I have to reluctantly manually do this. + +> You should never make files be owned by www-data user or group. +> Ikiwiki is designed to run as a single user, which can just +> be your login user; all files should be owned by that user, the +> ikiwiki.cgi and other wrappers suid to that user. And then there are +> never any permissions problems. --[[Joey]] + +[[done]] diff --git a/doc/bugs/New_comments_are_not_always_displayed__59___need_page_refresh_to_appear.mdwn b/doc/bugs/New_comments_are_not_always_displayed__59___need_page_refresh_to_appear.mdwn new file mode 100644 index 000000000..ac079f5b8 --- /dev/null +++ b/doc/bugs/New_comments_are_not_always_displayed__59___need_page_refresh_to_appear.mdwn @@ -0,0 +1,35 @@ +I noticed this a few times in Google Chrome 12 (dev channel) a few times, already: + +I added a comment to + + http://git-annex.branchable.com/forum/performance_improvement:_git_on_ssd__44___annex_on_spindle_disk/ + +and left the page. 
Later, I revisited + + http://git-annex.branchable.com/forum/ + +and clicked on + + http://git-annex.branchable.com/forum/performance_improvement:_git_on_ssd__44___annex_on_spindle_disk/ + +My own comment did not appear. I pressed F5 and eh presto. + +My assumption is that ikiwiki does not tell Chrome to reload the page as the cache is stale. + + +Richard + +> There is some lurking bug with certian web browsers, web servers, or +> combination of the two that makes modifications to html files not +> always be noticed by web browsers. See +> [[bugs/firefox_doesn__39__t_want_to_load_updated_pages_at_ikiwiki.info]] +> see also <http://bugs.debian.org/588623>. +> +> On Branchable, we work around this problem with an apache configuration: +> «ExpiresByType text/html "access plus 0 seconds"» +> +> There seems to be no way to work around it in ikiwiki's generated html, +> aside from using the cache-control setting that is not allowed in html5. +> +> And, which browsers/web servers have the problem, and where the bug is, +> seems very hard to pin down. --[[Joey]] diff --git a/doc/bugs/Pages_with_non-ascii_characters_like_öäå_in_name_not_found_directly_after_commit.mdwn b/doc/bugs/Pages_with_non-ascii_characters_like_öäå_in_name_not_found_directly_after_commit.mdwn new file mode 100644 index 000000000..8fb09f9d6 --- /dev/null +++ b/doc/bugs/Pages_with_non-ascii_characters_like_öäå_in_name_not_found_directly_after_commit.mdwn @@ -0,0 +1,145 @@ +At least my setup on kapsi.fi always prints 404 Not Found after adding a page with non-ascii characters in name. But the page exists and is visible after the 404 with url encoding and the blog page is inlined correctly on the feed page. + +Apparently ikiwiki.info does not complain with 404. Should the character encoding be set in wiki config? + +Happens also after editing the page. Here's an example: + + * page name displayed in 404: http://mcfrisk.kapsi.fi/skiing/posts/Iso-Sy%F6te%20Freeride%202011%20Teaser.html?updated + * page name in the blog feed: http://mcfrisk.kapsi.fi/skiing/posts/Iso-Sy%C3%B6te%20Freeride%202011%20Teaser.html + +Difference is in the word Iso-Syöte. Pehaps also the browsers is part of +the game, I use Iceweasel from Debian unstable with default settings. + +> I remember seeing this problem twice before, and both times it was caused +> by a bug in the *web server* configuration. I think at least one case it was +> due to an apache rewrite rule that did a redirect and mangled the correct +> encoding. +> +> I recommend you check there. If you cannot find the problem with your web +> server, I recommend you get a http protocol dump while saving the page, +> and post it here for analysis. You could use tcpdump, or one of the +> browser plugins that allows examining the http protocol. --[[Joey]] + +Server runs Debian 5.0.8 but I don't have access to the Apache configs. Here's the tcp stream from wireshark without cookie data, page name is testiä.html. I guess page name is in utf-8 but in redirect after post it is given to browser with 8859-1. 
+ + POST /ikiwiki.cgi HTTP/1.1 + Host: mcfrisk.kapsi.fi + User-Agent: Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.1.16) Gecko/20110107 Iceweasel/3.5.16 (like Firefox/3.5.16) + Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8 + Accept-Language: en-us,en;q=0.5 + Accept-Encoding: gzip,deflate + Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7 + Keep-Alive: 300 + Connection: keep-alive + Referer: http://mcfrisk.kapsi.fi/ikiwiki.cgi + Cookie: XXXX + Content-Type: multipart/form-data; boundary=---------------------------138059850619952014921977844406 + Content-Length: 1456 + + -----------------------------138059850619952014921977844406 + Content-Disposition: form-data; name="_submitted" + + 2 + -----------------------------138059850619952014921977844406 + Content-Disposition: form-data; name="do" + + edit + -----------------------------138059850619952014921977844406 + Content-Disposition: form-data; name="sid" + + 93c956725705aa0bbdff98e57efb28f4 + -----------------------------138059850619952014921977844406 + Content-Disposition: form-data; name="from" + + + -----------------------------138059850619952014921977844406 + Content-Disposition: form-data; name="rcsinfo" + + 5419fbf402e685643ca965d577dff3dafdd0fde9 + -----------------------------138059850619952014921977844406 + Content-Disposition: form-data; name="page" + + testi.. + -----------------------------138059850619952014921977844406 + Content-Disposition: form-data; name="type" + + mdwn + -----------------------------138059850619952014921977844406 + Content-Disposition: form-data; name="editcontent" + + test + -----------------------------138059850619952014921977844406 + Content-Disposition: form-data; name="editmessage" + + + -----------------------------138059850619952014921977844406 + Content-Disposition: form-data; name="_submit" + + Save Page + -----------------------------138059850619952014921977844406 + Content-Disposition: form-data; name="attachment"; filename="" + Content-Type: application/octet-stream + + + -----------------------------138059850619952014921977844406-- + HTTP/1.1 302 Found + Date: Wed, 02 Feb 2011 19:45:49 GMT + Server: Apache/2.2 + Location: /testi%E4.html?updated + Content-Length: 0 + Keep-Alive: timeout=5, max=500 + Connection: Keep-Alive + Content-Type: text/plain + + GET /testi%E4.html?updated HTTP/1.1 + Host: mcfrisk.kapsi.fi + User-Agent: Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.1.16) Gecko/20110107 Iceweasel/3.5.16 (like Firefox/3.5.16) + Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8 + Accept-Language: en-us,en;q=0.5 + Accept-Encoding: gzip,deflate + Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7 + Keep-Alive: 300 + Connection: keep-alive + Referer: http://mcfrisk.kapsi.fi/ikiwiki.cgi + Cookie: XXXX + + HTTP/1.1 404 Not Found + Date: Wed, 02 Feb 2011 19:45:55 GMT + Server: Apache/2.2 + Content-Length: 279 + Keep-Alive: timeout=5, max=499 + Connection: Keep-Alive + Content-Type: text/html; charset=iso-8859-1 + + <!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN"> + <html><head> + <title>404 Not Found</title> + </head><body> + <h1>Not Found</h1> + <p>The requested URL /testi..html was not found on this server.</p> + <hr> + <address>Apache/2.2 Server at mcfrisk.kapsi.fi Port 80</address> + </body></html> + +Getting the pages has worked every time: + + GET /testi%C3%A4.html HTTP/1.1 + Host: mcfrisk.kapsi.fi + User-Agent: Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.1.16) Gecko/20110107 Iceweasel/3.5.16 (like Firefox/3.5.16) + Accept: 
text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8 + Accept-Language: en-us,en;q=0.5 + Accept-Encoding: gzip,deflate + Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7 + Keep-Alive: 300 + Connection: keep-alive + Cookie: XXXX + If-Modified-Since: Wed, 02 Feb 2011 19:45:54 GMT + If-None-Match: "1b518d-7c0-49b51e5a55c5f" + Cache-Control: max-age=0 + + HTTP/1.1 304 Not Modified + Date: Wed, 02 Feb 2011 20:01:43 GMT + Server: Apache/2.2 + Connection: Keep-Alive + Keep-Alive: timeout=5, max=500 + ETag: "1b518d-7c0-49b51e5a55c5f" diff --git a/doc/bugs/Patch:_Fix_error_in_style.css.mdwn b/doc/bugs/Patch:_Fix_error_in_style.css.mdwn new file mode 100644 index 000000000..3a160454e --- /dev/null +++ b/doc/bugs/Patch:_Fix_error_in_style.css.mdwn @@ -0,0 +1,37 @@ +[[!tag patch css]] +[[!template id=gitbranch branch=sunny256/css-fix author="[[sunny256]]"]] + +This trivial patch fixes an error in `styles.css` and is ready to be merged from the `css-fix` branch at `git://github.com/sunny256/ikiwiki.git` : + + From e3b5eab2971109d18332fe44fd396322bb148cfc Mon Sep 17 00:00:00 2001 + From: =?UTF-8?q?=C3=98yvind=20A.=20Holm?= <sunny@sunbase.org> + Date: Tue, 22 Feb 2011 18:14:21 +0100 + Subject: [PATCH] style.css: Replace obsolete -moz-outline-style property with outline-style + + The "-moz-outline-style" property generates an error at the W3C CSS + validator, saying the property doesn't exist. According to + <https://developer.mozilla.org/en/CSS/-moz-outline-style>, this property + is obsolete and the use of "outline-style" is preferred. + --- + doc/style.css | 2 +- + 1 files changed, 1 insertions(+), 1 deletions(-) + + diff --git a/doc/style.css b/doc/style.css + index 922b82a..fa413cf 100644 + --- a/doc/style.css + +++ b/doc/style.css + @@ -485,7 +485,7 @@ a.openid_large_btn:focus { + outline: none; + } + a.openid_large_btn:focus { + - -moz-outline-style: none; + + outline-style: none; + } + .openid_selected { + border: 4px solid #DDD; + -- + 1.7.4.1.55.gdca3d + +--[[sunny256]] 2011-02-22 20:11+0100 + +> [[Applied|done]]. --[[Joey]] diff --git a/doc/bugs/Perl_scripts_depend_on___47__usr__47__bin__47__perl.mdwn b/doc/bugs/Perl_scripts_depend_on___47__usr__47__bin__47__perl.mdwn new file mode 100644 index 000000000..d68d506f7 --- /dev/null +++ b/doc/bugs/Perl_scripts_depend_on___47__usr__47__bin__47__perl.mdwn @@ -0,0 +1,6 @@ +> On FreeBSD, perl defaults to installation in `/usr/local/bin/perl` since it is not a part of the base system. If the option to create symlinks in `/usr/bin` is not selected, > building and running ikiwiki will fail because the shebang lines use `#!/usr/bin/perl [args]`. Changing this to `#!/usr/bin/env -S perl [args]` fixes the issue. + +I think this should be a concern of ikiwiki's official FreeBSD port. + +At any rate, even if it is decided that ikiwiki should be fixed, then it is probably better to use +`$installbin/perl` from `-MConfig` and not the `env` hack. diff --git a/doc/bugs/Problems_with_graphviz.pm_plug-in.mdwn b/doc/bugs/Problems_with_graphviz.pm_plug-in.mdwn index c9f698158..bc80125ad 100644 --- a/doc/bugs/Problems_with_graphviz.pm_plug-in.mdwn +++ b/doc/bugs/Problems_with_graphviz.pm_plug-in.mdwn @@ -9,31 +9,12 @@ The graphviz.pm plug-in currently attempts to read PNG data in UTF-8 mode, which It also generates image URLs relative to the page being rendered, which means the URLs wont work when previewing a graph from the CGI script. 
+(preview bug split to [[Problems_with_graphviz.pm_plug-in_previews]]) + >> Here is an updated patch againt ikiwiki-2.5: >>> [[Applied|done]], thanks. --[[Joey]] - --- IkiWiki/Plugin/graphviz.pm.orig 2007-07-27 11:35:05.000000000 +0200 - +++ IkiWiki/Plugin/graphviz.pm 2007-07-27 11:36:02.000000000 +0200 - @@ -69,7 +69,12 @@ sub render_graph (\%) { - } - } - - - return "<img src=\"".urlto($dest, $params{page})."\" />\n"; - + if ($params{preview}) { - + return "<img src=\"".urlto($dest, "")."\" />\n"; - + } - + else { - + return "<img src=\"".urlto($dest, $params{page})."\" />\n"; - + } - } - - sub graph (@) { - - ->> --[[HenrikBrixAndersen]] - - The patch below fixes these two issues. --- graphviz.pm.orig Thu Jun 7 15:45:16 2007 diff --git a/doc/bugs/Problems_with_graphviz.pm_plug-in_previews.mdwn b/doc/bugs/Problems_with_graphviz.pm_plug-in_previews.mdwn new file mode 100644 index 000000000..c77bbeeaf --- /dev/null +++ b/doc/bugs/Problems_with_graphviz.pm_plug-in_previews.mdwn @@ -0,0 +1,54 @@ +(split from [[Problems_with_graphviz.pm_plug-in]]) + +[graphviz] generates image URLs relative to the page being rendered, which means the URLs wont work when previewing a graph from the CGI script. + +>> Here is an updated patch againt ikiwiki-2.5: + +>>> Applied, thanks. --[[Joey]] + + --- IkiWiki/Plugin/graphviz.pm.orig 2007-07-27 11:35:05.000000000 +0200 + +++ IkiWiki/Plugin/graphviz.pm 2007-07-27 11:36:02.000000000 +0200 + @@ -69,7 +69,12 @@ sub render_graph (\%) { + } + } + + - return "<img src=\"".urlto($dest, $params{page})."\" />\n"; + + if ($params{preview}) { + + return "<img src=\"".urlto($dest, "")."\" />\n"; + + } + + else { + + return "<img src=\"".urlto($dest, $params{page})."\" />\n"; + + } + } + + sub graph (@) { + + +>> --[[HenrikBrixAndersen]] + +>>> Despite this patch I am still experiencing the problem. Normal page source for a graph contains: + + <div id="content"> + <p><img src="./graph-c9fd2a197322feb417bdedbca5e99f5aa65b3f06.png" /></p> + + </div> + +>>> preview contains + + <div id="preview"> + <p><img src="./demo/diagrams/graph-c9fd2a197322feb417bdedbca5e99f5aa65b3f06.png" /></p> + + </div> + +>>> I don't quite understand why, this makes sense from the CGI path (in my +>>> case from the root of the site). The browsers appear to be trying to fetch +>>> `/demo/diagrams/demo/diagrams/graph-c9fd2a197322feb417bdedbca5e99f5aa65b3f06.png` +>>> (i.e., prepending the required relpath twice). -- [[Jon]] + +>>>> Yeah, that patch may have been right once, but it's wrong now; +>>>> preview mode uses `<base>` to make urls work the same as they would +>>>> when viewing the html page. +>>>> +>>>> Perhaps this was not noticed for a while while because it only +>>>> shows up if previewing an *unchanged* graph on a page that has already +>>>> been built before. Fixed now. [[done]] --[[Joey]] diff --git a/doc/bugs/SSI_include_stripped_from_mdwn.mdwn b/doc/bugs/SSI_include_stripped_from_mdwn.mdwn index 5519e45c6..270da86d3 100644 --- a/doc/bugs/SSI_include_stripped_from_mdwn.mdwn +++ b/doc/bugs/SSI_include_stripped_from_mdwn.mdwn @@ -10,7 +10,7 @@ If I have a <--#include virtual="foo" --> in some file, it gets stripped, > Anyway, it makes sense for the htmlscrubber to strip server-side > includes because otherwise your wiki could be attacked > by them being added to it. If you want to use both the htmlscrubber and -> SSI together, I'd suggest you modify the [[wikitemplates]] +> SSI together, I'd suggest you modify the [[templates]] > and put the SSI on there. 
> > Ie, `page.tmpl` has a diff --git a/doc/bugs/Search_summary_includes_text_from_navigational_elements.mdwn b/doc/bugs/Search_summary_includes_text_from_navigational_elements.mdwn new file mode 100644 index 000000000..b774c4531 --- /dev/null +++ b/doc/bugs/Search_summary_includes_text_from_navigational_elements.mdwn @@ -0,0 +1,22 @@ +Each listed result for a search will show some example text from the beginning of the linked page. It strips out HTML elements, but if there's any navigational text items, they will stay. + +For example, each search result on ikiwiki.info shows "(title) ikiwiki/ (title) Edit RecentChanges History Preferences Discussion" at the start of its results. + +A way to name some CSS ids that should be removed in search results within the ikiwiki setup file would work. Here's something similar that a friend proposed: + +http://leaf.dragonflybsd.org/mailarchive/users/2009-11/msg00077.html + +(bin attachment on that page is actually a .diff.) + +> So I was looking at this and I relized that while the search plugin used +> to use the format hook, and so there was no way to avoid it seeing all +> the gunk around the page body, it was changed a while ago for different +> reasons to use its own hook, postscan. So there's really no reason not +> to move postscan so it runs before said gunk is added to the page. +> (Aside from a small risk of breaking other third-party plugins that +> somehow use postscan.) +> +> I've implemented that in git, and it drops the navigation elements nicely. +> It's perhaps less general than allowing specific divs to be skipped from +> search, but it seems good enough. Please thank the dragonfly guys for their +> work on this. [[done]] --[[Joey]] diff --git a/doc/bugs/Site_title_not_clickable_while_adding_a_comment.mdwn b/doc/bugs/Site_title_not_clickable_while_adding_a_comment.mdwn new file mode 100644 index 000000000..1347be4b0 --- /dev/null +++ b/doc/bugs/Site_title_not_clickable_while_adding_a_comment.mdwn @@ -0,0 +1,9 @@ +When I add a comment to a page, its title should be a hyperlink. This would make re-opening it to re-read parts of it, either. + +I.e. when adding a comment to this page, the last part should be a hyperlink, as well: + + ikiwiki/ bugs/ creating Site title not clickable while adding a comment + + + +Richard diff --git a/doc/bugs/Tab_delimited_tables_don__39__t_work.mdwn b/doc/bugs/Tab_delimited_tables_don__39__t_work.mdwn new file mode 100644 index 000000000..39d57a4fe --- /dev/null +++ b/doc/bugs/Tab_delimited_tables_don__39__t_work.mdwn @@ -0,0 +1,22 @@ +Table directive should support tab-delimited data, especially important since this is the format you will get if copy/pasting from an HTML table or spreadsheet (Gnumeric, OO Calc, Excel). Test case which fails: + +[[!table format=dsv delimiter="\t" data=""" +1 2 +2 4 +"""]] + +> They do work, but C-style backslash escapes aren't recognised, +> so the syntax `delimiter="\t"` (as in your test case) looks +> for the literal string `\t`. Replacing `\t` with a literal +> tab character makes it work - here's a test (I changed the data +> to make the table layout more obvious): +> +> [[!table format=dsv delimiter=" " data=""" +left 2 +2 right +alpha beta +"""]] +> +> So, I think this can be considered [[not_a_bug|done]]? --[[smcv]] + +>> I've clarified the documentation. 
--[[smcv]] diff --git a/doc/bugs/UTF-16_and_UTF-32_are_unhandled.mdwn b/doc/bugs/UTF-16_and_UTF-32_are_unhandled.mdwn new file mode 100644 index 000000000..21df334a8 --- /dev/null +++ b/doc/bugs/UTF-16_and_UTF-32_are_unhandled.mdwn @@ -0,0 +1,20 @@ +Wide characters should probably be supported, or, at the very least, warned about. + +Test case: + + mkdir -p ikiwiki-utf-test/raw ikiwiki-utf-test/rendered + for page in txt mdwn; do + echo hello > ikiwiki-utf-test/raw/$page.$page + for text in 8 16 16BE 16LE 32 32BE 32LE; do + iconv -t UTF$text ikiwiki-utf-test/raw/$page.$page > ikiwiki-utf-test/raw/$page-utf$text.$page; + done + done + ikiwiki --verbose --plugin txt --plugin mdwn ikiwiki-utf-test/raw/ ikiwiki-utf-test/rendered/ + www-browser ikiwiki-utf-test/rendered/ || x-www-browser ikiwiki-utf-test/rendered/ + # rm -r ikiwiki-utf-test/ # some browsers rather stupidly daemonize themselves, so this operation can't easily be safely automated + +BOMless LE and BE input is probably a lost cause. + +Optimally, UTF-16 (which is ubiquitous in the Windows world) and UTF-32 should be fully supported, probably by converting to mostly-UTF-8 and using `&#xXXXX;` or `&#DDDDD;` XML escapes where necessary. + +Suboptimally, UTF-16 and UTF-32 should be converted to UTF-8 where cleanly possible and a warning printed where impossible. diff --git a/doc/bugs/__34__Currently_enabled_SSH_keys:__34___shows_only_first_139_characters_of_each_key.mdwn b/doc/bugs/__34__Currently_enabled_SSH_keys:__34___shows_only_first_139_characters_of_each_key.mdwn new file mode 100644 index 000000000..3c3352f66 --- /dev/null +++ b/doc/bugs/__34__Currently_enabled_SSH_keys:__34___shows_only_first_139_characters_of_each_key.mdwn @@ -0,0 +1,12 @@ +At least at http://free-thursday.pieni.net/ikiwiki.cgi the "SSH keys" page shows only the first 139 characters of each SSH key. I'm using iceweasel in 1024x768 resolution and there are not scrollbars visible. + +Please contact me at timo.lindfors@iki.fi + +> I have access to the same wiki, and do not see the problem Timo sees. I see 380 chars of the SSH keys, and do have a scrollbar. +> Weird. --liw + +> Also, that's a Branchable.com site and the bug, if any is +> in ikiwiki-hosting's plugin, not ikiwiki proper. Moved +> [here](http://ikiwiki-hosting.branchable.com/bugs/__34__Currently_enabled_SSH_keys:__34___shows_only_first_139_characters_of_each_key/) --[[Joey]] + +[[!tag done]] diff --git a/doc/bugs/__34__First_post__34___deletion_does_not_refresh_front_page.mdwn b/doc/bugs/__34__First_post__34___deletion_does_not_refresh_front_page.mdwn new file mode 100644 index 000000000..2367335a7 --- /dev/null +++ b/doc/bugs/__34__First_post__34___deletion_does_not_refresh_front_page.mdwn @@ -0,0 +1,6 @@ +When I created an ikiwiki site (on Branchable) using the blog template, it added a "First post", which was fine. +Deleting that post removed it, but the front page did not get the re-generated, so it was still there. +--[[liw]] + +> This is a bug involving the `page()` pagespec. Deleted +> pages matching this pagespec are not noticed. 
--[[Joey]] [[done]] diff --git a/doc/bugs/__96__wiki__95__file__95__chars__96___setting_not_propagated_to_CGI_wrapper.mdwn b/doc/bugs/__96__wiki__95__file__95__chars__96___setting_not_propagated_to_CGI_wrapper.mdwn new file mode 100644 index 000000000..f04b3404b --- /dev/null +++ b/doc/bugs/__96__wiki__95__file__95__chars__96___setting_not_propagated_to_CGI_wrapper.mdwn @@ -0,0 +1,28 @@ +I've set `wiki_file_chars` to a non-standard value in the setup file: + + wiki_file_chars => "-[:alnum:]+/.:_\x{1f310}\x{1f430}", + +(In case you're wondering, [this is the page](http://xn--9dbdkw.se/🌐/).) + +ikiwiki recognises my pages when I run it from the command line, but +when I edit something through the CGI "script", ikiwiki would suddenly +not recognise them. + +By running `strings` on the CGI wrapper I found that the option +`wiki_file_regexp` was still at its original setting. So as a workaround, +I added this to the setup file and everything worked: + + wiki_file_regexp => qr/(^[-[:alnum:]+\/.:_\x{1f310}\x{1f430}]+$)/, + +Maybe the CGI wrapper should specially call `checkconfig`, which is +the function responsible for updating `wiki_file_regexp`? + +--[[legoscia]] + +> You have to regrenerate the cgi wrapper after changing your setup file +> for the configuration changes to take effect. +> +> I tested it, setting `wiki_file_chars => "moocow"`, +> running ikiwiki -refresh -wrappers my.setup, and looking at strings: +> `'wiki_file_regexp' => qr/(?-xism:(^[moocow]+$))/` +> So, this appears to have been user error. [[done]] --[[Joey]] diff --git a/doc/bugs/absolute_sizes_in_default_CSS.mdwn b/doc/bugs/absolute_sizes_in_default_CSS.mdwn new file mode 100644 index 000000000..bb3c0c7a0 --- /dev/null +++ b/doc/bugs/absolute_sizes_in_default_CSS.mdwn @@ -0,0 +1,39 @@ +While toying around with some font sizes on my persona ikiwiki I discovered that some font sizes in the default CSS are fixed rather than relative. Here's a git patch that replaces them with relative font sizes (assuming the default 12pt/16px base font size recommended by the W3C): + +[[done]] --[[Joey]] + +<pre> +From 01c14db255bbb727d8dd1e72c3f6f2f25f07e757 Mon Sep 17 00:00:00 2001 +From: Giuseppe Bilotta <giuseppe.bilotta@gmail.com> +Date: Tue, 17 Aug 2010 00:48:24 +0200 +Subject: [PATCH] Use relative font-sizes + +--- + doc/style.css | 4 ++-- + 1 files changed, 2 insertions(+), 2 deletions(-) + +diff --git a/doc/style.css b/doc/style.css +index 66d962b..fa4b2a3 100644 +--- a/doc/style.css ++++ b/doc/style.css +@@ -14,7 +14,7 @@ nav { + + .header { + margin: 0; +- font-size: 22px; ++ font-size: 140%; + font-weight: bold; + line-height: 1em; + display: block; +@@ -22,7 +22,7 @@ nav { + + .inlineheader .author { + margin: 0; +- font-size: 18px; ++ font-size: 112%; + font-weight: bold; + display: block; + } +-- +1.7.2.rc0.231.gc73d +</pre> diff --git a/doc/bugs/aggregate_generates_long_filenames.mdwn b/doc/bugs/aggregate_generates_long_filenames.mdwn new file mode 100644 index 000000000..fae8333ab --- /dev/null +++ b/doc/bugs/aggregate_generates_long_filenames.mdwn @@ -0,0 +1,37 @@ +the [[plugins/aggregate]] plugin mashes the `title` of an aggregated post into a filename. This results in long filenames. I have hit a filesystem length limitation on several occasions. Some (ab)uses of RSS, e.g., twitter, +generate long titles. 
Especially once you throw escaping into the mix: + + $ ikiwiki --setup testsetup --aggregate --refresh + failed to write ./test/lifestream/Hidden_Features_Of_Perl__44___PHP__44___Javascript__44___C__44___C++__44___C__35____44___Java__44___Ruby___46____46____46__._aggregated.ikiwiki-new: File name too long + aggregation failed with code 9216 + $ echo $? + 25 + +It would also appear this abrubtly terminates aggregate processing (if not ikiwiki itself). Only after moving my test repo to `/tmp` to shorten the filename did I see newer RSS feeds (from a totally different source) picked up. + + +-- [[Jon]] + +> I have to wonder what filesystem you have there where 147 characters +> is a long filename. Ikiwiki already uses `POSIX::pathconf` on the srcdir +> to look up `_PC_NAME_MAX` +> to see if the filename is too long, and shortens it, so it seems +> that, in additional to having a rather antique long filename limit, your +> system also doesn't properly expose it via pathconf. Not sure what +> ikiwiki can do here. --[[Joey]] + +>> This is an ext4 filesystem with default settings (which appears to mean +>> 256 bytes for pathnames). Despite the error saying file name, it's +>> definitely a path issue since moving my test repo to `/tmp`from +>> `/home/jon/wd/mine/www` hides the problem. I note the following comment +>> in `aggregate.pm`: + + # Make sure that the file name isn't too long. + # NB: This doesn't check for path length limits. + +>> I don't fully grok the aggregate source yet, but I wouldn't rule out +>> a bug in the path length checking, personally. I'm happy to try and +>> find it myself though :) -- [[Jon]] + +>>> Path length seems unlikely, since the max is 4096 there. +>>> --[[Joey]] diff --git a/doc/bugs/align_doesn__39__t_always_work_with_img_plugin_.mdwn b/doc/bugs/align_doesn__39__t_always_work_with_img_plugin_.mdwn new file mode 100644 index 000000000..e986bdc82 --- /dev/null +++ b/doc/bugs/align_doesn__39__t_always_work_with_img_plugin_.mdwn @@ -0,0 +1,7 @@ +Using the img plugin to inline an image, the "align" parameter doesn't work as expected if you also include a "caption". + +As best as I can tell this is because the "caption" parameter works by wrapping the image inside a table which means that the "align" parameter is aligning within the table cell rather then the page itself. + +-- AdamShand + +> I agree, this is annoying... and [[done]]! --[[Joey]] diff --git a/doc/bugs/anonok_vs._httpauth.mdwn b/doc/bugs/anonok_vs._httpauth.mdwn new file mode 100644 index 000000000..bff37e18b --- /dev/null +++ b/doc/bugs/anonok_vs._httpauth.mdwn @@ -0,0 +1,118 @@ +I've got a wiki where editing requires [[plugins/httpauth]] (with +`cgiauthurl` working nicely). I now want to let the general public +edit Discussion subpages, so I enabled [[plugins/anonok]] and set +`anonok_pagespec` to `'*/Discussion'`, but HTTP auth is still being +required for those. + +(Actually, what I'll really want to do is probably [[plugins/lockedit]] +and a whitelist of OpenIDs in `locked_pages`...) + +--[[schmonz]] + +> The only way I can see to support this combination is for httpauth with +> cgiauthurl to work more like other actual login types. Which would mean +> that on editing a page that needs authentication, ikiwiki would redirect +> them to the Signin page, which would then have a link they could follow +> to bounce through the cgiauthurl and actually sign in. This would be +> significantly different than the regular httpauth process, in which the +> user signs in in passing. 
--[[Joey]] + +>> My primary userbase has grown accustomed to the seamlessness of +>> httpauth with SPNEGO, so I'd rather not reintroduce a seam into +>> their web-editing experience in order to let relatively few outsiders +>> edit relatively few pages. When is the decision made about whether +>> the current page can be edited by the current user (if any)? What +>> if there were a way to require particular auth plugins for particular +>> PageSpecs? --[[schmonz]] + +>>> The decision about whether a user can edit a page is made by plugins +>>> such as signinedit and lockedit, that also use canedit hooks to redirect +>>> the user to a signin page if necessary. +>>> +>>> A tweak on my earlier suggestion would be to have httpauth notice when the +>>> Signin page is being built and immediatly redirect to the cgiauthurl +>>> before the page can be shown to the user. This would, though, not play +>>> well with other authentication methods like openid, since the user +>>> would never see the Signin form. --[[Joey]] + +>>>> Would I be able to do what I want with a local plugin that +>>>> abuses canedit (and auth) to reach in and call the appropriate +>>>> plugin's auth method -- e.g., if the page matches */Discussion, +>>>> call `openid:auth()`, else `httpauth:auth()`? --[[schmonz]] + +>>>>> That seems it would be +>>>>> annoying for httpauth users (who were not currently authed), +>>>>> as they would then see the openid signin form when going to edit a +>>>>> Discussion page. +>>>>> --[[Joey]] + +>>>>>> I finally see the problem, I think. When you initially +>>>>>> suggested "a link they could follow to bounce through the +>>>>>> cgiauthurl", presumably this could _be_ the Edit link for +>>>>>> non-Discussion pages, so that the typical case of an httpauth +>>>>>> user editing an editable-only-by-httpauth page doesn't visibly +>>>>>> change. And then the Edit link for Discussion subpages could do +>>>>>> as you suggest, adding one click for the httpauth user, who won't +>>>>>> often need to edit those subpages. --[[schmonz]] + +>> On reflection, I've stopped being bothered by the +>> redirect-to-signin-page approach. (It only needs to happen once per +>> browser session, anyway.) Can we try that? --[[schmonz]] + +Here is an attempt. With this httpauth will only redirect to the +`cgiauth_url` when a page is edited, and it will defer to other plugins +like anonok first. I have not tested this. --[[Joey]] + +<pre> +diff --git a/IkiWiki/Plugin/httpauth.pm b/IkiWiki/Plugin/httpauth.pm +index 127c321..a18f8ca 100644 +--- a/IkiWiki/Plugin/httpauth.pm ++++ b/IkiWiki/Plugin/httpauth.pm +@@ -9,6 +9,8 @@ use IkiWiki 3.00; + sub import { + hook(type => "getsetup", id => "httpauth", call => \&getsetup); + hook(type => "auth", id => "httpauth", call => \&auth); ++ hook(type => "canedit", id => "httpauth", call => \&canedit, ++ last => 1); + } + + sub getsetup () { +@@ -33,9 +35,21 @@ sub auth ($$) { + if (defined $cgi->remote_user()) { + $session->param("name", $cgi->remote_user()); + } +- elsif (defined $config{cgiauthurl}) { +- IkiWiki::redirect($cgi, $config{cgiauthurl}.'?'.$cgi->query_string()); +- exit; ++} ++ ++sub canedit ($$$) { ++ my $page=shift; ++ my $cgi=shift; ++ my $session=shift; ++ ++ if (! defined $cgi->remote_user() && defined $config{cgiauthurl}) { ++ return sub { ++ IkiWiki::redirect($cgi, $config{cgiauthurl}.'?'.$cgi->query_string()); ++ exit; ++ }; ++ } ++ else { ++ return undef; + } + } + +</pre> + +> With `anonok` enabled, this works for anonymous editing of an +> existing Discussion page. 
auth is still needed to create one. --[[schmonz]] + +>> Refreshed above patch to fix that. --[[Joey]] + +>> Remaining issue: This patch will work with anonok, but not openid or +>> passwordauth, both of which want to display a login page at the same +>> time that httpauth is redirecting to the cgiauthurl. As mentioned above, +>> the only way to deal with that would be to add a link to the signin page +>> that does the httpauth signin. --[[Joey]] + +>>> That's dealt with in final version. [[done]] --[[Joey]] diff --git a/doc/bugs/argument_isn__39__t_numeric:_mixing_templates_and_creation__95__date.mdwn b/doc/bugs/argument_isn__39__t_numeric:_mixing_templates_and_creation__95__date.mdwn new file mode 100644 index 000000000..ff98ba55f --- /dev/null +++ b/doc/bugs/argument_isn__39__t_numeric:_mixing_templates_and_creation__95__date.mdwn @@ -0,0 +1,62 @@ +I get the following error when building my wiki + + Argument "\x{3c}\x{54}..." isn't numeric in numeric eq (==) at /usr/share/perl5/IkiWiki.pm line 2547. + Argument "\x{3c}\x{54}..." isn't numeric in numeric eq (==) at /usr/share/perl5/IkiWiki.pm line 2547. + +that line corresponds to + + sub match_creation_year ($$;@) { + if ((localtime($IkiWiki::pagectime{shift()}))[5] + 1900 == shift) { <-- this one + return IkiWiki::SuccessReason->new('creation_year matched'); + } + +A git bisect shows that the offending commit introduced this hunk + + + --- /dev/null + +++ b/templates/all_entry.mdwn + @@ -0,0 +1,23 @@ + +## <TMPL_VAR year> + + + +There + +<TMPL_IF current> + +have been + +<TMPL_ELSE> + +were + +</TMPL_IF> + +[[!pagecount pages=" + +log/* and !tagged(aggregation) and !*/Discussion and !tagged(draft) + +and creation_year(<TMPL_VAR year>) + +and !*.png and !*.jpg + +"]] posts + +<TMPL_IF current> + +so far + +</TMPL_IF> + +in <TMPL_VAR year>. + + + +[[!inline pages=" + + log/* and !tagged(aggregation) and !*/Discussion and !tagged(draft) + + and creation_year(<TMPL_VAR year>) + + and !*.png and !*.jpg + + " archive=yes feeds=no]] + +The lines which feature creation_year(<TMPL_VAR year>) are most likely the culprits. That would explain why the error was repeated twice, and would tally with the file in `templates/` being rendered, rather than the inclusionists. + +A workaround is to move the template outside of the srcdir into the external templates directory and include the file suffix when using it, e.g. + + \[[!template id=all_entry.tmpl year=2010 current=true]] + +I believed (until I tested) that the [[ikiwiki/directive/if]] directive, with the `included()` test, would be an option here, E.g. + + \[[!if test="included()" then=""" + ...template... + """ else=""" + Nothing to see here. + """]] + +However this doesn't work. I assume "included" in this context means e.g. via an `inline` or `map`, not template trans-clusion. -- [[Jon]] + +> As far as I know, this bug was fixed in +> 4a75dee651390b79ce4ceb1d951b02e28b3ce83a on October 20th. [[done]] --[[Joey]] + +>> Sorry Joey, I'll make sure to reproduce stuff against master in future. [[Jon]] diff --git a/doc/bugs/attachment_upload_does_not_work_for_windows_clients.mdwn b/doc/bugs/attachment_upload_does_not_work_for_windows_clients.mdwn new file mode 100644 index 000000000..4e8c7bdcf --- /dev/null +++ b/doc/bugs/attachment_upload_does_not_work_for_windows_clients.mdwn @@ -0,0 +1,34 @@ +It seems as if windows clients (IE) submit filenames with backslash as directory separator. +(no surprise :-). 
+ +But the attachment plugin translates these backslashes to underscore, making the +whole path a filename. + +> As far as I can see, that just means that the file will be saved with +> a filename something like `c:__92__My_Documents__92__somefile`. +> I don't see any "does not work" here. Error message? +> +> Still, I don't mind adding a special case, though obviously not in +> `basename`. [[done]] --[[Joey]] + +>> Well, it's probably something else also, I get **bad attachment filename**. +>> Now, that could really be a bad filename, problem is that it wasn't. I even +>> tried applying the **wiki_file_prune_regexps** one by one to see what was +>> causing it. No problem there. The strange thing is that the error shows up +>> when using firefox on windows too. But the backslash hack fixes at least the +>> incorrect filename from IE (firefox on windows gave me the correct filename. +>> I'll do some more digging... :-) /jh + +This little hack fixed the backslash problem, although I wonder if that +really is the problem? +(Everything works perfectly from linux clients of course. :-) + + sub basename ($) { + my $file=shift; + + $file=~s!.*/+!!; + $file=~s!.*\\+!!; + return $file; + } + +Should probably be `$file=~s!.*[/\\]+!!` :-) diff --git a/doc/bugs/barfs_on_recentchange_entry_for_a_change_removing_an_invalid_pagespec.mdwn b/doc/bugs/barfs_on_recentchange_entry_for_a_change_removing_an_invalid_pagespec.mdwn index 42e6b9e27..c3cbff43e 100644 --- a/doc/bugs/barfs_on_recentchange_entry_for_a_change_removing_an_invalid_pagespec.mdwn +++ b/doc/bugs/barfs_on_recentchange_entry_for_a_change_removing_an_invalid_pagespec.mdwn @@ -39,3 +39,6 @@ a year ago in September 2007. > ikiwiki. (Doesn't quite seem to be version 2.53.x either) Try with a current > version, and see if you can send me a source tree that can reproduce the > problem? --[[Joey]] + +Did not hear back, so calling this [[done]], unless I hear differently. +--[[Joey]] diff --git a/doc/bugs/bestlink_change_update_issue.mdwn b/doc/bugs/bestlink_change_update_issue.mdwn index 8a526e821..c26e40d10 100644 --- a/doc/bugs/bestlink_change_update_issue.mdwn +++ b/doc/bugs/bestlink_change_update_issue.mdwn @@ -23,7 +23,10 @@ Keeping a copy of the backlinks has some merit. It could also be incrementally updated. + This old bug still exists as of 031d1bf5046ab77c796477a19967e7c0c512c417. + * And if Foo/Bar/Baz is then removed, Foo/Bar gets a broken link, instead of changing back to linking to Foo/Baz. -This old bug still exists as of 031d1bf5046ab77c796477a19967e7c0c512c417. + This part was finally fixed by commit + f1ddf4bd98821a597d8fa1532092f09d3d9b5483. diff --git a/doc/bugs/bestlink_returns_deleted_pages.mdwn b/doc/bugs/bestlink_returns_deleted_pages.mdwn new file mode 100644 index 000000000..874f18ead --- /dev/null +++ b/doc/bugs/bestlink_returns_deleted_pages.mdwn @@ -0,0 +1,75 @@ +To reproduce: + +1. Add the backlinkbug plugin below to ikiwiki. +2. Create a page named test.mdwn somewhere in the wiki. +3. Refresh ikiwiki in verbose mode. Pages whose bestlink is the test.mwdn page will be printed to the terminal. +4. Delete test.mdwn. +5. Refresh ikiwiki in verbose mode again. The same pages will be printed to the terminal again. +6. Refresh ikiwiki in verbose mode another time. Now no pages will be printed. + +bestlink() checks %links (and %pagecase) to confirm the existance of the page. +However, find_del_files() does not remove the deleted page from %links (and %pagecase). 
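+
+To make the suggestion in the next paragraph concrete, here is a minimal,
+hypothetical helper (not actual IkiWiki.pm code, and the name is made up)
+showing the kind of existence test that would not be fooled by a stale
+%links entry:
+
+    # hypothetical helper, for illustration only
+    sub page_still_exists ($) {
+        my $page=shift;
+        # %pagesources is kept up to date by find_del_files(),
+        # unlike %links, so a freshly deleted page will not match here
+        return exists $IkiWiki::pagesources{$page};
+    }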
+ +Since find_del_files removes the deleted page from %pagesources and %destsources, +won't it make sense for bestlink() to check %pagesources first? --[[harishcm]] + +> This same problem turned out to also be the root of half of ikiwiki's +> second-oldest bug, [[bestlink_change_update_issue]]. +> +> Fixing it is really a bit involved, see commit +> f1ddf4bd98821a597d8fa1532092f09d3d9b5483. The fix I committed fixes +> bestlink to not return deleted pages, but only *after* the needsbuild and +> scan hooks are called. So I was able to fix it for every case except the +> one you gave! Sorry for that. To fix it during beedsbuild and scan, +> a much more involved approach would be needed. AFAICS, no existing plugin +> in ikiwiki uses bestlink in needsbuild or scan though. +> +> If the other half of [[bestlink_change_update_issue]] is fixed, +> maybe by keeping a copy of the old backlinks info, then that fix could be +> applied here too. --[[Joey]] + +>> Cool that was fast! Well at least half the bug is solved :) For now I'll +>> probably try using a workaround if using bestlink within the needsbuild +>> or scan hooks. Maybe by testing if pagemtime equals zero. --[[harishcm]] + +>>> Yeah, and bestlink could also do that. However, it feels nasty to have +>>> it need to look at pagemtime. --[[Joey]] + +---- + + #!/usr/bin/perl + # Plugin to reproduce bestlink returning deleted pages. + # Run with ikiwiki in verbose mode. + + package IkiWiki::Plugin::bestlinkbug; + + use warnings; + use strict; + use IkiWiki 3.00; + + sub import { + hook(type => "getsetup", id => "bestlinkbug", call => \&getsetup); + hook(type => "needsbuild", id => "bestlinkbug", call => \&needsbuild); + } + + sub getsetup () { + return + plugin => { + safe => 1, + rebuild => 0, + }, + } + + sub needsbuild (@) { + my $needsbuild=shift; + + foreach my $page (keys %pagestate) { + my $testpage=bestlink($page, "test") || next; + + debug("$page"); + } + } + + 1 + + diff --git a/doc/bugs/blog_spam_plugin_not_allowing_non-ASCII_chars__63__.mdwn b/doc/bugs/blog_spam_plugin_not_allowing_non-ASCII_chars__63__.mdwn new file mode 100644 index 000000000..59bf93d14 --- /dev/null +++ b/doc/bugs/blog_spam_plugin_not_allowing_non-ASCII_chars__63__.mdwn @@ -0,0 +1,15 @@ +Hi, + +I'm trying to add a comment, and ikiwiki fails with this error message: + + Error: HTTP::Message content must be bytes at /usr/share/perl5/RPC/XML/Client.pm line 308 + +This seems to happen because I had a non-ASCII character in the comment (an ellipse, …). +The interesting part is that the comment preview works fine, just the save fails. Probably +this means that the blogspam plugin is the culprit (hence the error in RPC::XML::Client library). +I'm using version 3.20100815~bpo50+. Thanks! + +> I've filed an upstream bug about this on RPC::XML: +> <https://rt.cpan.org/Ticket/Display.html?id=61333> +> +> Worked around it in blogspam by decoding. [[done]] --[[Joey]] diff --git a/doc/bugs/both_inline_and_comment_create_elements_id__61__feedlink.mdwn b/doc/bugs/both_inline_and_comment_create_elements_id__61__feedlink.mdwn new file mode 100644 index 000000000..170f3810e --- /dev/null +++ b/doc/bugs/both_inline_and_comment_create_elements_id__61__feedlink.mdwn @@ -0,0 +1,15 @@ +The [[plugins/inline]] and [[plugins/comments]] plugins both generate feed links. + +In both cases, the generated markup include an element with `id="feedlink"`. 
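+
+Concretely, a page that carries both feeds ends up with markup along these
+lines (illustrative only; the exact elements and link text come from the
+templates, not from this sketch):
+
+    <!-- emitted for the page's own inline feed -->
+    <div id="feedlink">RSS Atom ...</div>
+    ...
+    <!-- emitted for the comments feed -->
+    <div id="feedlink">Comments: RSS Atom ...</div>
+
+Two elements sharing one id value is exactly what the specifications quoted
+below forbid.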
+ +[XHTML 1.0 Strict](http://www.w3.org/TR/xhtml1/#h-4.10) (Ikiwiki's default output type) forbids multiple elements with the same ID: + +> In XML, fragment identifiers are of type ID, and there can only be a single attribute of type ID per element. Therefore, in XHTML 1.0 the id attribute is defined to be of type ID. In order to ensure that XHTML 1.0 documents are well-structured XML documents, XHTML 1.0 documents MUST use the id attribute when defining fragment identifiers on the elements listed above. See the HTML Compatibility Guidelines for information on ensuring such anchors are backward compatible when serving XHTML documents as media type text/html. + +As does [W3C's HTML5](http://www.w3.org/TR/html5/elements.html#the-id-attribute). + +Any page with both a comments feed and an inline feed will be invalid XHTML 1.0 Strict or HTML 5. + +-- [[Jon]] + +> [[news/version_3.2011012]] suggests this is fixed for `inline`, at least, I will test to see if it is cleared up for comments too. -- [[Jon]] diff --git a/doc/bugs/broken_parentlinks.mdwn b/doc/bugs/broken_parentlinks.mdwn index caf1eeb0e..556d89b65 100644 --- a/doc/bugs/broken_parentlinks.mdwn +++ b/doc/bugs/broken_parentlinks.mdwn @@ -10,7 +10,7 @@ a dead link for every subpage. This is a bug, but fixing it is very tricky. Consider what would happen if example.mdwn were created: example/page.html and the rest of example/* -would need to be updated to change the parentlink from a bare work to a +would need to be updated to change the parentlink from a bare word to a link to the new page. Now if example.mdwn were removed again, they'd need to be updated again. So example/* depends on example. But it's even more tricky, because if example.mdwn is modified, we _don't_ want to rebuild @@ -19,6 +19,10 @@ example/*! ikiwiki doesn't have a way to represent this dependency and can't get one without a lot of new complex code being added. +> Note that this code has now been added. In new terms, example/* has a +> presence dependency on example. So this bug is theoretically fixable now. +> --[[Joey]] + For now the best thing to do is to make sure that you always create example if you create example/foo. Which is probably a good idea anyway.. @@ -27,3 +31,20 @@ example if you create example/foo. Which is probably a good idea anyway.. Note that this bug does not exist if the wiki is built with the "usedirs" option, since in that case, the parent link will link to a subdirectory, that will just be missing the index.html file, but still nicely usable. +--[[Joey]] + +---- + +<http://www.gnu.org/software/hurd/hurd/translator/writing.html> does not exist. +Then, on +<http://www.gnu.org/software/hurd/hurd/translator/writing/example.html>, in the +*parentlinks* line, *writing* links to the top-level *index* file. It should +rather not link anywhere at all. --[[tschwinge]] + +> So, the bug has changed behavior a bit. Rather than a broken link, we get +> a link to the toplevel page. This, FWIW, is because the template now +> uses this for each parentlink: + + <a href="<TMPL_VAR URL>"><TMPL_VAR PAGE></a>/ + +> Best workaround is still to enable usedirs. 
--[[Joey]] diff --git a/doc/bugs/build_fails_oddly_when_older_ikiwiki_is_installed.mdwn b/doc/bugs/build_fails_oddly_when_older_ikiwiki_is_installed.mdwn new file mode 100644 index 000000000..7b252031b --- /dev/null +++ b/doc/bugs/build_fails_oddly_when_older_ikiwiki_is_installed.mdwn @@ -0,0 +1,31 @@ +I got this failure when trying to build ikiwiki version 3.20100403: + + $ perl Makefile.PL INSTALL_BASE=/opt/ikiwiki PREFIX= + Writing Makefile for IkiWiki + $ make + +*...snip...* + + ./pm_filter /opt/ikiwiki 3.20100403 /opt/ikiwiki/lib/perl5 < ikiwiki.in > ikiwiki.out + chmod +x ikiwiki.out + ./pm_filter /opt/ikiwiki 3.20100403 /opt/ikiwiki/lib/perl5 < ikiwiki-transition.in > ikiwiki-transition.out + chmod +x ikiwiki-transition.out + ./pm_filter /opt/ikiwiki 3.20100403 /opt/ikiwiki/lib/perl5 < ikiwiki-calendar.in > ikiwiki-calendar.out + chmod +x ikiwiki-calendar.out + HOME=/home/me /usr/bin/perl -Iblib/lib ikiwiki.out -libdir . -dumpsetup ikiwiki.setup + Use of uninitialized value $IkiWiki::Setup::config{"setuptype"} in concatenation (.) or string at IkiWiki/Setup.pm line 53. + Can't locate IkiWiki/Setup/.pm in @INC (@INC contains: . /opt/ikiwiki/lib/perl5/i486-linux-gnu-thread-multi /opt/ikiwiki/lib/perl5 blib/lib /etc/perl /usr/local/lib/perl/5.10.1 /usr/local/share/perl/5.10.1 /usr/lib/perl5 /usr/share/perl5 /usr/lib/perl/5.10 /usr/share/perl/5.10 /usr/local/lib/site_perl .) at (eval 35) line 3. + + make: *** [ikiwiki.setup] Error 2 + +Note that I had been trying to upgrade with an installed ikiwiki 3.20091114 +already in place under /opt/ikiwiki. The build does not fail for me +if I first remove the old ikiwiki installation, nor does it fail with +3.20100403 or newer installed at /opt/ikiwiki. Hence this is not +really a critical bug, although it's somewhat perplexing to me why it +ought to make a difference. + +> So, using INSTALL_BASE causes a 'use lib' to be hardcoded into the `.out` +> files; which overrides the -libdir and the -I, and so the old version +> of IkiWiki.pm is used. +> [[fixed|done]] --[[Joey]] diff --git a/doc/bugs/bzr_2.0_breaks_bzr_plugin.mdwn b/doc/bugs/bzr_2.0_breaks_bzr_plugin.mdwn new file mode 100644 index 000000000..39500af20 --- /dev/null +++ b/doc/bugs/bzr_2.0_breaks_bzr_plugin.mdwn @@ -0,0 +1,87 @@ +Version 2.0 of bzr seems to break the bzr plugin. + +I traced this to the bzr_log method in the plugin, and patching that seems to fix it. The plugin just needs to parse the input little bit differently. +--liw + +> Patch applied, [[done]] (but, it would be good if it could be tested with +> an older bzr, and it's a pity bzr's human-targeted log has to be parsed, +> I assume there is no machine-targeted version?) --[[Joey]] + + From fb897114124e627fd3acf5af8e784c9a77419a81 Mon Sep 17 00:00:00 2001 + From: Lars Wirzenius <liw@liw.fi> + Date: Sun, 4 Apr 2010 21:05:07 +1200 + Subject: [PATCH] Fix bzr plugin to work with bzr 2.0. + + The output of "bzr log" seems to have changed a bit, so we change the + parsing accordingly. This has not been tested with earlier versions of + bzr. + + Several problems seemed to occur, all in the bzr_log subroutine: + + 1. The @infos list would contain an empty hash, which would confuse the + rest of the program. + 2. This was because bzr_log would push an empty anonymous hash to the + list whenever it thought a new record would start. + 3. However, a new record marker (now?) also happens at th end of bzr log + output. + 4. Now we collect the record to a hash that gets pushed to the list only + if it is not empty. + 5. 
Also, sometimes bzr log outputs "revno: 1234 [merge]", so we catch only + the revision number. + 6. Finally, there may be non-headers at the of the output, so we ignore + those. + --- + IkiWiki/Plugin/bzr.pm | 23 ++++++++++++++++------- + 1 files changed, 16 insertions(+), 7 deletions(-) + + diff --git a/IkiWiki/Plugin/bzr.pm b/IkiWiki/Plugin/bzr.pm + index 1ffdc23..e813331 100644 + --- a/IkiWiki/Plugin/bzr.pm + +++ b/IkiWiki/Plugin/bzr.pm + @@ -73,28 +73,37 @@ sub bzr_log ($) { + my @infos = (); + my $key = undef; + + + my $hash = {}; + while (<$out>) { + my $line = $_; + my ($value); + if ($line =~ /^message:/) { + $key = "message"; + - $infos[$#infos]{$key} = ""; + + $$hash{$key} = ""; + } + elsif ($line =~ /^(modified|added|renamed|renamed and modified|removed):/) { + $key = "files"; + - unless (defined($infos[$#infos]{$key})) { $infos[$#infos]{$key} = ""; } + + unless (defined($$hash{$key})) { $$hash{$key} = ""; } + } + elsif (defined($key) and $line =~ /^ (.*)/) { + - $infos[$#infos]{$key} .= "$1\n"; + + $$hash{$key} .= "$1\n"; + } + elsif ($line eq "------------------------------------------------------------\n") { + + if (keys %$hash) { + + push (@infos, $hash); + + } + + $hash = {}; + $key = undef; + - push (@infos, {}); + } + - else { + + elsif ($line =~ /: /) { + chomp $line; + - ($key, $value) = split /: +/, $line, 2; + - $infos[$#infos]{$key} = $value; + + if ($line =~ /^revno: (\d+)/) { + + $key = "revno"; + + $value = $1; + + } else { + + ($key, $value) = split /: +/, $line, 2; + + } + + $$hash{$key} = $value; + } + } + close $out; + -- + 1.7.0 diff --git a/doc/bugs/can__39__t_mix_template_vars_inside_directives.mdwn b/doc/bugs/can__39__t_mix_template_vars_inside_directives.mdwn new file mode 100644 index 000000000..e91a8923d --- /dev/null +++ b/doc/bugs/can__39__t_mix_template_vars_inside_directives.mdwn @@ -0,0 +1,61 @@ +I often find myself wrapping the same boiler plate around [[ikiwiki/directives/img]] img directives, so I tried to encapsulate it using the following [[ikiwiki/directives/template]]: + + + <div class="image"> + [\[!img <TMPL_VAR raw_href> + size="<TMPL_VAR raw_size>" + + <TMPL_IF alt> + alt="<TMPL_VAR raw_alt>" + <TMPL_ELSE> + <TMPL_IF caption> + alt="<TMPL_VAR raw_alt>" + <TMPL_ELSE> + alt="[pic]" + </TMPL_IF> + </TMPL_IF> + + ]] + <TMPL_IF caption> + <p><TMPL_VAR raw_caption></p> + </TMPL_IF> + </div> + +The result, even with htmlscrubber disabled, is mangled, something like + + <div class="image"> + <span class="createlink"><a href="http://jmtd.net/cgi? + page=size&from=log0.000000old_new_test&do=create" + rel="nofollow">?</a>size</span> + + </div> + +Any suggestions gladly received. -- [[Jon]] + +> Well, you *should* be able to do things like this, and in my testing, I +> *can*. I used your exact example above (removing the backslash escape) +> and invoked it as: +> \[[!template id=test href=himom.png size=100x]] +> +> And got just what you would expect. +> +> I don't know what went wrong for you, but I don't see a bug here. +> My guess, at the moment, is that you didn't specify the required href +> and size parameters when using the template. If I leave those off, +> I of course reproduce what you reported, since the img directive gets +> called with no filename, and so assumes the size parameter is the image +> to display.. [[done]]? --[[Joey]] + +>> Hmm, eek. Just double-checked, and done a full rebuild. No dice! Version 3.20100831. Feel free to leave this marked done, It probably *is* PEBKAC. I shall look again in day time. 
-- [[Jon]] + +>>> As always, if you'd like to mail me a larger test case that reproduces a +>>> problem for you, I can take a look at it. --[[Joey]] + +>>>> <s>Thank you for the offer. I might still take you up on it. I've just proven that this +>>>> does work for a clean repo / bare bones test case. -- [[Jon]]</s> Figured it out. The +>>>> problem was I'd copied a page (old_new) which had two images embedded in it to test. +>>>> I'd stored the images under a subdir "old_new". The new page was called "old_new_test" +>>>> and the images thus could not be found by a pagespec "some-image.jpg". Adjusting the +>>>> href argument to the template (consequently the src argument to img) to +>>>> "old_new/some-image.jpg" fixed it all. [[done]], PEBKAC. Thank you for your time :) +>>>> -- [[Jon]] diff --git a/doc/bugs/class_parameter_of_img_directive_behave_not_as_documented.mdwn b/doc/bugs/class_parameter_of_img_directive_behave_not_as_documented.mdwn new file mode 100644 index 000000000..e7797765f --- /dev/null +++ b/doc/bugs/class_parameter_of_img_directive_behave_not_as_documented.mdwn @@ -0,0 +1,31 @@ +On [[ikiwiki/directive/img/]] I read that + +> You can also pass alt, title, class, align, id, hspace, and vspace +> parameters. These are passed through unchanged to the html img tag. + +but when I pass `class="myclass"` to an img directive, I obtain + + <img class="myclass img" ... + +I found that this behaviour was added in commit f6db10d: + +> img: Add a margin around images displayed by this directive. +> +> Particularly important for floating images, which could before be placed +> uncomfortably close to text. + +which adds to img.pm: + + if (exists $params{class}) { + $params{class}.=" img"; + } + else { + $params{class}="img"; + } + +I would prefer if the `img` class were only added if no class attribute is +passed. + +If you keep the current behaviour, please document it. + +> [[done]] --[[Joey]] diff --git a/doc/bugs/clearenv_not_present_at_FreeBSD_.mdwn b/doc/bugs/clearenv_not_present_at_FreeBSD_.mdwn new file mode 100644 index 000000000..f38c86e03 --- /dev/null +++ b/doc/bugs/clearenv_not_present_at_FreeBSD_.mdwn @@ -0,0 +1,5 @@ +When build wrapper on FreeBSD system, is error occured with clearenv reference. clearenv() das not exists at FreeBSD system, use workaround environ[0]=NULL; +P.S. new git instalation, FreeBSD 7.x + +> `#include <stupid-standards.h>` fixed with nasty ifdefs to handle tcc w/o +> breaking everything else. [[done]] --[[Joey]] diff --git a/doc/bugs/clearenv_not_present_at_FreeBSD_/discussion.mdwn b/doc/bugs/clearenv_not_present_at_FreeBSD_/discussion.mdwn new file mode 100644 index 000000000..713198b61 --- /dev/null +++ b/doc/bugs/clearenv_not_present_at_FreeBSD_/discussion.mdwn @@ -0,0 +1 @@ +Mmmm... i see. But it not setup under FreeBSD without magic manual passes. diff --git a/doc/bugs/comments_appear_two_times.mdwn b/doc/bugs/comments_appear_two_times.mdwn new file mode 100644 index 000000000..2ae081844 --- /dev/null +++ b/doc/bugs/comments_appear_two_times.mdwn @@ -0,0 +1,24 @@ +When a comment is added to page named "directory/page" it also appears in the page "directory". + +This seems to happen at least with versions 3.20100815.6 and 3.20110225. Id didn't happen in version from about a year ago. I created a testing ikiwiki installation demonstrating this bug. The same comment can be seen at <http://rtime.felk.cvut.cz/~sojka/blog/posts/directory/post/> and at <http://rtime.felk.cvut.cz/~sojka/blog/posts/directory/>. 
The corresponding git repo can be cloned by + + git clone git://rtime.felk.cvut.cz/~sojka/blog.git + +> Unfortunatly, that git repo seems to be empty. +> Perhaps you forgot to push to it? Thank you for working +> to provide a way to reproduce this! +> +> Myself, I cannot reproduce it. Eg, my blog has all posts +> under <http://kitenet.net/~joey/blog/entry/>, but that page +> shows none of the comments to my blog posts. And here on ikiwiki.info, +> posts on the forum have comments, but they don't show up as comments +> to the [[forum]] page. +> --[[Joey]] + +>> The repo can be cloned now. There was a problem with permissions. --[[wentasah]] + +>>> I see the bug now. Probably most configs hide it by setting +>>> `comments_pagespec` more tightly. It was introduced by +>>> d9d910f6765de6ba07508ab56a5a0f93edb4c8ad, and/or later +>>> changes to actually use the `comments()` PageSpec. +>>> Fixed in git! [[done]] --[[Joey]] diff --git a/doc/bugs/comments_not_searchable.mdwn b/doc/bugs/comments_not_searchable.mdwn new file mode 100644 index 000000000..6fda89bd2 --- /dev/null +++ b/doc/bugs/comments_not_searchable.mdwn @@ -0,0 +1,19 @@ +The text of comments (and other internal pages) does not get indexed by the +search plugin. + +Search indexes content passed to the postscan hook. +Comments are inlined, but inline's speed hack avoids adding inlined +content to the page until the format hook. + +And hmm, that's somewhat desirable, because we don't want searches +to find content that is inlined onto another page. + +That suggests that the fix could be to call the postscan hook +for internal pages. + +However, the search postscan hook tells xapian the page url, +and uses `urlto($page)` to do it. And that won't work for +an internal page. Guess it could be modified to tell xapian the +permalink. --[[Joey]] + +> [[done]] --[[Joey]] diff --git a/doc/bugs/comments_preview_unsafe_with_allowdirectives.mdwn b/doc/bugs/comments_preview_unsafe_with_allowdirectives.mdwn new file mode 100644 index 000000000..7f9fb67e9 --- /dev/null +++ b/doc/bugs/comments_preview_unsafe_with_allowdirectives.mdwn @@ -0,0 +1,8 @@ +If `comments_allowdirectives` is set, previewing a comment can run +directives that create files. (Eg, img.) Unlike editpage, it does not +keep track of those files and expire them. So the files will linger in +destdir forever. + +Probably when the user then tries to save the comment, ikiwiki will refuse +to overwrite the unknown file, and will crash. +--[[Joey]] diff --git a/doc/bugs/conflicts.mdwn b/doc/bugs/conflicts.mdwn new file mode 100644 index 000000000..bef0f54cd --- /dev/null +++ b/doc/bugs/conflicts.mdwn @@ -0,0 +1,32 @@ +The `conflicts` testcase has 4 failing test cases. The underlaying problem +is that there are multiple possible source files that can create the same +destination files. + +1. `foo.mdwn` is in srcdir, rendered to destdir. Then + it is removed, and `foo` is added, which will be rendered + raw to destdir. Since the `foo/` directory still exists, + it fails. +1. `foo` is added to srcdir, rendered raw to destdir. + Then it is removed from srcdir, and `foo.mdwn` is added. + The `foo` file is still present in the destdir, and mkdir + of the directory `foo/` fails. +1. `foo.mdwn` renders to `foo/index.html`. Then `foo/index.html` + is added to the srcdir, using rawhtml. It renders to the same + thing. +1. `foo/index.html` in srcdir is rendered to same thing in destdir + using rawhtml. Then `foo.mdwn` is added; renders same thing. 
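+
+A rough shell transcript of the first case above (directory names are made
+up, and the default usedirs setting is assumed):
+
+    $ echo hi > srcdir/foo.mdwn
+    $ ikiwiki srcdir destdir            # renders destdir/foo/index.html
+    $ rm srcdir/foo.mdwn
+    $ echo hi > srcdir/foo              # unknown extension, copied raw
+    $ ikiwiki --refresh srcdir destdir  # fails: destdir/foo/ is still a directory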
+ +Note that another case, that of page `foo.mdwn` and page `foo.txt`, that +both render to `foo/index.html`, used to cause problems, but no longer +crashes ikiwiki. It now only complains in this situation, and which +file "wins" is undefined. The fix for this relied on both pages being +named `foo`; but in the above cases, the source files have different +pagenames. + +One approach: Beef up checking in `will_render` to detect when the same +destination file is rendered by multiple pages. Or when one page renders +a file that is a parent directory of the rendered file of another page. +It could warn, rather than erroring. The last page rendered would "win"; +generating the destdir file. + +[[done]] diff --git a/doc/bugs/creating_page_from_comment_creates_a_comment.mdwn b/doc/bugs/creating_page_from_comment_creates_a_comment.mdwn new file mode 100644 index 000000000..0eff756de --- /dev/null +++ b/doc/bugs/creating_page_from_comment_creates_a_comment.mdwn @@ -0,0 +1,9 @@ +If a comment contains a WikiLink, for a page that doesn't exist, and the +user clicks on the edit link, and creates the page, it will itself be saved +as a comment, with "._comment" extension. + +This is very surprising and wrong behavior. The page editor tries to +preserve the linking page's format type, but it shouldn't do so if the page +is an internal page. --[[Joey]] + +[[done]] --[[Joey]] diff --git a/doc/bugs/cutpaste.pm:_missing_filter_call.mdwn b/doc/bugs/cutpaste.pm:_missing_filter_call.mdwn new file mode 100644 index 000000000..4b22fd06c --- /dev/null +++ b/doc/bugs/cutpaste.pm:_missing_filter_call.mdwn @@ -0,0 +1,55 @@ +Consider this: + + $ wget http://schwinge.homeip.net/~thomas/tmp/cutpaste_filter.tar.bz2 + $ wget http://schwinge.homeip.net/~thomas/tmp/cutpaste_filter.patch + + $ tar -xj < cutpaste_filter.tar.bz2 + $ cd cutpaste_filter/ + $ ./render_locally + $ find "$PWD".rendered/ -type f -print0 | xargs -0 grep -H -E 'FOO|BAR' + [notice one FOO in there] + $ rm -rf .ikiwiki "$PWD".rendered + + $ cp /usr/share/perl5/IkiWiki/Plugin/cutpaste.pm .library/IkiWiki/Plugin/ + $ patch -p0 < ../cutpaste_filter.patch + $ ./render_locally + $ find "$PWD".rendered/ -type f -print0 | xargs -0 grep -H -E 'FOO|BAR' + [correct; notice no more FOO] + +I guess this needs a general audit -- there are other places where `preprocess` +is being doing without `filter`ing first, for example in the same file, `copy` +function. + +--[[tschwinge]] + +> So, in English, page text inside a cut directive will not be filtered. +> Because the cut directive takes the text during the scan pass, before +> filtering happens. +> +> Commit 192ce7a238af9021b0fd6dd571f22409af81ebaf and +> [[bugs/po_vs_templates]] has to do with this. +> There I decided that filter hooks should *only* act on the complete +> text of a page. +> +> I also suggested that anything that wants to reliably +> s/FOO/BAR/ should probably use a sanitize hook, not a filter hook. +> I think that would make sense in this example. +> +> I don't see any way to make cut text be filtered while satisfying these +> constraints, without removing cutpaste's ability to have forward pastes +> of text cut laster in the page. (That does seems like an increasingly +> bad idea..) --[[Joey]] + +> > OK -- so the FOO/BAR thing was only a very stripped-down example, of +> > course, and the real thing is being observed with the +> > *[[plugins/contrib/getfield]]* plugin. 
This one needs to run *before* +> > `preprocess`ing, for its `{{$page#field}}` syntax is (a) meant to be usable +> > inside ikiwiki directives, and (b) the field values are meant to still be +> > `preprocess`ed before being embedded. That's why it's using the `filter` +> > hook instead of `sanitize`. + +> > Would adding another kind of hook be a way to fix this? My idea is that +> > *cut* (and others) would then take their data not during `scan`ning, but +> > *after* `filter`ing. + +> > --[[tschwinge]] diff --git a/doc/bugs/default__95__pageext_not_working.mdwn b/doc/bugs/default__95__pageext_not_working.mdwn new file mode 100644 index 000000000..b7064206f --- /dev/null +++ b/doc/bugs/default__95__pageext_not_working.mdwn @@ -0,0 +1,16 @@ +default_pageext in the setup file does not work for me. + +I tried to set it as 'txt' and as a custom plugin I am developing but when I edit a page it only ever loads with Markdown selected. + +Yes I am only trying to set it to loaded and working plugins. + +ikiwiki version 3.20101129 + +> I've tested `default_pageext` with 3.20110124, and it works fine. +> +> It seems to me from what you describe that you expect +> it to have an effect when you go and edit an existing page. +> That's not what it's for, it only chooses the default used +> when creating a new page. +> +> Closing this bug as apparent user error. --[[Joey]] [[done]] diff --git a/doc/bugs/deletion_warnings.mdwn b/doc/bugs/deletion_warnings.mdwn new file mode 100644 index 000000000..668626b49 --- /dev/null +++ b/doc/bugs/deletion_warnings.mdwn @@ -0,0 +1,89 @@ +Seen while deleting a blog's calendar pages: + +--[[Joey]] + +[[done]] -- the new `page()` pagespec needed to check if there was a source +file for the page, and was leaking undef. + +<pre> + 427250f..ff6c054 master -> origin/master +Use of uninitialized value $file in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 688. +Use of uninitialized value $file in substitution (s///) at /usr/share/perl5/IkiWiki.pm line 668. +Use of uninitialized value $base in exists at /usr/share/perl5/IkiWiki.pm line 692. +Use of uninitialized value $file in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 688. +Use of uninitialized value $file in substitution (s///) at /usr/share/perl5/IkiWiki.pm line 668. +Use of uninitialized value $base in exists at /usr/share/perl5/IkiWiki.pm line 692. +Use of uninitialized value $file in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 688. +Use of uninitialized value $file in substitution (s///) at /usr/share/perl5/IkiWiki.pm line 668. +Use of uninitialized value $base in exists at /usr/share/perl5/IkiWiki.pm line 692. +Use of uninitialized value $file in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 688. +Use of uninitialized value $file in substitution (s///) at /usr/share/perl5/IkiWiki.pm line 668. +Use of uninitialized value $base in exists at /usr/share/perl5/IkiWiki.pm line 692. +Use of uninitialized value $file in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 688. +Use of uninitialized value $file in substitution (s///) at /usr/share/perl5/IkiWiki.pm line 668. +Use of uninitialized value $base in exists at /usr/share/perl5/IkiWiki.pm line 692. +Use of uninitialized value $file in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 688. +Use of uninitialized value $file in substitution (s///) at /usr/share/perl5/IkiWiki.pm line 668. +Use of uninitialized value $base in exists at /usr/share/perl5/IkiWiki.pm line 692. 
+Use of uninitialized value $file in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 688. +Use of uninitialized value $file in substitution (s///) at /usr/share/perl5/IkiWiki.pm line 668. +Use of uninitialized value $base in exists at /usr/share/perl5/IkiWiki.pm line 692. +Use of uninitialized value $file in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 688. +Use of uninitialized value $file in substitution (s///) at /usr/share/perl5/IkiWiki.pm line 668. +Use of uninitialized value $base in exists at /usr/share/perl5/IkiWiki.pm line 692. +Use of uninitialized value $file in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 688. +Use of uninitialized value $file in substitution (s///) at /usr/share/perl5/IkiWiki.pm line 668. +Use of uninitialized value $base in exists at /usr/share/perl5/IkiWiki.pm line 692. +Use of uninitialized value $file in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 688. +Use of uninitialized value $file in substitution (s///) at /usr/share/perl5/IkiWiki.pm line 668. +Use of uninitialized value $base in exists at /usr/share/perl5/IkiWiki.pm line 692. +Use of uninitialized value $file in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 688. +Use of uninitialized value $file in substitution (s///) at /usr/share/perl5/IkiWiki.pm line 668. +Use of uninitialized value $base in exists at /usr/share/perl5/IkiWiki.pm line 692. +Use of uninitialized value $file in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 688. +Use of uninitialized value $file in substitution (s///) at /usr/share/perl5/IkiWiki.pm line 668. +Use of uninitialized value $base in exists at /usr/share/perl5/IkiWiki.pm line 692. +Use of uninitialized value $file in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 688. +Use of uninitialized value $file in substitution (s///) at /usr/share/perl5/IkiWiki.pm line 668. +Use of uninitialized value $base in exists at /usr/share/perl5/IkiWiki.pm line 692. +Use of uninitialized value $file in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 688. +Use of uninitialized value $file in substitution (s///) at /usr/share/perl5/IkiWiki.pm line 668. +Use of uninitialized value $base in exists at /usr/share/perl5/IkiWiki.pm line 692. +Use of uninitialized value $file in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 688. +Use of uninitialized value $file in substitution (s///) at /usr/share/perl5/IkiWiki.pm line 668. +Use of uninitialized value $base in exists at /usr/share/perl5/IkiWiki.pm line 692. +Use of uninitialized value $file in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 688. +Use of uninitialized value $file in substitution (s///) at /usr/share/perl5/IkiWiki.pm line 668. +Use of uninitialized value $base in exists at /usr/share/perl5/IkiWiki.pm line 692. +Use of uninitialized value $file in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 688. +Use of uninitialized value $file in substitution (s///) at /usr/share/perl5/IkiWiki.pm line 668. +Use of uninitialized value $base in exists at /usr/share/perl5/IkiWiki.pm line 692. +Use of uninitialized value $file in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 688. +Use of uninitialized value $file in substitution (s///) at /usr/share/perl5/IkiWiki.pm line 668. +Use of uninitialized value $base in exists at /usr/share/perl5/IkiWiki.pm line 692. +Use of uninitialized value $file in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 688. 
+Use of uninitialized value $file in substitution (s///) at /usr/share/perl5/IkiWiki.pm line 668. +Use of uninitialized value $base in exists at /usr/share/perl5/IkiWiki.pm line 692. +Use of uninitialized value $file in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 688. +Use of uninitialized value $file in substitution (s///) at /usr/share/perl5/IkiWiki.pm line 668. +Use of uninitialized value $base in exists at /usr/share/perl5/IkiWiki.pm line 692. +Use of uninitialized value $file in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 688. +Use of uninitialized value $file in substitution (s///) at /usr/share/perl5/IkiWiki.pm line 668. +Use of uninitialized value $base in exists at /usr/share/perl5/IkiWiki.pm line 692. +Use of uninitialized value $file in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 688. +Use of uninitialized value $file in substitution (s///) at /usr/share/perl5/IkiWiki.pm line 668. +Use of uninitialized value $base in exists at /usr/share/perl5/IkiWiki.pm line 692. +Use of uninitialized value $file in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 688. +Use of uninitialized value $file in substitution (s///) at /usr/share/perl5/IkiWiki.pm line 668. +Use of uninitialized value $base in exists at /usr/share/perl5/IkiWiki.pm line 692. +Use of uninitialized value $file in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 688. +Use of uninitialized value $file in substitution (s///) at /usr/share/perl5/IkiWiki.pm line 668. +Use of uninitialized value $base in exists at /usr/share/perl5/IkiWiki.pm line 692. +Use of uninitialized value $file in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 688. +Use of uninitialized value $file in substitution (s///) at /usr/share/perl5/IkiWiki.pm line 668. +Use of uninitialized value $base in exists at /usr/share/perl5/IkiWiki.pm line 692. +Use of uninitialized value $file in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 688. +Use of uninitialized value $file in substitution (s///) at /usr/share/perl5/IkiWiki.pm line 668. +Use of uninitialized value $base in exists at /usr/share/perl5/IkiWiki.pm line 692. +</pre> + diff --git a/doc/bugs/depends_simple_mixup.mdwn b/doc/bugs/depends_simple_mixup.mdwn new file mode 100644 index 000000000..a5910d02e --- /dev/null +++ b/doc/bugs/depends_simple_mixup.mdwn @@ -0,0 +1,88 @@ +The [[bugs]] page, at least before I commit this, has a bug at the top that +has been modified to link to done, and ikiwiki's dependency calculations +failed to notice and update the bugs page. Looking at the indexdb, I saw +that the page was not included in the `depends_simple` of the bugs page. + +I was able to replicate the problem locally by starting off with the page +marked done (when it did appear in the bugs page `depends_simple` +(appropriatly as a link dependency, since a change to the page removing the +link would make it match)), then removing the done link. + +At that point, it vanished from `depends_simple`. Presumably because +the main (pagespec) depends for the bugs page now matched it, as a content +dependency. But, it seems to me it should still be listed in +`depends_simple` here. This, I think, is the cause of the bug. + +Then re-add the done link, and the dependency calc code breaks down, +not noticing that bugs dependeded on the page and needs to be updated. + +Ok.. Turns out this was not a problem with the actual influences +calculation or dependency calculation code. Whew! `match_link` +just didn't set the influence correctly when failing. 
fixed + +--[[Joey]] + +--- + +Update: Reopening this because the fix for it rather sucks. + +I made `match_link` return on failure an influence of +type DEPEND_LINKS. So, a tag page that inlines `tagged(foo)` +gets a `depends_simple` built up that contains link dependencies for +*every* page in the wiki. A very bloaty way to represent the dependency! + +Per [[todo/dependency_types]], `link(done)` only needs to list in +`depends_simple` the pages that currently match. If a page is modified +to add the link, the regular dependency calculation code notices that +a new page matches. If a page that had the link is modified to remove it, +the `depends_simple` lets ikiwiki remember that the now non-matching page +matched before. + +Where that fell down was `!link(done)`. A page matching that was not added +to `depends_simple`, because the `link(done)` did not match it. If the page +is modified to add the link, the regular dependency calculation code +didn't notice, since the pagespec no longer matched. + +In this case, `depends_simple` needs to contain all pages +that do *not* match `link(done)`, but before my change, it contained +all pages that *do* match. After my change, it contained all pages. + +---- + +So, seems what is needed is a way for influence info to be manipulated by +the boolean operations that are applied. One way would be to have two +sets of influences be returned, one for successful matches, and one for +failed matches. Normally, these would be the same. For successful +`match_link`, the successful influence would be the page. +For failed `match_link`, the failed influence would be the page. + +Then, when NOTting a `*Reason`, swap the two sets of influences. +When ANDing/ORing, combine the individual sets. Querying the object for +influences should return only the successful influences. + +---- + +Would it be possible to avoid the complication of maintianing two sets of +influence info? + +Well, notice that the influence of `pagespec_match($page, "link(done)")` +is $page. Iff the match succeeds. + +Also, the influence of `pagespec_match($page, "!link(done)")` is +$page. Iff the (overall) match succeeds. + +Does that hold for all cases? If so, the code that populates +`depends_simple` could just test if the pagespec was successful, and +if not, avoid adding $page influences, while still adding any other, +non-$page influences. + +---- + +Hmm, commit f2b3d1341447cbf29189ab490daae418fbe5d02d seems +thuroughly wrong. So, what about influence info for other matches +like `!author(foo)` etc? Currently, none is returned, but it should +be a content influence. (Backlink influence data seems ok.) + +---- + +[[done]] again! diff --git a/doc/bugs/disable_sub-discussion_pages.mdwn b/doc/bugs/disable_sub-discussion_pages.mdwn index 5e9c8c9f9..39d9ba528 100644 --- a/doc/bugs/disable_sub-discussion_pages.mdwn +++ b/doc/bugs/disable_sub-discussion_pages.mdwn @@ -6,6 +6,12 @@ I do want discussion subpage, but I don't want to have, for example: discussion/ > Discussion pages should clearly be a special case that don't get Discussion > links put at the top... aaand.. [[bugs/done]]! --[[Joey]] +>> This bug appears to have returned. For example, +>> [[plugins/contrib/unixauth/discussion]] has a Discussion link. -- [[schmonz]] + +>>> Lots of case issues this time. Audited for and fixed them all. [[done]] +>>> --[[Joey]] + >>> Joey, I've just seen that you closed that bug in ikiwiki 1.37, but it seems >>> you fixed it only for English "discussion" page. 
The bug still occurs >>> for the international "discussion" pages. I have backported ikiwiki 1.40 diff --git a/doc/bugs/editmessage_turned_off_in_web_interface__63__.mdwn b/doc/bugs/editmessage_turned_off_in_web_interface__63__.mdwn new file mode 100644 index 000000000..d8c6c3a08 --- /dev/null +++ b/doc/bugs/editmessage_turned_off_in_web_interface__63__.mdwn @@ -0,0 +1,10 @@ +the "Optional comment about this change:" text area is not showing up on my wiki when I edit pages. I just see the label "Optional comment about this change:" and no box in which to put the comment. + +Is it possible I turned this off by messing around with plugins? Even if so, then it's strange that I see the "optional comment" text without the corresponding text area. + +If the answer isn't immediately obvious you can see for yourself at <http://metameso.org/aa/ikiwiki.cgi?page=index&do=edit> (UN: guest PW: guest2011). + +> This happened to me. It was due to overriding either one of the ikiwiki templates based on an earlier version than current ikiwiki, or overriding style.css, instead of using local.css. It doesn't look like you are doing the former. Are you overriding the ikiwiki template dir with an out-of-date editpagel template? -- [[Jon]] + +>> Yes, every time I've diagnosed this, it was an old page.tmpl. [[done]] +>> --[[Joey]] diff --git a/doc/bugs/enumerations_of_dates_not_formatted_correctly.mdwn b/doc/bugs/enumerations_of_dates_not_formatted_correctly.mdwn new file mode 100644 index 000000000..263ddd78b --- /dev/null +++ b/doc/bugs/enumerations_of_dates_not_formatted_correctly.mdwn @@ -0,0 +1,43 @@ +When an enumeration contains entries starting with ordinal numbers, e.g., for lists of meeting dates, ikiwiki turns them all into the 1st. + +Testcase: + +*The following lists should read: 1. January, 27. March, 99. November, 42. April* +**But instead it reads:** + +* 1. January +* 27. March +* 99. November +* 42. April + +> That's a consequence of Markdown syntax. The syntax for ordered lists +> (HTML `<ol>`) in Markdown is to use arbitrary numeric prefixes in that style, +> so your text gets parsed as: +> +> <ul> +> <li> +> <ol> +> <li>January</li> +> </ol> +> </li> +> ... +> +> You can avoid that interpretation by escaping the dot with a backslash +> (`1\. January`) like so: +> +> * 1\. January +> * 27\. March +> +> or by writing "1st January" and so on. --[[smcv]] + +>> I think that this is a bug in Text::Markdown (and probably other +>> versions of markdown). The [markdown spec)(http://daringfireball.net/projects/markdown/syntax.text), +>> though unmaintained and bitrotted into near illegibility, seems to say +>> that list items can only be preceeded by whitespace: +>> +>>> "List markers typically start at the left margin, but may be indented by +>>> up to three spaces." +>> +>> So "* * * 1. 2. 3." should not be parsed as a deeply nested list. +>> +>> Forwarded to [upsteam RT](https://rt.cpan.org/Ticket/Display.html?id=65116). [[done]] --[[Joey]] diff --git a/doc/bugs/external_links_inside_headings_don__39__t_work.mdwn b/doc/bugs/external_links_inside_headings_don__39__t_work.mdwn index 5bab283fd..51d6ad475 100644 --- a/doc/bugs/external_links_inside_headings_don__39__t_work.mdwn +++ b/doc/bugs/external_links_inside_headings_don__39__t_work.mdwn @@ -21,4 +21,4 @@ It works fine with h2 and deeper. 
The square brackets also appear in the output > [[done]] --[[Joey]] ->> It works here but it definitely does *not* work on my wiki; but on further experimentation, I believe my problem is being caused by JasonBlevins' [h1title](http://code.jblevins.org/ikiwiki/plugins.git/plain/h1title.pm) plugin. +>> It works here but it definitely does *not* work on my wiki; but on further experimentation, I believe my problem is being caused by JasonBlevins' [h1title](http://jblevins.org/git/ikiwiki/plugins.git/plain/h1title.pm) plugin. diff --git a/doc/bugs/filecheck_failing_to_find_files.mdwn b/doc/bugs/filecheck_failing_to_find_files.mdwn new file mode 100644 index 000000000..6501508e4 --- /dev/null +++ b/doc/bugs/filecheck_failing_to_find_files.mdwn @@ -0,0 +1,65 @@ +Using the attachment plugin, when filecheck was checking the mime-type of the attachment before allowing the attachment to be removed, it was returning with an error saying that the mime-type of the file was "unknown" (when the mime-type definitely was known!) + +It turns out that the filecheck plugin couldn't find the file, because it was merely using the $pagesources hash, rather than finding the absolute path of the file in question. + +> I don't understand why the file was not in `%pagesources`. Do you? +> --[[Joey]] + +>> The file *was* in `%pagesources`, but what returns from that is the filename relative to the `srcdir` directory; for example, `foo/bar.gif`. +>> When File::MimeInfo::Magic::magic is given that, it can't find the file. +>> But if it is given `/path/to/srcdir/foo/bar.gif` instead, then it *can* find the file, and returns the mime-type correctly. +>> --[[KathrynAndersen]] + +>>> Ok, so it's not removal specific, can in fact be triggered by using +>>> testpagespec (or really anything besides attachment, which passes +>>> the filename parameter). Nor is it limited to mimetype, all the tests in +>>> filecheck have the problem. --[[Joey]] + +>>>> Alas, not fixed. It seems I was mistaken in some of my assumptions. +>>>> It still happens when attempting to remove attachments. +>>>> With your fix, the `IkiWiki::srcfile` function is only called when the filename is not passed in, but it appears that in the case of removing attachments, the filename IS passed in, but it is the relative filename as mentioned above. Thus, the file is still not found, and the mime-type comes back as unknown. +>>>> The reason my patch worked is because, rather than checking whether a filename was passed in before applying IkiWiki::srcfile to the filename, it checks whether the file can be found, and if it cannot be found, then it applies IkiWiki::srcfile to the filename. +>>>> --[[KathrynAndersen]] + +>>>>> Can you test if this patch fixes that? --[[Joey]] + +>>>>>> Yes, it works! --[[KathrynAndersen]] + +applied && [[done]] + +<pre> +diff --git a/IkiWiki/Plugin/remove.pm b/IkiWiki/Plugin/remove.pm +index f59d026..0fc180f 100644 +--- a/IkiWiki/Plugin/remove.pm ++++ b/IkiWiki/Plugin/remove.pm +@@ -49,7 +49,7 @@ sub check_canremove ($$$) { + # This is sorta overkill, but better safe than sorry. + if (! 
defined pagetype($pagesources{$page})) { + if (IkiWiki::Plugin::attachment->can("check_canattach")) { +- IkiWiki::Plugin::attachment::check_canattach($session, $page, $file); ++ IkiWiki::Plugin::attachment::check_canattach($session, $page, "$config{srcdir}/$file"); + } + else { + error("removal of attachments is not allowed"); +diff --git a/IkiWiki/Plugin/rename.pm b/IkiWiki/Plugin/rename.pm +index 3908443..1a9da63 100644 +--- a/IkiWiki/Plugin/rename.pm ++++ b/IkiWiki/Plugin/rename.pm +@@ -50,7 +50,7 @@ sub check_canrename ($$$$$$) { + IkiWiki::check_canedit($src, $q, $session); + if ($attachment) { + if (IkiWiki::Plugin::attachment->can("check_canattach")) { +- IkiWiki::Plugin::attachment::check_canattach($session, $src, $srcfile); ++ IkiWiki::Plugin::attachment::check_canattach($session, $src, "$config{srcdir}/$srcfile"); + } + else { + error("renaming of attachments is not allowed"); +@@ -85,7 +85,7 @@ sub check_canrename ($$$$$$) { + if ($attachment) { + # Note that $srcfile is used here, not $destfile, + # because it wants the current file, to check it. +- IkiWiki::Plugin::attachment::check_canattach($session, $dest, $srcfile); ++ IkiWiki::Plugin::attachment::check_canattach($session, $dest, "$config{srcdir}/$srcfile"); + } + } +</pre> diff --git a/doc/bugs/firefox_doesn__39__t_want_to_load_updated_pages_at_ikiwiki.info.mdwn b/doc/bugs/firefox_doesn__39__t_want_to_load_updated_pages_at_ikiwiki.info.mdwn index 8cb47f864..558eb90c8 100644 --- a/doc/bugs/firefox_doesn__39__t_want_to_load_updated_pages_at_ikiwiki.info.mdwn +++ b/doc/bugs/firefox_doesn__39__t_want_to_load_updated_pages_at_ikiwiki.info.mdwn @@ -3,3 +3,12 @@ I'm using firefox-3.0.8-alt0.M41.1 (Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1 Only explicitly pressing "reload" helps. Is it a bug? I haven't been noticing such problems usually on other sites. --Ivan Z. + +This remains to be true now, with Epiphany 2.26.3 (Mozilla/5.0 (X11; U; Linux i686; en; rv:1.9.1.4pre) Gecko/20080528 Epiphany/2.22 Firefox/3.5). --Ivan Z. + +> In the most recent ikiwiki release, I added a Cache-Control hack +> explicitly to work around firefox's broken over-caching. +> +> (When I tested epiphany and chromium, neither had firefox's problem.) +> +> [[!tag done]] diff --git a/doc/bugs/git.pm_should_prune_remote_branches_when_fetching.mdwn b/doc/bugs/git.pm_should_prune_remote_branches_when_fetching.mdwn new file mode 100644 index 000000000..5dc4250e3 --- /dev/null +++ b/doc/bugs/git.pm_should_prune_remote_branches_when_fetching.mdwn @@ -0,0 +1,14 @@ +The _git_ module does not appear ever to prune obsolete remote branches in the _srcdir_ repository, leading to spurious errors when fetching. + +Pruning remote branches can be done automatically with the --prune option to "git fetch" or in a separate command "git remote prune". + +--[[blipvert]] + +> I'll need more information than that before I add extra processing +> work to the current git commands it uses. I don't see any errors here +> from obsolete remote branches. --[[Joey]] + +Suppose a remote repository contains a branch named "foo", and you fetch from it. Then, someone renames that branch to "foo/bar". The next time you fetch from that repository, you will get an error because the obsolete branch "foo" is blocking the branch "foo/bar" from being created (due to the way git stores refs for branches). Pruning gets around the problem. 
It doesn't really add much overhead to the fetch, and in fact it can *save* overhead since obsolete branches do consume resources (any commits they point to cannot be garbage collected). --[[blipvert]] + +> Ok, so git pull --prune can be used to do everything in one command. +> [[done]] --[[Joey]] diff --git a/doc/bugs/git_commit_adds_files_that_were_not_tracked.mdwn b/doc/bugs/git_commit_adds_files_that_were_not_tracked.mdwn new file mode 100644 index 000000000..587650c61 --- /dev/null +++ b/doc/bugs/git_commit_adds_files_that_were_not_tracked.mdwn @@ -0,0 +1,19 @@ +Commit 3650d0265bc501219bc6d5cd4fa91a6b6ecd793a seems to have been caused by +a bug in ikiwiki. recentchanges/* was added to the git repo incorrectly. + +Part of the problem seems to be that git's `rcs_commit` does a git add followed +by a `rcs_commit_staged`, and so calling `rcs_commit` on files that were +not checked in before adds them, incorrectly. + +I'm unsure yet why the recentchanges files were being committed. Possibly +because of the link fixup code run when renaming a page. --[[Joey]] + +> See also [[bugs/rename fixup not attributed to author]]. --[[smcv]] + +> Ok, there was a call to `rcs_commit` that was still using non-named +> parameters, which confused it thuroughly, and I think caused +> 'git add .' to be called. I've fixed that. +> +> I think there is still potential for the problem I described above to +> occur during a rename or possibly otherwise. Ok.. fixed `rcs_commit` +> to not add too. [[done]] --[[Joey]] diff --git a/doc/bugs/git_stderr_output_causes_problems.mdwn b/doc/bugs/git_stderr_output_causes_problems.mdwn index c25ef6927..d8e14db42 100644 --- a/doc/bugs/git_stderr_output_causes_problems.mdwn +++ b/doc/bugs/git_stderr_output_causes_problems.mdwn @@ -40,3 +40,6 @@ Ikiwiki's git handling is sending a bunch of output to stderr. The following pa >> I'm happy with the wrapper script solution, so this is [[done]]. >> And this report is now here to point others to that solution. + +This is also useful when running ikiwiki behind a nginx proxy, because nginx +considers this stderr as invalid headers and reports a server error. -- [[nil]] diff --git a/doc/bugs/gitremotes_script_picks_up_tags_from_anywhere.mdwn b/doc/bugs/gitremotes_script_picks_up_tags_from_anywhere.mdwn new file mode 100644 index 000000000..9bd8938c5 --- /dev/null +++ b/doc/bugs/gitremotes_script_picks_up_tags_from_anywhere.mdwn @@ -0,0 +1,22 @@ +[[!tag patch]] +[[!template id=gitbranch branch=smcv/ready/no-tags author="[[smcv]]"]] + +The `gitremotes` script picks up tags from any repository, including those +used for local .debs that were never actually present in Debian: + + smcv@reptile% git tag | grep -c nmu + 52 + +This can be avoided with the `tagopt = --no-tags` option in .git/config; +see <http://git.pseudorandom.co.uk/smcv/ikiwiki.git?a=shortlog;h=refs/heads/ready/no-tags> + +> *done* thanks. Also cleared propigated tags out of origin. +> +> Hmm, in testing I still see tags get pulled the first time a remote +> is added. If those are then locally deleted, it doesn't pull them again +> with the `--no-tags`. +> --[[Joey]] + +>> Oh, I see why. Try the same branch again... --[[smcv]] + +>>> [[done]] --[[Joey]] diff --git a/doc/bugs/html5_support.mdwn b/doc/bugs/html5_support.mdwn index 239474275..ba67d532b 100644 --- a/doc/bugs/html5_support.mdwn +++ b/doc/bugs/html5_support.mdwn @@ -9,10 +9,59 @@ HTML5](http://www.w3.org/TR/html5-diff/). 
* [ikiwiki instance with HTML5 templates](http://natalian.org) * [HTML5 outliner tool](http://gsnedders.html5.org/outliner/) -- to check you have the structure of your markup correct +> Kai, thanks enormously for working on this. I switched a page to +> the html5 doctype today, and was rather pleasently suprised that it +> validated, except for the new Cache-Control meta tag. Now I see you're +> well ahead of me. --[[Joey]] +> +> So, how should ikiwiki support html5? There are basically 3 approaches: +> +> 1. Allow users to add html5 tags to their existing xhtml pages. +> What has been done so far, can be extended. Basically works +> in browsers, if you don't care about standards. A good prerequisite +> for anything else, anyway. +> 2. Have both a html5 and a xhtml mode, allow user to select. +> 3. Switch to html5 in eg, ikiwiki 4; users have to deal with +> any custom markup on their pages/templates that breaks then. +> +> The second option seems fairly tractable from what I see here and in +> your branch. You made only relatively minor changes to 10 templates. +> It would probably not be too dreadful to put them in ifdefs. I've made a +> small start at doing that. +> +> I've made ikiwiki use the time element and all the new semantic elements +> in html5 mode. +> +> Other ideas: +> +> * Use details tag instead of the javascript in the toggle plugin. +> (Need to wait on browser support probably.) +> * Use figure and figcaption for captions in img. However, I have not +> managed to style it to look as good as the current table+caption +> approach. +> +> --[[Joey]] + # htmlscrubber.pm needs to not scrub new HTML5 elements * [new elements](http://www.w3.org/TR/html5-diff/#new-elements) +> Many added now. +> +> Things I left out, too hard to understand today: +> Attributes contenteditable, +> data-\*, draggable, role, aria-\*. +> Tags command, keygen, output. +> +> Clearly unsafe: embed. +> +> Apparently cannot be used w/o javascript: menu. +> +> I have not added the new `ping` attribute, because parsing a +> space-separeated list of urls to avoid javascript injection is annoying, +> and the attribute seems generally dubious. +> --[[Joey]] + # HTML5 Validation and t/html.t [validator.nu](http://validator.nu/) is the authorative HTML5 validator, @@ -25,6 +74,9 @@ In the future, hopefully ikiwiki can test for valid HTML5 using [Relax NG schema](http://syntax.whattf.org/) using a Debian package tool [rnv](http://packages.qa.debian.org/r/rnv.html). +> Validation in the test suite is nice, but I am willing to lose those +> tests for a while. --[[Joey]] + # HTML5 migration issues # [article](http://www.whatwg.org/specs/web-apps/current-work/multipage/semantics.html#the-article-element) element @@ -37,6 +89,8 @@ This element is poorly supported by browsers. As a workaround, `style.css` needs Internet Explorer will display it as a block, though you can't seem to be able to further control the style. +> done (needed for header too) --[[Joey]] + ## Time element The [time element](http://www.whatwg.org/specs/web-apps/current-work/multipage/text-level-semantics.html#the-time-element) ideally needs the datatime= attribute set by a template variable with what [HTML5 defines as a valid datetime string](http://www.whatwg.org/specs/web-apps/current-work/multipage/infrastructure.html#valid-global-date-and-time-string). 
@@ -45,3 +99,19 @@ As a workaround: au:~% grep timeformat natalian.setup timeformat => '%Y-%m-%d', + +> Also, the [[plugins/relativedate]] plugin needs to be updated to +> support relatatizing the contents of time elements. --[[Joey]] + +> Done and done; in html5 mode it uses the time tag, and even +> adds pubdate when displaying ctimes. --[[Joey]] + +## tidy plugin + +Will reformat html5 to html4. + +---- + + +Ok, I consider this [[done]], at least as a first pass. Html5 mode +is experimental, but complete enough. --[[Joey]] diff --git a/doc/bugs/html5_time_element__39__s_pubdate_wrong_when_using_xhtml5___34__mode__34__.mdwn b/doc/bugs/html5_time_element__39__s_pubdate_wrong_when_using_xhtml5___34__mode__34__.mdwn new file mode 100644 index 000000000..def5bcc2a --- /dev/null +++ b/doc/bugs/html5_time_element__39__s_pubdate_wrong_when_using_xhtml5___34__mode__34__.mdwn @@ -0,0 +1,46 @@ +Hi, + +XML error: + + Created <time datetime="2009-03-24T18:02:14Z" pubdate class="relativedate" title="Tue, 24 Mar 2009 14:02:14 -0400">2009-03-24</time> + +The pubdate REQUIRES a date, so e.g. `pubdate="2009-03-24T18:02:14Z"` + +> No, `pubdate="pubdate"`. It's a boolean attribute. applied && [[done]] +> --[[Joey]] +>> awesome, thanks for fixing my fix ;) --[[simonraven]] + +Otherwise the XML parser chokes. + +<http://www.whatwg.org/specs/web-apps/current-work/multipage/text-level-semantics.html#attr-time-pubdate> + +(indented exactly 4 spaces) + +<pre> + diff --git a/IkiWiki.pm b/IkiWiki.pm + index 1f2ab07..6ab5b56 100644 + --- a/IkiWiki.pm + +++ b/IkiWiki.pm + @@ -1004,7 +1004,7 @@ sub displaytime ($;$$) { + my $time=formattime($_[0], $_[1]); + if ($config{html5}) { + return '<time datetime="'.date_3339($_[0]).'"'. + - ($_[2] ? ' pubdate' : ''). + + ($_[2] ? ' pubdate="'.date_3339($_[0]).'"' : ''). + '>'.$time.'</time>'; + } + else { + diff --git a/IkiWiki/Plugin/relativedate.pm b/IkiWiki/Plugin/relativedate.pm + index fe8ef09..8c4a1b4 100644 + --- a/IkiWiki/Plugin/relativedate.pm + +++ b/IkiWiki/Plugin/relativedate.pm + @@ -59,7 +59,7 @@ sub mydisplaytime ($;$$) { + + if ($config{html5}) { + return '<time datetime="'.IkiWiki::date_3339($time).'"'. + - ($pubdate ? ' pubdate' : '').$mid.'</time>'; + + ($pubdate ? ' pubdate="'.IkiWiki::date_3339($time).'"' : '').$mid.'</time>'; + } + else { + return '<span'.$mid.'</span>'; +</pre> diff --git a/doc/bugs/htmlbalance_fails_with_HTML-Tree_v4.mdwn b/doc/bugs/htmlbalance_fails_with_HTML-Tree_v4.mdwn new file mode 100644 index 000000000..92427065d --- /dev/null +++ b/doc/bugs/htmlbalance_fails_with_HTML-Tree_v4.mdwn @@ -0,0 +1,18 @@ +[[!template id=gitbranch branch=smcv/ready/htmlbalance author="[[smcv]]"]] +[[!tag patch]] + +My one-patch htmlbalance branch fixes incompatibility with HTML::Tree 4.0. +From the git commit: + + The HTML::Tree changelog says: + + [THINGS THAT MAY BREAK YOUR CODE OR TESTS] + ... + * Attribute names are now validated in as_XML and invalid names will + cause an error. + + and indeed the regression tests do get an error. + +--[[smcv]] + +[[done]] --[[Joey]] diff --git a/doc/bugs/htmlscrubber_breaks_multimarkdown_footnotes.mdwn b/doc/bugs/htmlscrubber_breaks_multimarkdown_footnotes.mdwn new file mode 100644 index 000000000..343037b45 --- /dev/null +++ b/doc/bugs/htmlscrubber_breaks_multimarkdown_footnotes.mdwn @@ -0,0 +1,18 @@ +I enabled multimarkdown to make use of footnotes in my file. I have the multimarkdown plugin, +as well as the command-line program. 
If I write a document with footnotes: + + This line has a footnote[^1] + + [^1]: this is the footnote + +and compile it from the cli, the reference becomes a link to the footnote and the footnote +gets a backreferencing link appended. When compiled in ikiwiki with the goodstuff plugin +enabled, the links are created but their hrefs are empty (so they do not actually act as links). +Disabling the htmlscrubber plugin fixes this issue + +[[!tag multimarkdown htmlscrubber]] + +> href was of the form: #fnref:1 , scrubbed by overzealous protocol +> scrubbing. + +[[done]] --[[Joey]] diff --git a/doc/bugs/http_proxy_for_openid.mdwn b/doc/bugs/http_proxy_for_openid.mdwn index 3d0c99b83..dac4d2736 100644 --- a/doc/bugs/http_proxy_for_openid.mdwn +++ b/doc/bugs/http_proxy_for_openid.mdwn @@ -22,8 +22,7 @@ Note that using $ua->proxy(['https'], ...); won't work, you get a "Not Implement Also note that the proxy won't work with liblwpx-paranoidagent-perl, I had to remove liblwpx-paranoidagent-perl first. -Please get the patch from the *.mdwn source. - +<pre> louie:/usr/share/perl5/IkiWiki/Plugin# diff -u openid.pm.old openid.pm --- openid.pm.old 2008-10-26 12:18:58.094489360 +1100 +++ openid.pm 2008-10-26 12:40:05.763429880 +1100 @@ -42,6 +41,11 @@ louie:/usr/share/perl5/IkiWiki/Plugin# diff -u openid.pm.old openid.pm # Store the secret in the session. my $secret=$session->param("openid_secret"); if (! defined $secret) { - +</pre> Brian May + +> Rather than adding config file settings for every useful environment +> variable, there is a ENV config file setting that can be used to set +> any environment variables you like. So, no changed needed. [[done]] +> --[[Joey]] diff --git a/doc/bugs/httpauth_conflicts_with_git_anon_push.mdwn b/doc/bugs/httpauth_conflicts_with_git_anon_push.mdwn new file mode 100644 index 000000000..91507f57a --- /dev/null +++ b/doc/bugs/httpauth_conflicts_with_git_anon_push.mdwn @@ -0,0 +1,25 @@ +Someone tried to report a bug using IRC while I was on vacation. +--[[Joey]] + +<pre> +julm: [11:58:35] han, it's me the problem; I was generating a post-update hook instead of a pre-receive hook +julm: [12:03:59] why does the pre-receive hook return: "Status: 302 Found" and a "Location: <url>"? Is it normal? +julm: [00:08:44] it's Plugin/httpauth.pm which is outputing those Status and Location +julm: [00:09:12] problem is that it's an anonymous push via git:// +julm: [03:28:53] hacked my way to fix it somehow: http://git.internet.alpes.fr.eu.org/?p=web/ikiwiki.git;a=commitdiff;h=7211df4f7457c3afab53822a97cbd21825c473f4 +</pre> + +Analysis: + +* IkiWiki::Receive calls `check_canedit`. +* httpauth's canedit hook returns an error handler function + which redirects the browser through the cgiauthurl. + (Similarly, signinedit's hook may call needsignin, which + can display a signin form. +* That doesn't work well when doing a git anon push. :) +* Also, IkiWiki::Receive calls `check_canattach` and + `check_canremove`, which both also call `check_canedit`. + +So, all these calls need to avoid running the error handler +functions returned by canedit hooks, and just return error +messages. 
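To make the fix concrete, here is a minimal sketch (not the actual `IkiWiki::Receive` code; the helper name and exact return handling are illustrative assumptions) of how a non-interactive caller could consume `canedit` hook results, turning the "run this to let the user sign in" coderef case into a plain refusal instead of executing it:

    # Hypothetical helper, assuming the canedit hook convention described
    # above: a hook returns undef/"" when the edit is allowed, an error
    # message string when it is not, or a coderef that would normally
    # redirect or show a signin form when run from the CGI.
    sub check_canedit_noninteractive {
        my ($page, $cgi, $session) = @_;
        my @problems;
        IkiWiki::run_hooks(canedit => sub {
            my $ret = shift->($page, $cgi, $session);
            return unless defined $ret && (ref $ret || length $ret);
            if (ref $ret eq 'CODE') {
                # Running this would try to interact with a browser,
                # which makes no sense during a git anon push.
                push @problems, "you are not allowed to change $page";
            }
            else {
                push @problems, $ret;
            }
        });
        return @problems ? join("; ", @problems) : undef;
    }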
[[done]] --[[Joey]] diff --git a/doc/bugs/ikiwiki-transition_does_not_set_perl_moduels_path_properly.mdwn b/doc/bugs/ikiwiki-transition_does_not_set_perl_moduels_path_properly.mdwn new file mode 100644 index 000000000..b3e87b529 --- /dev/null +++ b/doc/bugs/ikiwiki-transition_does_not_set_perl_moduels_path_properly.mdwn @@ -0,0 +1,17 @@ +When installing ikiwiki the perl module path is setup correctly + + use lib '/usr/local/ikiwiki-3.20100312/share/perl/5.10.0'; + +This is not true for ikiwiki-transition: + + $ PATH=/usr/local/ikiwiki-3.20100312/bin ikiwiki-transition prefix_directives ikiwiki.setup + Can't locate IkiWiki.pm in @INC (@INC contains: /etc/perl /usr/local/lib/perl/5.10.0 + /usr/local/share/perl/5.10.0 /usr/lib/perl5 /usr/share/perl5 /usr/lib/perl/5.10 /usr/share/perl/5.10 /usr/local/lib/site_perl .) + at /usr/local/ikiwiki-3.20100312/bin/ikiwiki-transition line 4. + BEGIN failed--compilation aborted at /usr/local/ikiwiki-3.20100312/bin/ikiwiki-transition line 4. + +The missing line should be added. + +Thanks! + +[[done]] --[[Joey]] diff --git a/doc/bugs/ikiwiki_ignores_PATH_environment.mdwn b/doc/bugs/ikiwiki_ignores_PATH_environment.mdwn new file mode 100644 index 000000000..6781d4b4b --- /dev/null +++ b/doc/bugs/ikiwiki_ignores_PATH_environment.mdwn @@ -0,0 +1,24 @@ +At the very top of the main ikiwiki executable script the `PATH` environment is set like this: + + $ENV{PATH}="/usr/local/bin:/usr/bin:/bin:/opt/local/bin"; + +This makes it a little hard to specify which specific binaries should be used, especially if there is more than one of them available (see c.f. <http://trac.macports.org/ticket/26333> where the MacPorts-supplied, up-to-date subversion should be used and not an arcane one from the base distro / OS). Is there a specific reason why ikiwiki wipes out `$PATH` like this or could that line be improved to + + $ENV{PATH}="$ENV{PATH}:/usr/local/bin:/usr/bin:/bin:/opt/local/bin"; + +? The alternative is of course to patch ikiwiki as suggested in the bug, but I wanted to ask here first :) + +> You can use the ENV setting in your setup file to set any environment +> variables you like. Since ikiwiki.cgi is run by the web browser, that +> is the best way to ensure ikiwiki always runs with a given variable set. +> +> As a suid program, the ikiwiki wrappers have to sanitize the environment. +> The ikiwiki script's own sanitization of PATH was done to make perl taint +> checking happy, but as taint checking is disabled anyway, I have removed +> that. [[done]] --[[Joey]] + +Question: Do ikiwiki.cgi and the RCS post-commit script sanitize the $PATH separately from bin/ikiwiki? If not, then bin/ikiwiki is probably right to sanitize the $PATH; otherwise you've created a security hole with access to the account that ikiwiki is SUID to. It'd be nice if /opt/local/bin were earlier in the $PATH, but that can be changed (as noted) in the setup file. [[Glenn|geychaner@mac.com]] (Also the person who started this by filing an issue with MacPorts; I'm experimenting with ikiwiki for collaborative documentation.) + +> The suid wrappers remove all environment variables except for a few used +> for CGI. PATH is not propigated by them, so when they run ikiwiki it will +> get the system's default path now. 
--[[Joey]] diff --git a/doc/bugs/ikiwiki_lacks_a_--quiet.mdwn b/doc/bugs/ikiwiki_lacks_a_--quiet.mdwn new file mode 100644 index 000000000..48fa3b068 --- /dev/null +++ b/doc/bugs/ikiwiki_lacks_a_--quiet.mdwn @@ -0,0 +1,29 @@ +When building ikiwiki in the background, having a --quiet which will only +report errors would be nice. -- RichiH + +> Except for building wrappers, and possibly progress cruft output to +> stderr by git (gag), ikiwiki is quiet by default. --[[Joey]] + +>> Correct, which means it's not quite quiet: + + % ikiwiki --setup foo --rebuild + generating wrappers.. + successfully generated foo + successfully generated foo + rebuilding wiki.. + scanning [...] + [...] + building [...] + [...] + done + +Yes, I can simply redirect the output, but an option would be cleaner, imo. -- Richard + +> The output above looks like verbose mode output to me (the scanning/building lines, at least). Check you haven't enabled it in your setup file by accident. I get the following: + + $ ikiwiki --setup setup + successfully generated [cgi] + successfully generated [post-update] + skipping bad filename [...] + +> I've written a patch ([[merged|done]]), pull request sent) that fixes the 'generated...' lines. -- [[Jon]] diff --git a/doc/bugs/img_plugin_and_class_attr.mdwn b/doc/bugs/img_plugin_and_class_attr.mdwn new file mode 100644 index 000000000..7e880b4fc --- /dev/null +++ b/doc/bugs/img_plugin_and_class_attr.mdwn @@ -0,0 +1,27 @@ +The [[plugins/img]] plugin is not generating the proper `class` +attribute in its HTML output. + +The plugin receives something like the following: + + \[[!img 129199047595759991.jpg class="centered"]] + +And is supossed to generate an HTML code like the following: + + <img src="129199047595759991.jpg" class="centered" /> + +But is generating the following + + <img src="129199047595759991.jpg" class="centered img" /> + +This seems to be happening with all images inserted using the plugin (that use +the `class=yaddayadda` argument to the `img` directive.) I remember it didn't +happen before, and I suspect an ikiwiki upgrade is to blame. I tested with a +blog created from scratch, and a single post, and the problem appeared there +too. + +This is happening with version 3.20100815 of ikiwiki. + +[[jerojasro]] + +> How is this a bug? It's perfectly legal html for a class attribute to +> put an element into multiple classes. [[notabug|done]] --[[Joey]] diff --git a/doc/bugs/img_plugin_and_missing_heigth_value.mdwn b/doc/bugs/img_plugin_and_missing_heigth_value.mdwn new file mode 100644 index 000000000..4bc070c95 --- /dev/null +++ b/doc/bugs/img_plugin_and_missing_heigth_value.mdwn @@ -0,0 +1,7 @@ +When I set up my picture page with `\[[!img defaults size=300x]]` then the html validator complains that the value for height is missing and the IE browsers won't show the pictures up at all; no problems with ff tho. If I set up my picture page with `\[[!img defaults size=300x300]]` then all the images are funny stretched. What am I doing wrong? + +> This is a bug. --[[Joey]] + +> And .. [[fixed|done]] --[[Joey]] + +>> Not quite; For some reason it requires me to update wiki pages twice before the height value shows up. diff --git a/doc/bugs/img_vs_align.mdwn b/doc/bugs/img_vs_align.mdwn new file mode 100644 index 000000000..c78465a37 --- /dev/null +++ b/doc/bugs/img_vs_align.mdwn @@ -0,0 +1,38 @@ +The *[[ikiwiki/directive/img]]* directive allows for specifying an +*align* parameter -- which is of limited usability as the image is +embedded as `<p><img ...></p>`. 
That's at least what I see on +<http://www.bddebian.com:8888/~hurd-web/hurd/status/>. On the other +hand, CSS is supposed to be used instead, I guess. (But how... I forgot +almost of my CSS foo again ;-) it seems.) --[[tschwinge]] + +> [[!img logo/ikiwiki.png align=right]]The [img tag doesn't create P tags](http://git.ikiwiki.info/?p=ikiwiki;a=blob;f=IkiWiki/Plugin/img.pm;h=32023fa97af8ba8e63192cacaff10a4677d20654;hb=HEAD), but if you have surrounded the img directive with newlines, they will result in paragraph tags. +> +> I've edited the URL you provided to demonstrate this -- hope you don't mind! I've also added an inline, right-aligned image to this page.[[!tag done]] +> -- [[Jon]] + +> Contrary to all of the above, html does not care about P tags when +> floating an image to the left or right via align. Proof: +> <http://kitenet.net/~joey/pics/toomanypicturesofjoey/>, where the image +> is in its own paragraph but still floats. Also, I re-modified a local +> copy of the hurd page to enclose the image in a P, and it still floats. +> +> Tested with Chromium and Firefox. --[[Joey]] + +>> Uh, sorry for not confirming what I supposed to be with looking into +>> the relevant standard. It just seemed too obvious to me that the +>> closure of `<p>...</p>` would confine whatever embedded stuff may be +>> doing. (Meaning, I didn't expect that the *img*'s alignment would +>> propagate to the *p*'s and would thus be visible from the outside.) +>> +>> I confirm (Firefox, Ubuntu jaunty) that your picture page is being +>> shown correctly -- thus I suppose that there's a buglet in our CSS +>> scripts again... +>> +>> --[[tschwinge]] + +>>> It seems, the 'align=right' parameter gets filtered in my installation +>>> Are there other plugins, that could throw the parameter away? +>>> --[[jwalzer]] + +>>>> Can't think of anything. htmlscrubber doesn't; tidy doesn't. +>>>> --[[Joey]] diff --git a/doc/bugs/inline_action_buttons_circumvent_exclude_criteria_from_edittemplate__39__s_match__61____34____34___pagespec.mdwn b/doc/bugs/inline_action_buttons_circumvent_exclude_criteria_from_edittemplate__39__s_match__61____34____34___pagespec.mdwn new file mode 100644 index 000000000..2e2d35381 --- /dev/null +++ b/doc/bugs/inline_action_buttons_circumvent_exclude_criteria_from_edittemplate__39__s_match__61____34____34___pagespec.mdwn @@ -0,0 +1,15 @@ +ikiwiki verison: 3.20100815.2 + +If I instruct editemplate to only match the top level pages in a directory using + + match="foo/* and !foo/*/* and !foo/*/*/*" + +everything works as expected for pages created via links on other wiki pages. So, if I open foo/bar (or any other page on the wiki) and create a link to foo/bar/bug, edittemplate appropriately does not insert any text into the new page. + +However, if I use an inline directive like the following + + !inline pages="page(foo/bar/*)" rootpage="foo/bar" postform=yes actions=yes + +every page created via the action buttons incorrectly pulls in the text from the edittemplate registration. Changing the order of the conditions in the match="" pagespec has no impact. 
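For reference, the registration being bypassed presumably looks something like the following (the template name here is a placeholder, not taken from the report):

    \[[!edittemplate template="templates/foo-toplevel" match="foo/* and !foo/*/* and !foo/*/*/*"]]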
+ +> [[fixed|done]] --[[Joey]] diff --git a/doc/bugs/inline_archive_crash.mdwn b/doc/bugs/inline_archive_crash.mdwn new file mode 100644 index 000000000..a1139a9bc --- /dev/null +++ b/doc/bugs/inline_archive_crash.mdwn @@ -0,0 +1,6 @@ + \[[!inline Error: Undefined subroutine &HTML::Entities::encode_numeric called at /usr/share/perl5/IkiWiki/Plugin/meta.pm line 292.]] + +This occurred when recentchanges was disabled and building a change +to a page that inlined other pages with archive=yes. I have +committed a fix; filing a bug since the fix won't be landing in Debian stable any +time soon. [[done]] --[[Joey]] diff --git a/doc/bugs/inline_breaks_PERMALINK_variable.mdwn b/doc/bugs/inline_breaks_PERMALINK_variable.mdwn new file mode 100644 index 000000000..fc891bb25 --- /dev/null +++ b/doc/bugs/inline_breaks_PERMALINK_variable.mdwn @@ -0,0 +1,25 @@ +in 3.20091017 the following inline + +> `\[[!inline pages="internal(foo/bar/baz/*)" show=3 archive="yes" feeds="no" template="sometemplates"]]` + +with sometemplate being + +> `<p><a href="<TMPL_VAR PERMALINK>"><TMPL_VAR TITLE></a> (<TMPL_VAR CTIME>)</p>` + +produced output that links nowhere (`<a href="">`) while the other variables do fine. This problem does not occur in 3.1415926. + +> This must be caused by an optimisation that avoids reading the page +> content when using a template that does not use CONTENT. +> +> I guess that it needs to instead check all the variables the template +> uses, and read content if PERMALINK, or probably any other unknown +> variable is used. Unfortunatly, that will lose the optimisation +> for the archivepage template as well -- it also uses PERMALINK. +> +> So, revert the optimisation? Or, make meta gather the permalink +> data on scan? That seems doable, but is not a general fix for +> other stuff that might be a) used in a template and b) gathered +> at preprocess time. +> +> For now, I am going with the special case fix of fixing meta. I may need +> to go for a more general fix later. --[[Joey]] [[!tag done]] diff --git a/doc/bugs/inline_plugin_sets_editurl_even_when_editpage_is_disabled.html b/doc/bugs/inline_plugin_sets_editurl_even_when_editpage_is_disabled.html new file mode 100644 index 000000000..62c91a932 --- /dev/null +++ b/doc/bugs/inline_plugin_sets_editurl_even_when_editpage_is_disabled.html @@ -0,0 +1,16 @@ +see subject, simple patch below +<pre> +--- a/IkiWiki/Plugin/inline.pm ++++ b/IkiWiki/Plugin/inline.pm +@@ -371,7 +371,8 @@ sub preprocess_inline (@) { + } + if (length $config{cgiurl} && defined $type) { + $template->param(have_actions => 1); +- $template->param(editurl => cgiurl(do => "edit", page => $page)); ++ $template->param(editurl => cgiurl(do => "edit", page => $page)) ++ if IkiWiki->can("cgi_editpage"); + } + } +</pre> + +[[done]] --[[Joey]] diff --git a/doc/bugs/inline_raw_broken_on_unknown_pagetype.mdwn b/doc/bugs/inline_raw_broken_on_unknown_pagetype.mdwn new file mode 100644 index 000000000..19aa94e7e --- /dev/null +++ b/doc/bugs/inline_raw_broken_on_unknown_pagetype.mdwn @@ -0,0 +1,29 @@ +When trying to insert the raw content of an attached shell script +called `whatever` using: + + \[[!inline pages="whatever" raw="yes"]] + +The generated HTML contains: + + \[[!inline Erreur: Can't call method "param" on an undefined value + at /usr/local/share/perl/5.10.0/IkiWiki/Plugin/inline.pm + line 346.]] + +Looking at the inline plugin's code, it is clear that `$template` is +undef in such a situation. 
Defining `$template` just before line 346, +in case it's not defined, removes the error message, but nothing +gets inlined as `get_inline_content` returns the empty string in +this situation. + +If we explicitely don't want to allow raw inlining of unknown page +types, ikiwiki should output a better error message. + +> I have made it just do a direct include if the page type is not known, in +> raw mode. That seems useful if you want to include some other file right +> into a page. You could probably even wrap it in a format directive. +> +> It does allow including binary files right into a page, but nothing is +> stopping you pasting binary data right into the edit form either, so +> while annoying I don't think that will be a security problem. --[[Joey]] + +[[done]] diff --git a/doc/bugs/inline_skip_causes_empty_inline.mdwn b/doc/bugs/inline_skip_causes_empty_inline.mdwn new file mode 100644 index 000000000..e1cbc5470 --- /dev/null +++ b/doc/bugs/inline_skip_causes_empty_inline.mdwn @@ -0,0 +1,10 @@ +When using the [[directive/inline]] directive with the skip parameter i get +emtpy list inline (no output at all). The same inline used to work before +but not in 3.20091031. + +> I need more information to help. Skip is working as expected here. +> Can I download/clone your wiki? --[[Joey]] + +>> The bug occurs only together with archive="yes" as I Just found out: + +>>> Thanks, [[fixed|done]] in git. --[[Joey]] diff --git a/doc/bugs/login_page_non-obvious_with_openid.mdwn b/doc/bugs/login_page_non-obvious_with_openid.mdwn index 1d087985a..9aa702037 100644 --- a/doc/bugs/login_page_non-obvious_with_openid.mdwn +++ b/doc/bugs/login_page_non-obvious_with_openid.mdwn @@ -36,7 +36,7 @@ If you want to keep it as one form, then perhaps using some javascript to disabl > that allows modifying that form, but does not allow creating a separate > form. The best way to make it obvious how to use it currently is to just > disable password auth, then it's nice and simple. :-) Javascript is an -> interesting idea. It's also possible to write a custom [[signin.tmpl wikitemplates]] that +> interesting idea. It's also possible to write a custom [[templates]] that > is displayed instead of the regular signin form, and it should be > possible to use that to manually lay it out better than FormBuilder > manages with its automatic layout. --[[Joey]] @@ -44,4 +44,4 @@ If you want to keep it as one form, then perhaps using some javascript to disabl > I've improved the form, I think it's more obvious now that the openid > stuff is separate. Good enough to call this [[done]]. I think. --[[Joey]] ->> Looks good, thanks! :-) -- [[AdamShand]]
\ No newline at end of file +>> Looks good, thanks! :-) -- [[AdamShand]] diff --git a/doc/bugs/login_page_should_note_cookie_requirement.mdwn b/doc/bugs/login_page_should_note_cookie_requirement.mdwn index 96686053c..17ac12b34 100644 --- a/doc/bugs/login_page_should_note_cookie_requirement.mdwn +++ b/doc/bugs/login_page_should_note_cookie_requirement.mdwn @@ -31,3 +31,9 @@ Best of all would be to use URL-based or hidden-field-based session tokens if co >> don't look static. Are they really? --[MJR](http://mjr.towers.org.uk) >>> As soon as you post an edit page, you are back to a static website. + +>>> It is impossible to get to an edit page w/o a cookie, unless +>>> anonymous edits are allowed, in which case it will save. No data loss. +>>> Since noone is working on this, and the nonsense above has pissed me +>>> off to the point that I will certianly never work on it, I'm going to +>>> close it. --[[Joey]] [[done]] diff --git a/doc/bugs/logout_in_ikiwiki.mdwn b/doc/bugs/logout_in_ikiwiki.mdwn new file mode 100644 index 000000000..6696cc689 --- /dev/null +++ b/doc/bugs/logout_in_ikiwiki.mdwn @@ -0,0 +1,44 @@ +It looks like there is no way to logout of ikiwiki at present, meaning that if you edit the ikiwiki in, say, a cybercafe, the cookie remains... is there some other security mechanism in place that can check for authorization, or should I hack in a logout routine into ikiwiki.cgi? + +> Click on "Preferences". There is a logout button there. --liw + +> It would be nice if it were not buried there, but putting it on the +> action bar statically would be confusing. The best approach might be to +> use javascript. --[[Joey]] + + +>> I agree that javascript seems to be a solution, but my brain falls +>> off the end of the world while looking at ways to manipulate the DOM. +>> (I'd argue also in favor of the openid_provider cookie expiring +>> in less time than it does now, and being session based) + +>>> (The `openid_provider` cookie is purely a convenience cookie to +>>> auto-select the user's openid provider the next time they log +>>> in. As such, it cannot be a session cookie. It does not provide any +>>> personally-identifying information so it should not really matter +>>> when it expires.) --[[Joey]] + +>> It would be nice to move navigational elements to the upper right corner +>> of the page... + +>> I have two kinds of pages (wiki and blog), and three classes of users + +>> anonymous users - display things like login, help, and recentchanges, + +>> non-admin users - on a per subdir basis (blog and !blog) display +>> logout, help, recentchanges, edit, comment + +>> admin users - logout, help, recentchanges, edit, comment, etc + + +I was referred to this page from posting to the forum. I am also interested in being able to use user class and status to modify the page. I will try to put together a plugin. From what I can see there needs to be a few items in it. + +* It should expose a link to a dedicated login page that, once logged in, returns the user to the calling page, or at least the home page. I have started a plugin to do this: [[/plugins/contrib/justlogin]] + +* it needs to expose a link to a little json explaining the type of user and login status. + +* it should expose a link that logs the person out and returns to the calling page, or at least the home page. + +Then there would need to be a little javascript to use these links appropriately. I have little javascript experience but I know that can be done. 
I am less sure if it is possible to add this functionality to a plugin so I'll start with that. If no one objects I will continue to post here if I make progress. If anyone has any suggestions on how to modify my approach to code it in an easier way I'd appreciate the input. [[justint]] + + diff --git a/doc/bugs/map_sorts_by_pagename_and_not_title_when_show__61__title_is_used.mdwn b/doc/bugs/map_sorts_by_pagename_and_not_title_when_show__61__title_is_used.mdwn new file mode 100644 index 000000000..d12414d55 --- /dev/null +++ b/doc/bugs/map_sorts_by_pagename_and_not_title_when_show__61__title_is_used.mdwn @@ -0,0 +1,20 @@ +The [[ikiwiki/directive/map]] directive sort by pagename. That looks kind of odd, when used together with show=title. I would expect it to sort by title then. + +> This would be quite hard to fix. Map sorts the pages it displays by page +> name, which has the happy effect of making "foo/bar" come after "foo"; +> which it *has* to do, so that it can be displayed as a child of the page +> it's located in. If sorting by title, that wouldn't hold. So, map +> would have to be effectively totally rewritten, to build up each group +> of child pages, and then re-sort those. --[[Joey]] + +>> Ok, you are right, that does would break the tree. This made me think that I do not +>> need to generate a tree for my particular use case just a list, so i thought i could use [[ikiwiki/directive/inline]] instead. +>> This created two new issues: +>> +>> 1. inline also does sort by pagename even when explicitly told to sort by title. +>> +>> 2. I cannot get inline to create a list when the htmltidy plugin is switched on. I have a template which is enclosed in an li tag, and i put the ul tag around the inline manually, but htmltidy breaks this. --martin + +>>>> You might want to check if the [[plugins/contrib/report]] plugin solves your problem. It can sort by title, among other things. --[[KathrynAndersen]] + +>> See also: [[todo/sort_parameter_for_map_plugin_and_directive]] --[[smcv]] diff --git a/doc/bugs/maps_with_nested_directories_sometimes_make_ugly_lists.mdwn b/doc/bugs/maps_with_nested_directories_sometimes_make_ugly_lists.mdwn new file mode 100644 index 000000000..a6546faad --- /dev/null +++ b/doc/bugs/maps_with_nested_directories_sometimes_make_ugly_lists.mdwn @@ -0,0 +1,62 @@ +I'm using the [[map_directive|ikiwiki/directive/map]] to build dynamic navigation menus, and it's working really nicely! + +However on some pages, each nested item each get wrapped in a full set of `<ul>` tags. This doesn't actually hurt anything, but it's does it inconsistently which seems like a bug. I don't like it because it puts extra vertical spacing into my menu bar. + +Here's what I expect it to look like: + + <div class="map"> + <ul> + <li><span class="selflink">Archives</span> + <ul> + <li><a href="./2010/" class="mapitem">2010</a></li> + <li><a href="./2011/" class="mapitem">2011</a></li> + </ul> + </li> + </ul> + </div> + +And here's what it's actually doing: + + <div class="map"> + <ul> + <li><span class="selflink">Archives</span> + <ul> + <li><a href="./2010/" class="mapitem">2010</a></li> + </ul> + <ul> + <li><a href="./2011/" class="mapitem">2011</a></li> + </ul> + </li> + </ul> + </div> + +I've tried to replicate the problem on this site and cannot, I'm not sure if that's because of exactly how I'm using map or if there's something different with my site. I just upgraded ikiwiki to the latest Debian unstable as well as most of the required Perl modules and nothing changed. 
+ +If you look at [this page on my site](http://adam.shand.net/ikidev/archives/) (getsource is enabled) you can see it working as expected in the main page and not working in the side bar. + +But it also doesn't work on the sitemap page: <http://adam.shand.net/ikidev/site/map/> + +This might be really simple, but I've been staring at it too long and it only looks like a bug to me. :-( Any suggestions would be gratefully accepted. -- [[AdamShand]] + +> Okay, I think I've figured this out, it looks like ikiwiki behaves differently depending on the level of heirarchy. I'll post the details once I'm sure. -- [[AdamShand]] + +>> I managed to replicate the issue on my ikiwiki, and I believe it is a +>> bug. The current upstream logic for going up/down by a level opens +>> (and closes) the unnecessary lists that you are seeing. Although the +>> resulting markup is semantically correct, it has superflous stuff +>> that introduces whitespace issues at the very least. + +>> I have a [[patch]] up [on my git repo](http://git.oblomov.eu/ikiwiki/patch/55fa11e8a5fb351f9371533c758d8bd3eb9de245) +>> that ought to fix the issue. + +>>> Wonderful, thank you very much for the help! I've installed the patch and it's working great, but it looks like there a minor bug. Sometimes it doesn't print the top/first map item. Cheers, -- [[AdamShand]] +>>> +>>> <http://adam.shand.net/tmp/map-orig.jpg> +>>> <http://adam.shand.net/tmp/map-patched.jpg> + +>>>> Thanks for testing. I managed to reproduce it and I adjusted the logic. +>>>> An updated [[patch]] can be found [here](http://git.oblomov.eu/ikiwiki/patch/dcfb18b7989a9912ed9489f5ff15f871b6d8c24a) + +>>>>> Seems to work perfectly to me, thanks! -- [[AdamShand]] + +[[applied|done]] --[[Joey]] diff --git a/doc/bugs/minor:_tiny_rendering_error.mdwn b/doc/bugs/minor:_tiny_rendering_error.mdwn new file mode 100644 index 000000000..b2e07eef9 --- /dev/null +++ b/doc/bugs/minor:_tiny_rendering_error.mdwn @@ -0,0 +1,5 @@ +`\[[!inline]]` is rendered with a space in front of the first closing bracket. --[[tschwinge]] + +> I don't think that complicating the directive parser +> is warrented by the minorness of this bug. The result that it outputs is +> still valid. --[[Joey]] diff --git a/doc/bugs/misctemplate_does_not_respect_the_current_page___40__if_any__41__.mdwn b/doc/bugs/misctemplate_does_not_respect_the_current_page___40__if_any__41__.mdwn new file mode 100644 index 000000000..3b0347f5f --- /dev/null +++ b/doc/bugs/misctemplate_does_not_respect_the_current_page___40__if_any__41__.mdwn @@ -0,0 +1,101 @@ +I really like the new approach to having only one main template "page.tmpl". For instance, it improves previews during edits. +But it causes some nasty bugs for plugins that use the pagetemplate hook. It is especially visible with the [[plugins/sidebar]] plugin. + +## Some examples + +### A first example + +* activate sidebars globally and cgi +* create "/sidebar.mdwn" with "[<span></span>[foo]]" inside +* create "/foo.mdwn" with "hello!" inside +* create "/bar.mdwn" +* run ikiwiki +* with the web browser, go to the page "bar" +* notice the sidebar, click on "foo" +* notice the page "foo" is now displayed (hello!) +* return the the page "bar" and click "edit" +* notice the sidebar is still here, click on the "foo" +* -> Problem: 404, the browser goes to "/bar/foo" +* -> Was expected: the browser goes to "/foo" + +> You must have a locally modified `page.tmpl` that omits the "TMPL_IF DYNAMIC" +> that adds a `<base>` tag. 
That is needed to make all links displayed by
+> cgis work reliably. Not just in this page editing case.
+> The [[version_3.20100515]] announcement mentions that you need to
+> update old `page.tmpl` files to include that on upgrade. --[[Joey]]
+
+>> I followed the announcement. I also disabled my custom page.tmpl to confirm the bug. I even produced a step-by-step example to reproduce the bug.
+>> In fact, the base tag works for the page links (the content part) but not for the sidebar links (the sidebar part), since the sidebar links are generated in the context of the root page.
+>> In the example above:
+>>
+>> * base="http://www.example.com/bar" relative_link_in_bar="something" -> absolute_link_in_bar = "http://www.example.com/bar/something" (that is fine)
+>> * base="http://www.example.com/bar" relative_link_in_sidebar="foo" (because generated in the context of the root page) -> absolute_link_in_sidebar = "http://www.example.com/bar/foo" (that is not fine)
+>>
+>> The committed fix works for previewing, but not in other cases: links are still broken. Please just follow the example step-by-step to reproduce it (I just retried it with a "fixed" version: Debian 3.20100610). If you cannot reproduce, please say so explicitly instead of guessing about my inability to read changelogs. -- [[JeanPrivat]]
+
+>>> Sorry if my not seeing the bug offended you. [[Fixed|done]] --[[Joey]]
+
+>>>> Thanks! --[[JeanPrivat]] (I'm not offended)
+
+### A second example
+
+* create "/bar/sidebar.mdwn" with "world"
+* run ikiwiki
+* with the web browser, go to the page "bar"
+* notice the sidebar displays "world"
+* click "edit"
+* -> Problem: the sidebar now shows the foo link (it is the root sidebar!)
+* -> Was expected: the sidebar displays "world"
+
+> I think it's a misconception to think that the page editing page is the same
+> as the page it's editing. If you were deleting that page, would you expect
+> the "are you sure" confirmation page to display the page's sidebar?
+> --[[Joey]]
+
+>> It is a very good point and could be argued:
+>>
+>> * for a dynamic page, is the root context more legitimate than the current page context?
+>> * when clicking the Edit link, does the user expect to remain in the "same page"?
+>>
+>> But as long as something sensible is displayed and the links work, I'm OK with any choice. -- [[JeanPrivat]]
+
+### A last example
+
+* with the web browser edit the page "bar"
+* type <code>[<span></span>[!sidebar content="goodby"]]</code>
+* click preview
+* -> Problem: the sidebar still displays the foo link
+* -> Was expected: the sidebar displays "goodby"
+
+> In the specific case of previewing, it is indeed a bug that the
+> right sidebar is not displayed. And replacing the regular sidebar
+> with the one from the previewed page is probably the best we can do..
+> displaying 2 sidebars would be confusing, and the `page.tmpl` can
+> put the sidebar anywhere so we can't just display the preview sidebar
+> next to the rest of the page preview. --[[Joey]]
+
+>> The behavior is fine for me. However, some nitpicking (feel free to ignore):
+>>
+>> * If the sidebar is replaced (making the previewing in-place), for consistency, should not the previewed content also be shown in-place? i.e. above the form part
+>> * there is no way to come back (without saving or canceling) to the root context (e.g. displaying the root sidebar), i.e. some sort of unpreviewing.
+>> +>> -- [[JeanPrivat]] + +## Some superficial hacking + +With the following workaround hacks, I manage to solve the 3 examples shown above: + +1- edit IkiWiki/Plugin/editpage.pm and call showform with additional page and destpage parameters: +<pre>showform($form, \@buttons, $session, $q, forcebaseurl => $baseurl, page => $page, destpage => $page);</pre> + +2- edit /usr/share/perl5/IkiWiki.pm and modify the misctemplate function to use the given page and destpage: +<pre>my %params=@_; +shift->(page => $params{page}, destpage => $params{destpage}, template => $template);</pre> + +I do not guarantee (I do not even expect) that it is the proper way to solve +this bug but it may help developers to find and solve the real problem. + +> Oh, it's pretty reasonable. I don't think it breaks anything. :) +> I modified it a bit, and explicitly made it *not* "fix" the second example. +> --[[Joey]] +>> I removed the done tag (I suspect it is the way to reopen bugs) -- [[JeanPrivat]] diff --git a/doc/bugs/monotone_backend_does_not_support_srcdir_in_subdir.mdwn b/doc/bugs/monotone_backend_does_not_support_srcdir_in_subdir.mdwn new file mode 100644 index 000000000..35f624f78 --- /dev/null +++ b/doc/bugs/monotone_backend_does_not_support_srcdir_in_subdir.mdwn @@ -0,0 +1,5 @@ +The [[rcs/monotone]] backend does not currently support putting the ikiwiki srcdir +in a subdirectory of the repository. It must be at the top. Git has +special code to handle this case. --[[Joey]] + +[[done]] diff --git a/doc/bugs/more_and_RSS_generation.mdwn b/doc/bugs/more_and_RSS_generation.mdwn new file mode 100644 index 000000000..00ab43fa2 --- /dev/null +++ b/doc/bugs/more_and_RSS_generation.mdwn @@ -0,0 +1,20 @@ +I'd like the more plugin and RSS to play better together. In the case of the html generation of the main page of a blog, I'd like to get the first paragraph out, but keep RSS as a full feed. + +Maybe there is a different plugin (I also tried toggle)? + +> I am not a fan of the more directive (thus the rant about it sucking +> embedded in its [[example|ikiwiki/directive/more]]). But I don't think +> that weakening it to not work in rss feeds is a good idea, if someone +> wants to force users to go somewhere to view their full content, +> they should be able to do it, even though it does suck. +> +> The toggle directive will degrade fairly well in an rss feed to +> display the full text. (There is an annoying toggle link that does +> nothing when embedded in an rss feed). --[[Joey]] + +I also note, that at least currently, more seems to break on a few pages, not being parsed at all when aggregated into the front page. + +> It's just a simple directive, it should work anywhere any directive will, +> and does as far as I can see. Details? --[[Joey]] + +see also: [[/bugs/rss_feeds_do_not_use_recommended_encoding_of_entities_for_some_fields/]] diff --git a/doc/bugs/nested_raw_included_inlines.mdwn b/doc/bugs/nested_raw_included_inlines.mdwn index 33433e235..92ea4c4ef 100644 --- a/doc/bugs/nested_raw_included_inlines.mdwn +++ b/doc/bugs/nested_raw_included_inlines.mdwn @@ -32,3 +32,20 @@ Am I missing something? Is this a bug or Ikiwiki not supposed to support this us > currently merges pagespecs, though - maybe the patches I suggested for > [[separating_and_uniquifying_pagespecs|todo/should_optimise_pagespecs]] > would help? --[[smcv]] + +>> That, or something seems to have helped in the meantime... 
+>> Actually, I think it was the [[transitive_dependencies]] support +>> that did it, though smcv's pagespec stuff was also a crucial improvement. +>> +>> Anyhoo: + + joey@gnu:~/tmp>touch testcase/page2.mdwn + joey@gnu:~/tmp>ikiwiki -v testcase html + refreshing wiki.. + scanning page2.mdwn + building page2.mdwn + building page1.mdwn, which depends on page2 + building page0.mdwn, which depends on page1 + done + +>> I happily think this is [[done]] --[[Joey]] diff --git a/doc/bugs/no_search_button__44___hence_can__39__t_do_search_in_w3m_at_ikiwiki.info.mdwn b/doc/bugs/no_search_button__44___hence_can__39__t_do_search_in_w3m_at_ikiwiki.info.mdwn new file mode 100644 index 000000000..2d600fdbb --- /dev/null +++ b/doc/bugs/no_search_button__44___hence_can__39__t_do_search_in_w3m_at_ikiwiki.info.mdwn @@ -0,0 +1,32 @@ +If I browse <http://ikiwiki.info> in [emacs-w3m](http://www.emacswiki.org/emacs/emacs-w3m) (without Javascript), I +can't do a [[search|plugins/search]]: the text field is there (so I can +enter my search request), but there seems to be no way to make +actually a search request (i.e., no button). + +(A remark on how it works now in the other browsers: +In the more "complete" +browsers (Chromium etc.), the request is done by pressing Enter in the +text field.) +--Ivan Z. + +I see, no Javascript is probably involved in using the search form; +the code is simply: + + <form method="get" action="/ikiwiki.cgi" id="searchform"> + <div> + <input type="text" id="searchbox" name="P" value="" size="16" + /> + </div> + </form> + +So, if the semantics suggested by HTML is such that such a form is to +be submitted by some default form submitting action in the UI and it +doesn't really require a button to be functional, then I'd say it's +not an ikiwiki's problem, but a missing feature in the UI of emacs-w3m +or the underlying w3m... Perhaps I'll report this issue to them. --Ivan Z. + +[[!tag done]] +There is no problem at all! +I'm sorry for this hassle! +In emacs-w3m, there is the <code>w3m-submit-form</code> command +(<kbd>C-c C-c</kbd>) to submit the form at point; it works.--Ivan Z. diff --git a/doc/bugs/pagemtime_in_refresh_mode.mdwn b/doc/bugs/pagemtime_in_refresh_mode.mdwn new file mode 100644 index 000000000..f926ec86c --- /dev/null +++ b/doc/bugs/pagemtime_in_refresh_mode.mdwn @@ -0,0 +1,28 @@ +I'd like a way to always ask the RCS (Git) to update a file's mtime in +refresh mode. This is currently only done on the first build, and later +for `--gettime --rebuild`. But always rebuilding is too heavy-weight for +this use-case. My options are to either manually set the mtime before +refreshing, or to have ikiwiki do it at command. I used to do the +former, but would now like the latter, as ikiwiki now generally does this +timestamp handling. + +From a quick look, the code in `IkiWiki/Render.pm:find_new_files` is +relevant: `if (! $pagemtime{$page}) { [...]`. + +How would you like to tackle this? + +--[[tschwinge]] + +> This could be done via a `needsbuild` hook. The hook is passed +> the list of changed files, and it should be safe to call `rcs_getmtime` +> and update the `pagemtime` for each. +> +> That lets the feature be done by a plugin, which seems good, since +> `rcs_getmtime` varies between very slow and not very fast, depending on +> VCS. +> +> AFAICS, the only use case for doing this is if you commit changes and +> then delay pushing them to a DVCS repo. Since then the file mtime will +> be when the change was pushed, not when it was committed. 
But I've +> generally felt that recording when a change was published to the repo +> of a wiki as its mtime is good enough. --[[Joey]] diff --git a/doc/bugs/pages_missing_top-level_directory.mdwn b/doc/bugs/pages_missing_top-level_directory.mdwn new file mode 100644 index 000000000..77c31cd27 --- /dev/null +++ b/doc/bugs/pages_missing_top-level_directory.mdwn @@ -0,0 +1,78 @@ +Hi, + +I've rebuilt two sites now, and anything that requires a working directory structure isn't working properly. I have no idea how it's doing this. I don't see anything in my templates, and I haven't messed around with the back-end code much. + +An example would show this best I think. + +<pre> +/ <- root of site +/About/ <- sub-directory + /Policy/ <- sub-sub- +</pre> + +When you're on /About/, any generated links get mapped to /Policy/ and NOT /About/Policy/ - of course this results in a 404 error. + +I used to be able to use relative links or absolute ones to get the links I want, and now I can't do either. The generated link results in a 404 due to the stripping of a directory. + +I don't know if it's related to the fact that I have one ikiwiki install under another (/blog/ under / is also ikiwiki), but both are FUBAR. + +> what do you mean by generated links: do you mean the output of +> [[ikiwiki/wikilink]]s? Or are you generating links some other way? +> When you say "on /About/, any generated links get mapped to +> /Policy/ and NOT /About/Policy" can you provide an example of what +> source generates the link? -- [[Jon]] + +>> No, a \[[map]] call, such as: +>> +>> (actual code)<br /> +>> = = = = =<br /> +>> \[[!map pages="About/*" show="title"]]<br /> +>> = = = = =<br /> +>> +>> The end result is:<br /> +>> (actual code) +>> +<pre> +<div class="map"> +<ul> +<li><a class="mapitem" href="./Policy/">Policy</a> +<ul> +<li><a class="mapitem" href="./Policy/Microblog/">Microblogging subscription policy</a> +</li> +</ul> +</li> +</ul> +</div> +</pre> + +> I'm also confused about what is generating the links. The map directive? +> You? --[[Joey]] + +>> see above :) + +>> I suspect this is due to git scanning everything under the pwd of the .git/ directory, but not totally so. + +>>> Ikiwiki never, ever, looks in directories with names starting with a +>>> dot. --[[Joey]] + +>> Other ikiwiki sites I have don't do this, and work OK, on the same server, but different docroots. + +>>> Well, I've moved my blog to under my site's docroot - in terms of git +>>> and ikiwiki - and it's still cutting out a whole directory level. I +>>> have no idea what's going on. I need to check the code. The site is at +>>> http://simonraven.kisikew.org/ - if you follow the "About" link, you'll +>>> understand exactly what's going on, if you look at the URL in your +>>> status bar (or under your cursor if you're using a text browser). + +>>>> Your page contains the following in its html: +>>>> `<base href="../" />` +>>>> +>>>> Given a link like "./Policy/", which is *correct*, and when on the +>>>> About page will normally link to the About/Policy page, this causes +>>>> the link to really link to ".././Policy/" which is of course broken. +>>>> +>>>> Ikiwiki's standard page templates do not contain this base tag, so +>>>> I guess your customised templates are broken. --[[Joey]] [[done]] + +>>>>> I totally forgot about that tag... good catch. I was thinking it was my template that was broken, since yesterday, but I couldn't see what. Thank you very much for your eyes. 
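For anyone else chasing the same symptom, the resolution behaviour described above is easy to see in a reduced fragment (illustrative markup only, with an example.org address standing in for the real site):

    <!-- page served from http://example.org/About/ -->
    <base href="../" />
    <a href="./Policy/">Policy</a>
    <!-- resolves against the base to http://example.org/Policy/
         rather than the intended http://example.org/About/Policy/ -->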
+ diff --git a/doc/bugs/pagespec:_tagged__40____41____44___globbing.mdwn b/doc/bugs/pagespec:_tagged__40____41____44___globbing.mdwn new file mode 100644 index 000000000..f9cb37487 --- /dev/null +++ b/doc/bugs/pagespec:_tagged__40____41____44___globbing.mdwn @@ -0,0 +1,36 @@ +With the current HEAD (b10d353490197b576ef7bf2e8bf8016039efbd2d), +globbing in `tagged()` pagespecs doesn't work for me. For example, +`tagged(*)` doesn't match any pages. (It does in this wiki installation +here, though.) + +I did not yet do any testing to figure out when this broke. + +--[[tschwinge]] + +[[!map pages="*/a* and tagged(*ose)"]] + +> Are you sure that `tagged()` ever matches pages there? Take globbing +> out of the equasion. +> +> This could be as simple as you having not rebuilt the wiki +> on upgrade to the version that tracks tagged links. --[[Joey]] + +>> Yes, it is a globbing issue: + +>> \[[!map pages="tagged(open_i*ue_gdb)" show=title]] + +>> ... doesn't show anything. + +>> \[[!map pages="tagged(open_issue_gdb)" show=title]] + +>> ... does show a map of eight pages. Also, it's working fine on the +>> autotags pages. + +>> --[[tschwinge]] + +>>> Only way I can reproduce something like this is if tagbase is not set. +>>> I have fixed a bug there, see if it works for you? +>>> --[[Joey]] + +>>>> This is now indeed [[fixed|done]] (thanks!) -- even though I already +>>>> did have tagbase set. diff --git a/doc/bugs/pagespec_error_on_refresh_but_not_rebuild.mdwn b/doc/bugs/pagespec_error_on_refresh_but_not_rebuild.mdwn new file mode 100644 index 000000000..e89545955 --- /dev/null +++ b/doc/bugs/pagespec_error_on_refresh_but_not_rebuild.mdwn @@ -0,0 +1,32 @@ +I'm getting this error message when I refresh my wiki: + + $ hg commit -u me -m "Minor corrections" + refreshing wiki.. + scanning htmletc/moco-conf-rooms.mdwn + building htmletc/moco-conf-rooms.mdwn + Use of uninitialized value in concatenation (.) or string at /usr/local/lib/perl5/site_perl/5.8.9/Text/Typography.pm line 542. + building sidebar.mdwn, which depends on htmletc/moco-conf-rooms + building contact.mdwn, which depends on sidebar + building 500.mdwn, which depends on sidebar + Use of uninitialized value in concatenation (.) or string at /usr/local/lib/perl5/site_perl/5.8.9/Text/Typography.pm line 542. + building ceramics.mdwn, which depends on sidebar + building glossary.mdwn, which depends on sidebar + syntax error in pagespec "internal(glossary/comment_*)" + warning: post-commit hook exited with status 2 + +But there is no error if I use `ikiwiki --rebuild` to regenerate the whole thing. + +> You neglect to say what version of ikiwiki this is, +> or give any information to reproduce the bug. +> +> My guess: A version older than 3.20100403, which included +> [this bugfix](http://git.ikiwiki.info/?p=ikiwiki;a=commitdiff;h=799b93d258bad917262ac160df74136f05d4a451), +> which could lead to incorrect "syntax error in pagespec" +> that only happened some of the time. +> +> (The Text::Typography warning seems probably unrelated.) +> --[[Joey]] + +>> I'm sorry, I don't know what I was thinking there. It's ikiwiki 3.20100212, and manually applying the patch you linked to made the bug go away. (Upgrading ikiwiki is a pain on nearlyfreespeech, especially if you don't want to keep the build directory around -- please consider making ikiwiki runnable directly from a git clone.) 
+ +[[!meta link="done"]] diff --git a/doc/bugs/pagetitle_function_does_not_respect_meta_titles.mdwn b/doc/bugs/pagetitle_function_does_not_respect_meta_titles.mdwn index be14e5126..c6e3cd4fd 100644 --- a/doc/bugs/pagetitle_function_does_not_respect_meta_titles.mdwn +++ b/doc/bugs/pagetitle_function_does_not_respect_meta_titles.mdwn @@ -144,7 +144,7 @@ So, looking at your meta branch: --[[Joey]] has no title, then A will display the link as "B". Now page B is modified and a title is added. Nothing updates "A". The added overhead of rebuilding every page that links to B when B is - changed (as the `postscan` hook of the po plugin does) is IMHO a killer. + changed (as the `indexhtml` hook of the po plugin does) is IMHO a killer. That could be hundreds or thousands of pages, making interactive editing way slow. This is probably the main reason I had not attempted this whole thing myself. IMHO this calls for some kind of intellegent dependency diff --git a/doc/bugs/plugins__47__relativedate_depends_on_locale_at_setup_file.mdwn b/doc/bugs/plugins__47__relativedate_depends_on_locale_at_setup_file.mdwn new file mode 100644 index 000000000..a9a39ac47 --- /dev/null +++ b/doc/bugs/plugins__47__relativedate_depends_on_locale_at_setup_file.mdwn @@ -0,0 +1,16 @@ +[[plugins/relativedate]] does not works when russian locale defined at setup file (locale => 'ru_RU.UTF-8'). This is happen because javascript for this plugin takes either elements title or content itself. If russian locale is turned on then title generated on russian language and JS can't convert it into Date object. innerHTML is language independent (YYYY-MM-DD HH:mm) always. + +If I switch locale to en_US.UTF-8 then this plugin correctly parses text date and print relative date. But when I mouseover on date I see unusual formating of the date (it uses AM/PM format while russians use 24-h notation). + +P.S. All pages but RecentChanges show well-formated date. RecentChanges show date formated using locale. Anyway, plugin does not work without en_US locale. + +> [[Fixed|done]]. Now it uses C locale for the date put in the title, +> that is used by relativedate. The mouseover will display the date in your +> native locale. +> +> Only exception is that when javascript is disabled... then +> relativedate can't work, so instead you will see your localized date +> displayed; but on mouseover you will get shown the C locale date. +> --[[Joey]] + +>> Thanks. diff --git a/doc/bugs/po:__apache_config_serves_index_directory_for_index.mdwn b/doc/bugs/po:__apache_config_serves_index_directory_for_index.mdwn new file mode 100644 index 000000000..fd7cd518c --- /dev/null +++ b/doc/bugs/po:__apache_config_serves_index_directory_for_index.mdwn @@ -0,0 +1,85 @@ +Similarly to [[po:_apache_config_serves_index.rss_for_index]], +the [[plugins/po]] apache config has another bug. + +The use of "DirectoryIndex index", when combined with multiviews, is intended +to serve up a localized version of the index.??.html file. + +But, if the site's toplevel index page has a discussion page, that +is "/index/discussion/index.html". Or, if the img plugin is used to scale +an image on the index page, that will be "/index/foo.jpg". In either case, +the "index" directory exists, and so apache happily displays that +directory, rather than the site's index page! + +--[[Joey]] + +> Ack, we do have a problem. 
Seems like ikiwiki's use of `index/` as +> the directory for homepage's sub-pages and attachments makes it +> conflict deeply with Apache's `MultiViews`: as the [MultiViews +> documentation](http://httpd.apache.org/docs/2.2/mod/mod_negotiation.html#multiviews) +> says, `index.*` are considered as possible matches only if the +> `index/` directory *does not exist*. Neither type maps nor +> `mod_mime` config parameters seem to allow overriding this behavior. +> Worse even, I guess any page called `index` would have the same +> issues, not only the wiki homepage. + +> I can think of two workarounds, both kinda stink: +> +> 1. Have the homepage's `targetpage` be something else than +> `index.html`. +> 2. Have the directory for the homepage's sub-pages and attachments +> be something else than `index`. +> +> I doubt either of those can be implemented without ugly special +> casing. Any other idea? --[[intrigeri]] + +>> As I understand it, this is how you'd do it with type maps: +>> +>> * turn off MultiViews +>> * `AddHandler type-map .var` +>> * `DirectoryIndex index.var` +>> * make `index.var` a typemap (text file) pointing to `index.en.html`, +>> `index.fr.html`, etc. +>> +>> I'm not sure how well that fits into IkiWiki's structure, though; +>> perhaps the master language could be responsible for generating the +>> type-map on behalf of all slave languages, or something? +>> +>> Another possibility would be to use filenames like `index.html.en` +>> and `index.html.fr`, and set `DirectoryIndex index.html`? This could +>> get problematic for languages whose ISO codes conventionally mean +>> something else as extensions (Polish, `.pl`, is the usual example, +>> since many sites interpret `.pl` as "this is a (Perl) CGI"). +>> --[[smcv]] + +>>> There is something to be said about "index/foo" being really ugly +>>> and perhaps it would be nice to use something else. There does not +>>> appear to even be one function that could be changed; "$page/foo" is +>>> hardwired into ikiwiki in many places as a place to dump subsidiary +>>> content -- and it's not even consistent, since there is also eg, +>>> "$page.rss". I agree, approaching it from this direction would be a +>>> mess or a lot of work. +>>> +>>> Type maps seem like a valid option, but also a lot of clutter. +>>> +>>> `index.html.pl` does seem to be asking for trouble, even if apache +>>> can be configured to DTRT. It would make serving actual up perl scripts +>>> hard, at least. But that is some good out of the box thinking.. +>>> perhaps "index.foo.pl.html"? +>>> +>>> However, that would mean that +>>> web servers need to be configured differently to serve translated +>>> and non-translated sites. The current apache configuration for po +>>> can be used with non-po sites and they still work. --[[Joey]] + +>>>> I am vulnerable to the same problem because I use MultiViews, though I don't use the `po` module; +>>>> I have to serve both Australian English and American English for my company's website +>>>> (for SEO purposes; certain words that relate to our products are spelt differently in US and Australian English, and we need to be able to be googled with both spellings). +>>>> I'm just fortunate that nobody has thought to add attachments to the front page yet. +>>>> I raise this to point out that this is going to be a recurring problem that won't necessarily be fixed by changing the `po` module in isolation. +>>>> +>>>> One could argue that "index" is already a special case, since it is the top page of the site. 
+>>>> Things like parentlinks already use a special case for the top page (checking the variable HAS_PARENTLINKS). +>>>> Likewise, when --usedirs is true, index is treated as a special case, since it generates "index.html" and not "index/index.html". +>>>> +>>>> Unfortunately, I'm not sure what the best approach to solving this would be. +>>>> --[[KathrynAndersen]] diff --git a/doc/bugs/po:_apache_config_serves_index.rss_for_index.mdwn b/doc/bugs/po:_apache_config_serves_index.rss_for_index.mdwn new file mode 100644 index 000000000..a2b68c4b1 --- /dev/null +++ b/doc/bugs/po:_apache_config_serves_index.rss_for_index.mdwn @@ -0,0 +1,36 @@ +The apache config documented in [[plugins/po]] has a subtle bug. It works +until a site gets an index.atom or index.rss file. (Actually, with po +enabled, they're called index.en.atom or index.en.rss etc, but the result +is the same). + +Then, when wget, curl, or w3m is pointed at http://site/, apache serves +up the rss/atom file rather than the index page. + +Analysis: + +* /etc/mime.types gives mime types to .rss and .atom files +* `mod_negotiation`'s MultiViews allows any file with a mime type to be + served up via content negotiation, if the client requests that type. +* wget etc send `Accept: */*` to accept all content types. Compare + with firefox, which sends `Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*` +* So apache has a tie between an html-encoded English file and an rss-encoded + English file, and the client has no preference. In a tie, apache will serve up the + *smallest* file, which tends to be the rss file. (Apache's docs say it uses that + strange criterion to break ties; see <http://httpd.apache.org/docs/2.0/mod/mod_mime.html#multiviewsmatch>) + +The only way I have found to work around this problem is to remove +atom and rss from /etc/mime.types. Of course, that has other undesirable +results. + +I wonder if it would be worth making the po plugin generate apache +[type map files](http://httpd.apache.org/docs/2.0/mod/mod_negotiation.html#typemaps). +That should avoid this problem. +--[[Joey]] + +Update: A non-intrusive fix is to add this to the apache configuration. +This tunes the "quality" of the rss and atom files, in an apparently currently +undocumented way (though someone on #httpd suggested it should get documented). +Result is that apache will prefer serving index.html. --[[Joey]] [[done]] + + AddType application/rss+xml;qs=0.8 .rss + AddType application/atom+xml;qs=0.8 .atom diff --git a/doc/bugs/po:_double_commits_of_po_files.mdwn b/doc/bugs/po:_double_commits_of_po_files.mdwn new file mode 100644 index 000000000..2f3015e2b --- /dev/null +++ b/doc/bugs/po:_double_commits_of_po_files.mdwn @@ -0,0 +1,22 @@ +When adding a new English page, the po files are created, committed, +and then committed again. The second commit makes this change: + + -"Content-Type: text/plain; charset=utf-8\n" + -"Content-Transfer-Encoding: ENCODING" + +"Content-Type: text/plain; charset=UTF-8\n" + +"Content-Transfer-Encoding: ENCODING\n" + +Same thing happens when a change to an existing page triggers a po file +update. --[[Joey]] + +> * The s/utf-8/UTF-8 part has been fixed. +> * The ENCODING\n part is due to an inconsistency in po4a, which +> I've just sent a patch for. --[[intrigeri]] + +>> I resubmitted the patch to po4a upstream, sending it this time to +>> their mailing-list: +>> [post archive](http://lists.alioth.debian.org/pipermail/po4a-devel/2010-July/001897.html).
+>> --[[intrigeri]] + +>>> Seems to me Debian Squeeze's po4a does not expose this bug anymore +>>> => [[done]]. --[[intrigeri]] diff --git a/doc/bugs/po:_markdown_link_parse_bug.mdwn b/doc/bugs/po:_markdown_link_parse_bug.mdwn new file mode 100644 index 000000000..1aa4eb803 --- /dev/null +++ b/doc/bugs/po:_markdown_link_parse_bug.mdwn @@ -0,0 +1,21 @@ +Apparently this is legal markdown, though unusual syntax for a link: + + [Branchable](http://www.branchable.com/ "Ikiwiki hosting") + +If that is put on a translatable page, the translations display it not as a +link, but as plain text. + +Probably a po4a bug, but I don't see the bug clearly in the gernerated po +file: + + "This was posted automatically by [Branchable](http://www.branchable.com/ " + "\"Ikiwiki hosting\") when I signed up." + +--[[Joey]] + +> I cannot reproduce this on my Squeeze system with ikiwiki Git code; +> both the page in the master language and translation pages perfectly +> display the link (and tooltip) in my testing environment. Were you +> using an oldest po4a, such as Lenny's one? --[[intrigeri]] + +>> Quite likely. Not seeing the problem now, [[done]] --[[Joey]] diff --git a/doc/bugs/po:_might_not_add_translated_versions_of_all_underlays.mdwn b/doc/bugs/po:_might_not_add_translated_versions_of_all_underlays.mdwn new file mode 100644 index 000000000..82aed400d --- /dev/null +++ b/doc/bugs/po:_might_not_add_translated_versions_of_all_underlays.mdwn @@ -0,0 +1,16 @@ +[[plugins/po]]'s `checkconfig` looks in the `underlaydirs`, but plugins that +add underlays typically do so in their own `checkconfig`. + +As far as I can see, this will result in it not adding translated versions +of underlays added by a plugin that comes after it in `$config{add_plugins}`; +for instance, if you have `add_plugins => qw(po smiley)`, you'll probably +not get the translated versions of `smileys.mdwn`. (I haven't tested this.) + +> It doesn't happen because smiley adds the underlay unconditionally on +> import. Which is really more usual. + +To see them all, `po` should use `last => 1` when registering the hook. +--[[smcv]] + +> At least all that don't last their hooks too! But, added, since +> it will make the problem much less likely to occur. --[[Joey]] [[done]] diff --git a/doc/bugs/po:_new_pages_not_translatable.mdwn b/doc/bugs/po:_new_pages_not_translatable.mdwn new file mode 100644 index 000000000..c19f66594 --- /dev/null +++ b/doc/bugs/po:_new_pages_not_translatable.mdwn @@ -0,0 +1,12 @@ +Today I added a new English page to l10n.ikiwiki.info. When I saved, +the page did not have the translation links at the top. I waited until +the po plugin had, in the background, created the po files, and refreshed; +still did not see the translation links. Only when I touched the page +source and refreshed did it finally add the translation links. +I can reproduce this bug in a test site. --[[Joey]] + +> I could reproduce this bug at some point during the merge of a buggy +> version of my ordered slave languages patch, but I cannot anymore. +> Could you please try again? --[[intrigeri]] + +>> Cannot reproduce with 3.20100722, [[done]] I guess. --[[Joey]] diff --git a/doc/bugs/po:_plugin_should_not_override_the_title_on_the_homepage.mdwn b/doc/bugs/po:_plugin_should_not_override_the_title_on_the_homepage.mdwn new file mode 100644 index 000000000..8f9374707 --- /dev/null +++ b/doc/bugs/po:_plugin_should_not_override_the_title_on_the_homepage.mdwn @@ -0,0 +1,58 @@ +The po plugin systematically overrides the title of the homepage with the wikiname. 
This prevents explicitly changing it with a meta directive. It should rather check whether it was overridden before setting it back. + +Here is a simple patch for that: + + diff --git a/Plugin/po.pm b/Plugin/po.pm + index 6395ebd..a048c6a 100644 + --- a/Plugin/po.pm + +++ b/Plugin/po.pm + @@ -333,7 +333,7 @@ sub pagetemplate (@) { + && $masterpage eq "index") { + $template->param('parentlinks' => []); + } + - if (ishomepage($page) && $template->query(name => "title")) { + + if (ishomepage($page) && $template->query(name => "title") && !$template->query(name => "title_overridden")) { + $template->param(title => $config{wikiname}); + } + } + +Thanks. + +> I fixed this patch a bit and applied it to my po branch, thanks +> (commit 406485917). +> +> But... a bug (probably in HTML::Template) prevents this +> theoretically correct solution from actually working. +> Setting a parameter that does not appear in the template, such as +> `title_overridden`, is not working on my install: the value does not +> seem to be stored anywhere, and when accessing it later using +> `$template->param('title_overridden')` it is always undef. +> Adding `<TMPL_IF TMPL_VAR TITLE_OVERRIDDEN></TMPL_IF>` in +> `page.tmpl` is a working, but ugly workaround. +> +> I am nevertheless in favour of merging the fix into ikiwiki. +> We'll then need to find and fix the remaining (smaller) bug so +> that this code can actually work. +> +> I'd like others to test my po branch and see if they can reproduce +> the bug I am talking about. +> +> --[[intrigeri]] + +>> Commit 406485917 looks fine to me, FWIW --[[smcv]] + +>>> I tracked the HTML::Template bug (or missing documentation?) a bit +>>> more. This led to commit b2a2246ba in my po branch, which enables +>>> HTML::Template's parent_global_vars option, making +>>> title_overridden work. +>>> +>>> OTOH I feel this workaround is a bit ugly as this option is not +>>> documented. IMHO being forced to use it reveals a bug in +>>> HTML::Template. I reported this: +>>> https://rt.cpan.org/Public/Bug/Display.html?id=64158. +>>> +>>> But still, I think we need to apply the workaround as +>>> HTML::Template's author has not updated any dist on CPAN for more +>>> than one year. --[[intrigeri]] + +>>>> All merged, [[done]]. --[[Joey]] diff --git a/doc/bugs/po:_po_files_instead_of_html_files.mdwn b/doc/bugs/po:_po_files_instead_of_html_files.mdwn new file mode 100644 index 000000000..f84dc8ff4 --- /dev/null +++ b/doc/bugs/po:_po_files_instead_of_html_files.mdwn @@ -0,0 +1,30 @@ +On the home page of my wiki, when I click on the link "ikiwiki", I get the English file instead of the French file. +At the bottom of this page, there is the "Links" line: +Links: index index.fr templates templates.fr +When I click on "templates.fr", I get the po file instead of the html. + + Sorry for the noise! I set "po_master_language" to fr and all was ok. + +> Any chance you could be a bit more verbose about what the +> misconfiguration was? I don't think the po plugin should behave like that +> in any configuration. Unless, perhaps, it was just not configured to +> support any languages at all, and so the po file was treated as a raw +> file. --[[Joey]] + +>> I can reproduce the bug with: + # po plugin + # master language (non-PO files) + po_master_language => { + code => 'en', + name => 'English' + }, + # slave languages (PO files) + po_slave_languages => [qw{fr|Français}], + +>>> I've never found any `.po` file in the destination directory on +>>> any of my PO-enabled ikiwiki instances.
Without more information, +>>> there's nothing I can do: the config snippet pasted above is more +>>> or less the example one and does not allow me to reproduce the +>>> bug. --[[intrigeri]] + +>>>> I think it's best to close this as unreproducible. [[done]] --[[Joey]] diff --git a/doc/bugs/po:_ugly_messages_with_empty_files.mdwn b/doc/bugs/po:_ugly_messages_with_empty_files.mdwn new file mode 100644 index 000000000..d3992b6bc --- /dev/null +++ b/doc/bugs/po:_ugly_messages_with_empty_files.mdwn @@ -0,0 +1,6 @@ +If there are empty .mdwn files, the po plugin displays some ugly messages. + +> This is due to a bug in po4a (not checking definedness of a +> variable). One-liner patch sent. --[[intrigeri]] + +>> This seems to be fixed in po4a 0.40 => [[done]]. --[[intrigeri]] diff --git a/doc/bugs/po:broken_links_to_translatable_basewiki_pages_that_lack_po_fies.mdwn b/doc/bugs/po:broken_links_to_translatable_basewiki_pages_that_lack_po_fies.mdwn new file mode 100644 index 000000000..121d33807 --- /dev/null +++ b/doc/bugs/po:broken_links_to_translatable_basewiki_pages_that_lack_po_fies.mdwn @@ -0,0 +1,73 @@ +broken links to translatable basewiki pages that lack po files +-------------------------------------------------------------- + +If a page is not translated yet, the "translated" version of it +displays wikilinks to other, existing (but not yet translated?) +pages as edit links, as if those pages do not exist. + +That's really confusing, especially as clicking such a link +brings up an edit form to create a new, english page. + +This is with po_link_to=current or negotiated. With default, it doesn't +happen.. + +Also, this may only happen if the page being linked to is coming from an +underlay, and the underlays lack translation to a given language. +--[[Joey]] + +> Any simple testcase to reproduce it, please? I've never seen this +> happen yet. --[[intrigeri]] + +>> Sure, go here <http://l10n.ikiwiki.info/smiley/smileys/index.sv.html> +>> (Currently 0% translateed) and see the 'WikiLink' link at the bottom, +>> which goes to <http://l10n.ikiwiki.info/ikiwiki.cgi?page=ikiwiki/wikilink&from=smiley/smileys&do=create> +>> Compare with eg, the 100% translated Dansk version, where +>> the WikiLink link links to the English WikiLink page. --[[Joey]] + +>>> Seems not related to the page/string translation status: the 0% +>>> translated Spanish version has the correct link, just like the +>>> Dansk version => I'm changing the bug title accordingly. +>>> +>>> I tested forcing the sv html page to be rebuilt by translating a +>>> string in it, it did not fix the bug. I did the same for the +>>> Spanish page, it did not introduce the bug. So this is really +>>> weird. +>>> +>>> The smiley underlay seems to be the only place where the wrong +>>> thing happens: the basewiki underlay has similar examples +>>> that do not exhibit this bug. An underlay linking to another might +>>> be necessary to reproduce it. Going to dig deeper. --[[intrigeri]] + +>>>> After a few hours lost in the Perl debugger, I think I have found +>>>> the root cause of the problem: in l10n wiki's configured +>>>> `underlaydir`, the basewiki is present in every slave language +>>>> that is enabled for this wiki *but* Swedish. With such a +>>>> configuration, the `ikiwiki/wikilink` page indeed does not exist +>>>> in Swedish language: no `ikiwiki/wikilink.sv.po` can be found +>>>> where ikiwiki is looking. Have a look to +>>>> <http://l10n.ikiwiki.info/ikiwiki/>, the basewiki is not +>>>> available in Swedish language on this wiki. 
So this is not a po +>>>> bug, but a configuration or directories layout issue. This is +>>>> solved by adding the Swedish basewiki to the underlay dir, which +>>>> is I guess not a possibility in the l10n wiki context. I guess +>>>> this could be solved by adding `SRCDIR/basewiki` as an underlay +>>>> to your l10n wiki configuration, possibly using the +>>>> `add_underlays` configuration directive. --[[intrigeri]] + +>>>>> There is no complete Swedish underlay translation yet, so it is not +>>>>> shipped in ikiwiki. I don't think it's a misconfiguration to use +>>>>> a language that doesn't have translated underlays. --[[Joey]] + +>>>>>> Ok. The problem is triggered when using a language that doesn't +>>>>>> have translated underlays, *and* defining +>>>>>> `po_translatable_pages` in a way that renders the base wiki +>>>>>> pages translatable in po's view of things, which in turns makes +>>>>>> the po plugin act as if the translation pages did exist, +>>>>>> although they do not in this case. I still need to have a deep +>>>>>> look at the underlays-related code you added to `po.pm` a while +>>>>>> ago. Stay tuned. --[[intrigeri]] + +>>>>>>> Fixed in my po branch, along with other related small bugs that +>>>>>>> happen in the very same situation only. --[[intrigeri]] + +>>>>>>>> Merged. Not tested yet, but I trust you; [[done]] --[[Joey]] diff --git a/doc/bugs/po_vs_templates.mdwn b/doc/bugs/po_vs_templates.mdwn new file mode 100644 index 000000000..d826546e6 --- /dev/null +++ b/doc/bugs/po_vs_templates.mdwn @@ -0,0 +1,48 @@ +The po plugin's protection against processing loops (i.e. the +alreadyfiltered stuff) is playing against us: the template plugin +triggers a filter hooks run with the very same ($page, $destpage) +arguments pair that is used to identify an already filtered page. + +Processing an included template can then mark the whole translation +page as already filtered, which prevented `po_to_markup` to be called on +the PO content. + +Symptoms: the unprocessed gettext file goes unfiltered to the +generated HTML. + +This has been fixed in my po branch. + +> My commit dcd57dd5c9f3265bb7a78a5696b90976698c43aa updates the +> bugfix in a much more elegant manner. Its main disadvantage is to +> add an (optional) argument to IkiWiki::filter. Please review. + +-- [[intrigeri]] + +>> Hmm. Don't like adding a fourth positional parameter to that (or +>> any really) function. +>> +>> I think it's quite possible that some of the directives that are +>> calling filter do so unnecessarily. For example, conditional, +>> cutpaste, more, and toggle each re-filter text that comes from the +>> page and so has already been filtered. They could probably drop +>> the filtering. template likewise does not need to filter the +>> parameters passed into it. Does it need to filter the template output? +>> Well, it allows the (deprecated) embed plugin to work on template +>> content, but that's about it. +>> +>> Note also that the only other plugin to provide a filter, txt, +>> could also run into similar problems as po has, in theory (it looks at +>> the page parameter and assumes the content is for the whole page). +>> +>> [[!template id=gitbranch branch=origin/filter-full author="[[joey]]"]] +>> So, I've made a filter-full branch, where I attempt to fix this +>> by avoiding unnecessary filtering. Can you check it and merge it into +>> your po branch and remove your other workarounds so I can merge? 
+>> --[[Joey]] + +>>> I merged your filter-full branch into my po branch and reverted my +>>> other workarounds. According to my tests this works ok. I'm glad +>>> you found this solution, as I didn't like changing the filter +>>> prototype. I believe you can now merge this code. --[[intrigeri]] + +[[!tag patch done]] diff --git a/doc/bugs/post-update_hook_can__39__t_be_compiled_with_tcc.mdwn b/doc/bugs/post-update_hook_can__39__t_be_compiled_with_tcc.mdwn new file mode 100644 index 000000000..a8fb19888 --- /dev/null +++ b/doc/bugs/post-update_hook_can__39__t_be_compiled_with_tcc.mdwn @@ -0,0 +1,19 @@ +Thinking that any C compiler would do the job, I tried to use tcc with ikiwiki, as explicitly allowed by the Debian package dependencies. + +I installed `tcc` and `libc6-dev` (for `libcrt1`). The wrapper compilation was OK, but the wrapper fails to run correctly and dies with + + usage: ikiwiki [options] source dest + ikiwiki --setup configfile + +Everything works fine with gcc. + +versions: Debian lenny + backports + +> Seems that tcc does not respect changing where `environ` points as a way +> to change the environment seen after `exec`. +> +> Given that the man page for `clearenv` suggests using `environ=NULL` +> if `clearenv` is not available, I would be leery of using tcc to compile +> stuff, since that could easily lead to a security compromise of code that +> expects that to work. However, I have fixed ikiwiki to use `clearenv`. +> --[[Joey]] [[done]] diff --git a/doc/bugs/preview_base_url_should_be_absolute.mdwn b/doc/bugs/preview_base_url_should_be_absolute.mdwn new file mode 100644 index 000000000..f160a84c4 --- /dev/null +++ b/doc/bugs/preview_base_url_should_be_absolute.mdwn @@ -0,0 +1,53 @@ +The edit page CGI defines a `base` tag with a URL which is not +absolute, which can break the preview function in some circumstances +(with e.g. images not showing). The trivial [[patch]] that fixes +it can be found [[here|http://sprunge.us/EPHT]] as well as on [[my +git|http://git.oblomov.eu/ikiwiki]]. + +> That patch does mean that if you're accessing the CGI via HTTPS but your +> $config{url} and $config{cgiurl} are HTTP, you'll get preview images loaded +> via HTTP, causing the browser to complain. See +> [[todo/want_to_avoid_ikiwiki_using_http_or_https_in_urls_to_allow_serving_both]] +> for background. +> +> Perhaps the CGI could form its `<base>` URL by using +> `URI->new_abs(urlto(...), $cgi->url)` instead? +> +> You'd also need to change `IkiWiki/Wrapper.pm` to pass at least the +> SERVER_NAME and SERVER_PORT through the environment, probably. +> +> Joey's last comment on +> [[todo/want_to_avoid_ikiwiki_using_http_or_https_in_urls_to_allow_serving_both]] +> suggests that this might already work, but I'm not quite sure how - I'd +> expect it to need more environment variables? --[[smcv]] +> +>> `CGI::url` uses `REQUEST_URI`. So it could be used, but I don't see +>> how to get from the `CGI::url` to a URL to the page that is being +>> edited. --[[Joey]] +>>> (The right rune seems to be: `URI->new_abs(urlto($params{page}), $cgi->url)`) --[[Joey]] + +--- + +Update: This bug is worse than it first appeared, and does not only affect +previewing. The cgi always has a `<base>` url, and it's always relative, +and that can break various links etc. For example, when the 404 plugin +displays a missing page, it has a Recentchanges link, which would be broken +if the cgi was in an unusual place. + +`misctemplate` needs to *always* set an absolute baseurl.
Which is a problem, +since `misctemplate` is not currently passed a cgi object from which to +construct one. --[[Joey]] + +Update: Worse and worse. `baseurl(undef)` can be a relative url, but +nearly every use of it I can find actually needs to be absolute. +the numerous `redirect($q, baseurl(undef))` all need to be absolute +according to `CGI` documentation. + +So, I'm seriously thinking about reverting the part of +[[todo/want_to_avoid_ikiwiki_using_http_or_https_in_urls_to_allow_serving_both]] +that made `baseurl(undef)` relative. +And I suppose, re-opening that todo. :( --[[Joey]] + +---- + +This was fixed in version 3.20110105 [[done]] --[[Joey]] diff --git a/doc/bugs/preview_pagestate.mdwn b/doc/bugs/preview_pagestate.mdwn new file mode 100644 index 000000000..7f7ec0976 --- /dev/null +++ b/doc/bugs/preview_pagestate.mdwn @@ -0,0 +1,13 @@ +If a change to a page is previewed, but not saved, `%pagestate` and +`%wikistate` can be changed, and saved. Actually, it's not limited to +those. Seems that spurious dependencies can be added, though existing +dependencies will at least not be removed. + +It calls saveindex to record state about files created on disk for the +preview. Those files will expire later. However, saveindex also +saves other state changes. + +Seems like it needs to isolate all state changes when previewing... ugh. +--[[Joey]] + +[[done]] diff --git a/doc/bugs/rebuild_after_changing_the_underlaydir_config_option.mdwn b/doc/bugs/rebuild_after_changing_the_underlaydir_config_option.mdwn new file mode 100644 index 000000000..8613ef03c --- /dev/null +++ b/doc/bugs/rebuild_after_changing_the_underlaydir_config_option.mdwn @@ -0,0 +1,12 @@ +It seems that rebuild a wiki (`ikiwiki --rebuild`) after changing the `underlaydir` config option doesn't remove the pages coming from the previous underlaydir. + +I've noticed this with the debian package version 3.20100102.3~bpo50+1. + +Perhaps it is possible to improve this or mention it in the manual page? + +--prosper + +> --rebuild causes ikiwiki to throw away all its info about what it built +> before, so it will never clean up pages that have been removed, by any +> means. Suggest you do a --refresh, possibly followed by a --rebuild +> if that is really necessary. --[[Joey]] diff --git a/doc/bugs/removal_of_transient_pages.mdwn b/doc/bugs/removal_of_transient_pages.mdwn new file mode 100644 index 000000000..2667a2b83 --- /dev/null +++ b/doc/bugs/removal_of_transient_pages.mdwn @@ -0,0 +1,27 @@ +The remove plugin cannot remove [[todo/transient_pages]]. + +> this turns out to be harder than +> I'd hoped, because I don't want to introduce a vulnerability in the +> non-regular-file detection, so I'd rather defer that. --[[smcv]] + +This is particularly a problem for tag pages, and autoindex +created pages. So both plugins default to not creating transient +pages, until this is fixed. --[[Joey]] + +> I'll try to work out which of the checks are required for security +> and which are just nice-to-have, but I'd appreciate any pointers +> you could give. --[[smcv]] + +>> I assume by "non-regular file", you are referring to the check +>> in remove that the file "Must exist on disk, and be a regular file" ? +>> --[[Joey]] + +>>> Yes. It's not entirely clear to me why that's there... --s + +>>>> Yeah, 2461ce0de6231bfeea4d98c86806cdbb85683297 doesn't really +>>>> say, and I tend to assume that when I've written paranoid code +>>>> it's there for a reason. 
I think that here the concern was that +>>>> the file might be in some underlay that the user should not be able +>>>> to affect by web edits. The `-f` check seems rather redundant, +>>>> surely if it's in `%pagesources` ikiwiki has already verified it's +>>>> safe. --[[Joey]] diff --git a/doc/bugs/remove_orphaned_sparkline-php_from_Suggests.mdwn b/doc/bugs/remove_orphaned_sparkline-php_from_Suggests.mdwn index b4e2a1501..ab08c0b26 100644 --- a/doc/bugs/remove_orphaned_sparkline-php_from_Suggests.mdwn +++ b/doc/bugs/remove_orphaned_sparkline-php_from_Suggests.mdwn @@ -18,3 +18,5 @@ Thanks > rewriting the ikiwiki code to use it, *and* packaging that alternative > and maintaining it in Debian. So your suggestion doesn't make a lot of > sense; Debian should just find a maintainer for sparkline-php. --[[Joey]] + +[[done]] diff --git a/doc/bugs/removing_pages_with_utf8_characters.mdwn b/doc/bugs/removing_pages_with_utf8_characters.mdwn new file mode 100644 index 000000000..0d96aa75f --- /dev/null +++ b/doc/bugs/removing_pages_with_utf8_characters.mdwn @@ -0,0 +1,51 @@ +I have a page with the name "umläute". When I try to remove it, ikiwiki says: + +Error: ?umläute does not exist + +> I'm curious about the '?' in the "?umläute" message. Suggests that the +> filename starts with another strange character. Can I get a copy of a +> git repository or tarball containing this file? --[[Joey]] + +I wrote the following patch, which seems to work on my machine. I'm running on FreeBSD 6.3-RELEASE with ikiwiki-3.20100102.3 and perl-5.8.9_3. + + --- remove.pm.orig 2009-12-14 23:26:20.000000000 +0100 + +++ remove.pm 2010-01-18 17:49:39.000000000 +0100 + @@ -193,6 +193,7 @@ + # and that the user is allowed to edit(/remove) it. + my @files; + foreach my $page (@pages) { + + $page = Encode::decode_utf8($page); + check_canremove($page, $q, $session); + + # This untaint is safe because of the + + +> The problem with this patch is that, in a recent fix to the same +> plugin, I made `@pages` come from `$form->field("page")`, and +> that, in turn is already run through `decode_form_utf8` just above the +> code you patched. So I need to understand why that is apparently not +> working for you. (It works fine for me, even when deleting a file named +> "umläute" --[[Joey]] + +---- + +> Update, having looked at the file in the src of the wiki that +> is causing trouble for remove, it is: `uml\303\203\302\244ute.mdwn` +> And that is not utf-8 encoded, which, represented the same +> would be: `uml\303\244ute.mdwn` +> +> I think it's doubly-utf-8 encoded, which perhaps explains why the above +> patch works around the problem (since the page name gets doubly-decoded +> with it). The patch doesn't fix related problems when using remove, etc. +> +> Apparently, on apoca's system, perl encodes filenames differently +> depending on locale settings. On mine, it does not. Ie, this perl +> program always creates a file named `uml\303\244ute`, no matter +> whether I run it with LANG="" or LANG="en_US.UTF-8": +> +> perl -e 'use IkiWiki; writefile("umläute", "./", "baz")' +> +> Remains to be seen if this is due to the older version of perl used +> there, or perhaps FreeBSD itself. --[[Joey]] +> +> Update: Perl 5.10 fixed the problem. 
--[[Joey]] diff --git a/doc/bugs/rename_fixup_not_attributed_to_author.mdwn b/doc/bugs/rename_fixup_not_attributed_to_author.mdwn new file mode 100644 index 000000000..bcfafac22 --- /dev/null +++ b/doc/bugs/rename_fixup_not_attributed_to_author.mdwn @@ -0,0 +1,12 @@ +When I renamed `todo/transient_in-memory_pages` to [[todo/transient pages]], +`rename::fixlinks` was meant to blame me for the link-fixing commit, and title it +`update for rename of %s to %s`. Instead, it blamed Joey for the commit, +and didn't set a commit message. + +(It also committed a pile of recentchanges pages which shouldn't have +been committed, for which see [[bugs/git_commit_adds_files_that_were_not_tracked]].) + +--[[smcv]] + +> It was calling `rcs_commit` old-style, and so not passing the session +> object that is used to get the user's name. [[fixed|done]] --[[Joey]] diff --git a/doc/bugs/rss_feeds_do_not_use_recommended_encoding_of_entities_for_some_fields.mdwn b/doc/bugs/rss_feeds_do_not_use_recommended_encoding_of_entities_for_some_fields.mdwn index 48c168997..0a435cea3 100644 --- a/doc/bugs/rss_feeds_do_not_use_recommended_encoding_of_entities_for_some_fields.mdwn +++ b/doc/bugs/rss_feeds_do_not_use_recommended_encoding_of_entities_for_some_fields.mdwn @@ -34,3 +34,19 @@ For Atom, at least, I believe adding `type="xhtml"` to the title element will wo > Update: Ok, I've fixed this for titles, as a special case, but the > underlying problem remains for other fields in rss feeds (such as > author), so I'm leaving this bug report open. --[[Joey]] + +>> I'm curious if there has been any progress on better RSS output? +>> I've been prototyping a new blog and getting good RSS out of it +>> seems important as the bulk of my current readers use RSS. +>> I note, in passing that the "more" plugin doesn't quite do what +>> I want either - I'd like to pass a full RSS feed of a post and only +>> have "more" apply to the front page of the blog. Is there a way to do that? +>> -- [[dtaht]] +>> +>>> To be clear, the RSS spec sucks to such an extent that, as far as +>>> I know, there is no sort of title escaping that will work in all +>>> RSS consumers. Titles are currently escaped in the way +>>> that tends to break the fewest according to what I've read. +>>> If you're unlucky enough to +>>> have a "&" or "<" in your **name**, then you may still run into +>>> problems with how that is escaped in rss feeds. --[[Joey]] diff --git a/doc/bugs/rst_fails_on_file_containing_only_a_number.mdwn b/doc/bugs/rst_fails_on_file_containing_only_a_number.mdwn index dab3b7e5b..99e46aac9 100644 --- a/doc/bugs/rst_fails_on_file_containing_only_a_number.mdwn +++ b/doc/bugs/rst_fails_on_file_containing_only_a_number.mdwn @@ -24,3 +24,6 @@ throwing code..): > No, still the same failure. I think it's failing parsing the input data, > (which perl probably transmitted as an int due to perl internals) > not writing out its response. --[[Joey]] + +> On second thought, this was a bug in ikiwiki, it should be transmitting +> that as a string. 
Fixed in external.pm --[[Joey]] diff --git a/doc/bugs/rst_plugin_has_python_hardcode_in_shebang_line.mdwn b/doc/bugs/rst_plugin_has_python_hardcode_in_shebang_line.mdwn new file mode 100644 index 000000000..a594adc09 --- /dev/null +++ b/doc/bugs/rst_plugin_has_python_hardcode_in_shebang_line.mdwn @@ -0,0 +1,15 @@ +Currently the rst plugin uses this shebang line: + + #!/usr/bin/python + +The problem is that the rst plugin uses some features (for example, iterator comprehensions) which are unavailable in old versions of Python. + +So the rst plugin will not work on a machine which has an old version of python in the system path, even though +the user may have installed a new version of python somewhere else. For example, I am using ikiwiki with the rst plugin on Mac OS X 10.4, which ships python 2.3, but I do have python2.6 installed at /opt/local/bin/python (via macports). + +Thus I suggest changing the shebang line to: + + #!/usr/bin/env python + +> [[done]], although the irony of all the perl hashbangs in ikiwiki +> being hardcoded doesn't escape me. --[[Joey]] diff --git a/doc/bugs/some_but_not_all_meta_fields_are_stored_escaped.mdwn b/doc/bugs/some_but_not_all_meta_fields_are_stored_escaped.mdwn new file mode 100644 index 000000000..587771ba4 --- /dev/null +++ b/doc/bugs/some_but_not_all_meta_fields_are_stored_escaped.mdwn @@ -0,0 +1,44 @@ +[[!template id=gitbranch branch=smcv/unescaped-meta author="[[Simon_McVittie|smcv]]"]] +[[!tag patch]] +(Warning: this branch has not been tested thoroughly.) + +While discussing the [[plugins/meta]] plugin on IRC, Joey pointed out that +it stores most meta fields unescaped, but 'title', 'guid' and 'description' +are special-cased and stored escaped (with numeric XML/HTML entities). This +is to avoid emitting markup in the `<title>` of an HTML page, or in an RSS/Atom +feed, neither of which are subject to the [[plugins/htmlscrubber]]. + +However, having the meta fields "partially escaped" like this is somewhat +error-prone. Joey suggested that perhaps everything should be stored +unescaped, and the escaping should be done on output; this branch +implements that. + +Points of extra subtlety: + +* The title given to the [[plugins/search]] plugin was previously HTML; + now it's plain text, potentially containing markup characters. I suspect + that that's what Xapian wants anyway (which is why I didn't change it), + but I could be wrong... + + > AFAICS this, if anything, fixes a bug; xapian definitely expects + > unescaped text here. --[[Joey]] + +* Page descriptions in the HTML `<head>` were previously double-escaped: + the description was stored escaped with numeric entities, then that was + output with a second layer of escaping! In this branch, I just emit + the page description escaped once, as was presumably the intention. + +* It's safe to apply this change to a wiki and neglect to rebuild it + (assuming I implemented it correctly!), but until the wiki is rebuilt, + titles, descriptions and GUIDs for unchanged pages will appear + double-escaped on any page that inlines them in `quick=yes` mode, and + is rebuilt for some other reason. The failure mode is too much escaping + rather than too little, so it shouldn't be a security problem. + +* Reverting this change, if applied, is more dangerous; until the wiki is + rebuilt, any titles, descriptions and GUIDs on unchanged pages that + contained markup could appear unescaped on any page that inlines them + in `quick=yes` mode, and is rebuilt for some other reason. The failure + mode here would be too little escaping, i.e.
cross-site scripting. + +[[!tag done]] diff --git a/doc/bugs/stray___60____47__p__62___tags.mdwn b/doc/bugs/stray___60____47__p__62___tags.mdwn index 6e508ffda..99d6fe09f 100644 --- a/doc/bugs/stray___60____47__p__62___tags.mdwn +++ b/doc/bugs/stray___60____47__p__62___tags.mdwn @@ -13,3 +13,5 @@ I believe that this snippet in `IkiWiki.pm` might be the reason for the imbalanc } The fact that HTML in a `\[[!meta title]]` is added but then escaped might indicate that some other bug is involved. + +> [[done]] --[[Joey]] diff --git a/doc/bugs/support_for_openid2_logins.mdwn b/doc/bugs/support_for_openid2_logins.mdwn index 139a53760..a71ed7ba9 100644 --- a/doc/bugs/support_for_openid2_logins.mdwn +++ b/doc/bugs/support_for_openid2_logins.mdwn @@ -20,3 +20,5 @@ However both Perl OpenID 2.x implementations have not been released and are inco > I've tested with yahoo, and it works with the updated module. Sweet and > [[done]] --[[Joey]] + +## A quick fix for the impatient running stable is simply `sudo apt-get install libnet-openid-consumer-perl -t unstable` diff --git a/doc/bugs/svn_commit_failures_interpreted_as_merge_conflicts.mdwn b/doc/bugs/svn_commit_failures_interpreted_as_merge_conflicts.mdwn new file mode 100644 index 000000000..0c9bce4b9 --- /dev/null +++ b/doc/bugs/svn_commit_failures_interpreted_as_merge_conflicts.mdwn @@ -0,0 +1,21 @@ +I'm attempting a merge with the SVN plugin via the web interface +with ikiwiki-3.20100403 and subversion 1.6.11. + +The web interface says + + Your changes conflict with other changes made to the page. + + Conflict markers have been inserted into the page content. Reconcile the conflict and commit again to save your changes. + +However there are no merge conflict markers in the page. My apache error log says: + + [Fri Apr 30 16:43:57 2010] [error] [client 10.64.64.42] svn: Commit failed (details follow):, referer: https://unixwiki.ncl.ac.uk/ikiwiki.cgi + [Fri Apr 30 16:43:57 2010] [error] [client 10.64.64.42] svn: Authorization failed, referer: https://unixwiki.ncl.ac.uk/ikiwiki.cgi + +-- [[Jon]] + +> Only way for this to be improved would be for the svn plugin to +> explicitly check the file for conflict markers. I guess it could +> change the error message then, but the actual behavior of putting the +> changed file back in the editor so the user can recommit is about right +> as far as error recovery goes. --[[Joey]] diff --git a/doc/bugs/table_plugin_does_not_handle___92__r__92__n_lines_in_CSV_files.mdwn b/doc/bugs/table_plugin_does_not_handle___92__r__92__n_lines_in_CSV_files.mdwn new file mode 100644 index 000000000..07a7afbb1 --- /dev/null +++ b/doc/bugs/table_plugin_does_not_handle___92__r__92__n_lines_in_CSV_files.mdwn @@ -0,0 +1,3 @@ +The table plugin seems to be unable to read a CSV file that uses \r\n for line delimiters. +The same file with \r works fine. The error message is "Empty data". +--liw diff --git a/doc/bugs/tag_behavior_changes_introduced_by_typed_link_feature.mdwn b/doc/bugs/tag_behavior_changes_introduced_by_typed_link_feature.mdwn new file mode 100644 index 000000000..ed93a2eb7 --- /dev/null +++ b/doc/bugs/tag_behavior_changes_introduced_by_typed_link_feature.mdwn @@ -0,0 +1,16 @@ +The use of typed links for tags and some of the consequent changes +introduced some unwanted functionality variations in the tag system. Two +problems in particular could be observed, when compared to the use of +tags in older versions of IkiWiki: + +* tags in feeds (both rss and atom) would use the file path as their + name (e.g. 
you would have `<category term="tags/sometag" />` in an atom + item for a page tagged sometag with a tagbase of tags), whereas they + appeared pure before +* tags containing a slash character would appear without the slash + character but be used with the slash character in other circumstances + (effect visible by tagging a page with a name such as "with/slash") + +I've written a [[patch]] to fix this issues by introducing a `tagname()` function that reverts `taglink()`, and it's available [[here|http://sprunge.us/SHRj]] as well as on my [[git|http://git.oblomov.eu/ikiwiki]] + +> [[Applied|done]], with some regexp improvements. --[[Joey]] diff --git a/doc/bugs/tag_plugin:_autotag_vs._staged_changes.mdwn b/doc/bugs/tag_plugin:_autotag_vs._staged_changes.mdwn new file mode 100644 index 000000000..e5526bedf --- /dev/null +++ b/doc/bugs/tag_plugin:_autotag_vs._staged_changes.mdwn @@ -0,0 +1,17 @@ +The autotag functionality of the tag plugin committed (when doing its +first commit) all changes that have been staged (in Git). I suggest it +should be restricted to the specific file only. --[[tschwinge]] + +> This is not specific to the tag plugin. Same can happen +> if you rename a file, or post a comment, or remove a file +> via web interface. All of these use `rcs_commit_staged`. +> +> This is why ikiwiki is supposed to have a checkout of +> the repository that it uses for its own purposes, and nobody else +> should mess with. There are various notes about that being needed here +> and there; you're free to not give ikiwiki its own repo, but you have to +> be aware that it can fight with you if you're making changes to the same +> repo. [[done]] --[[Joey]] + +>> Ack, that is reasonable. (And it's only been a very minor problem +>> during manual testing.) --[[tschwinge]] diff --git a/doc/bugs/tagged__40____41___matching_wikilinks.mdwn b/doc/bugs/tagged__40____41___matching_wikilinks.mdwn index e7e4af7c3..a211654f1 100644 --- a/doc/bugs/tagged__40____41___matching_wikilinks.mdwn +++ b/doc/bugs/tagged__40____41___matching_wikilinks.mdwn @@ -28,6 +28,8 @@ rationale on this, or what am I doing wrong, and how to achieve what I want? >> is valid. [[todo/matching_different_kinds_of_links]] is probably >> how it will eventually be solved. --[[Joey]] +>>> [[Done]]: `tagged` no longer matches other wikilinks. --[[smcv]] + > And this is an illustration why a clean work-around (without changing the software) is not possible: while thinking about [[todo/matching_different_kinds_of_links]], I thought one could work around the problem by simply explicitly including the kind of the relation into the link target (like the tagbase in tags), and by having a separate page without the "tagbase" to link to when one wants simply to refer to the tag without tagging. But this won't work: one has to at least once refer to the real tag page if one wants to talk about it, and this reference will count as tagging (unwanted). --Ivan Z. > But well, perhaps there is a workaround without introducing different kinds of links. One could modify the [[tag plugin|plugins/tag]] so that it adds 2 links to a page: for tagging -- `tagbase/TAG`, and for navigation -- `tagdescription/TAG` (displayed at the bottom). Then the `tagdescription/TAG` page would hold whatever list one wishes (with `tagged(TAG)` in the pagespec), and whenever one wants to merely refer to the tag, one should link to `tagdescription/TAG`--this link won't count as tagging. 
So, `tagbase/TAG` would become completely auxiliary (internal) link targets for ikiwiki, the users would edit or link to only `tagdescription/TAG`. --Ivan Z. diff --git a/doc/bugs/templateForRecentChangesMissingCloseSpan.mdwn b/doc/bugs/templateForRecentChangesMissingCloseSpan.mdwn new file mode 100644 index 000000000..5c322991a --- /dev/null +++ b/doc/bugs/templateForRecentChangesMissingCloseSpan.mdwn @@ -0,0 +1,26 @@ +In the template for ikiwiki's recent changes page + + /usr/share/ikiwiki/templates/change.tmpl + +there is a missing </span> tag after the + + <span class="changedate"><TMPL_VAR COMMITDATE> + +This results in the recentchanges/ page being invalid and rendering quite horrifyingly in Internet Exploder. + +[I'm running](http://wiki.shlrm.org) (linked so you can see the one I'm running if you need to) the latest version of ikiwiki, and I note that it's broken on [ikiwiki.info](http://validator.w3.org/check?uri=http%3A%2F%2Fikiwiki.info%2Frecentchanges%2F&charset=%28detect+automatically%29&doctype=Inline&group=0&user-agent=W3C_Validator%2F1.767) too :) + +[This one on debian](https://www.icanttype.org/recentchanges/) is somehow [valid](http://validator.w3.org/check?uri=https%3A%2F%2Fwww.icanttype.org%2F%2Frecentchanges%2F&charset=%28detect+automatically%29&doctype=Inline&group=0&user-agent=W3C_Validator%2F1.767), although it's using the same template. Perhaps there's an additional scrubbing going on his end. + +Thanks, +David + +PS: I have fixed the template by hand on my server, so it will validate, however ikiwiki.info will not. + +> [[!template id="gitbranch" branch=smcv/trivia author="[[smcv]]"]] [[!tag patch]] +> Enabling either [[plugins/htmltidy]] or [[plugins/htmlbalance]] will automatically fix unbalanced +> markup like this; using [[plugins/comments]] without having one or other of those is a bad idea +> from the point of view of avoiding comment forgery, which is probably why icanttype.org works +> correctly. Anyway, I've fixed this in a branch: Joey, care to review smcv/trivia? --[[smcv]] + +[[done]], thanks guys --[[Joey]] diff --git a/doc/bugs/the_login_page_is_unclear_when_multiple_methods_exist.mdwn b/doc/bugs/the_login_page_is_unclear_when_multiple_methods_exist.mdwn index 9985c13a0..70266c49c 100644 --- a/doc/bugs/the_login_page_is_unclear_when_multiple_methods_exist.mdwn +++ b/doc/bugs/the_login_page_is_unclear_when_multiple_methods_exist.mdwn @@ -12,3 +12,5 @@ Followed by the "login" button underneath. It's not obvious to anyone unfamiliar > it visually distinct from the rest of the form. I'm sure the styling > could be improved, but the current form does not seem too non-obvious > to me, or to naive users in the field. --[[Joey]] + +>> [[done]], better fixed by new fancy openid login form. 
--[[Joey]] diff --git a/doc/bugs/transient_autocreated_tagbase_is_not_transient_autoindexed.mdwn b/doc/bugs/transient_autocreated_tagbase_is_not_transient_autoindexed.mdwn new file mode 100644 index 000000000..3eb1542d3 --- /dev/null +++ b/doc/bugs/transient_autocreated_tagbase_is_not_transient_autoindexed.mdwn @@ -0,0 +1,7 @@ + mkdir -p ikiwiki-tag-test/raw/a_dir/ ikiwiki-tag-test/rendered/ + echo '[[!taglink a_tag]]' > ikiwiki-tag-test/raw/a_dir/a_page.mdwn + ikiwiki --verbose --plugin tag --plugin autoindex --plugin mdwn --set autoindex_commit=0 --set tagbase=tag --set tag_autocreate=1 --set tag_autocreate_commit=0 ikiwiki-tag-test/raw/ ikiwiki-tag-test/rendered/ + ls -al ikiwiki-tag-test/raw/.ikiwiki/transient/ + ls -al ikiwiki-tag-test/rendered/tag/ + +Shouldn't `ikiwiki-tag-test/raw/.ikiwiki/transient/tag.mdwn` and `ikiwiki-tag-test/rendered/tag/index.html` exist? diff --git a/doc/bugs/trouble_with_base_in_search.mdwn b/doc/bugs/trouble_with_base_in_search.mdwn new file mode 100644 index 000000000..ca6a6c5cc --- /dev/null +++ b/doc/bugs/trouble_with_base_in_search.mdwn @@ -0,0 +1,60 @@ +For security reasons, one of the sites I'm in charge of uses a Reverse Proxy to grab the content from another machine behind our firewall. +Let's call the out-facing machine Alfred and the one behind the firewall Betty. + +For the static pages, everything is fine. However, when trying to use the search, all the links break. +This is because, when Alfred passes the search query on to Betty, the search result has a "base" tag which points to Betty, and all the links to the "found" pages are relative. +So we have + + <base href="Betty.example.com"/> + ... + <a href="./path/to/found/page/">path/to/found/page</a> + +This breaks things for anyone on Alfred, because Betty is behind a firewall and they can't get there. + +What would be better is if it were possible to have a "base" which didn't reference the hostname, and for the "found" links not to be relative. +Something like this: + + <base href="/"/> + ... + <a href="/path/to/found/page/">path/to/found/page</a> + +The workaround I've come up with is this. + +1. Set the "url" in the config to ' ' (a single space). It can't be empty because too many things complain if it is. +2. Patch the search plugin so that it saves an absolute URL rather than a relative one. + +Here's a patch: + + diff --git a/IkiWiki/Plugin/search.pm b/IkiWiki/Plugin/search.pm + index 3f0b7c9..26c4d46 100644 + --- a/IkiWiki/Plugin/search.pm + +++ b/IkiWiki/Plugin/search.pm + @@ -113,7 +113,7 @@ sub indexhtml (@) { + } + $sample=~s/\n/ /g; + + - my $url=urlto($params{destpage}, ""); + + my $url=urlto($params{destpage}, undef); + if (defined $pagestate{$params{page}}{meta}{permalink}) { + $url=$pagestate{$params{page}}{meta}{permalink} + } + +It works for me, but it has the odd side-effect of prefixing links with a space. Fortunately that doesn't seem to break browsers. +And I'm sure someone else could come up with something better and more general. + +--[[KathrynAndersen]] + +> The `<base href>` is required to be genuinely absolute (HTML 4.01 §12.4). +> Have you tried setting `url` to the public-facing URL, i.e. with `alfred` +> as the hostname? That seems like the cleanest solution to me; if you're +> one of the few behind the firewall and you access the site via `betty` +> directly, my HTTP vs. HTTPS cleanup in recent versions should mean that +> you rarely get redirected to `alfred`, because most URLs are either +> relative or "local" (start with '/'). 
--[[smcv]] + +>> I did try setting `url` to the "Alfred" machine, but that doesn't seem clean to me at all, since it forces someone to go to Alfred when they started off on Betty. +>> Even worse, it prevents me from setting up a test environment on, say, Cassandra, because as soon as one tries to search, one goes to Alfred, then Betty, and not back to Cassandra at all. +>> Hardcoded solutions make me nervous. + +>> I suppose what I would like would be to not need to use a `<base href>` in searching at all. +>> --[[KathrynAndersen]] diff --git a/doc/bugs/underlaydir_file_expose.mdwn b/doc/bugs/underlaydir_file_expose.mdwn index c827c6dd8..4ee30e39d 100644 --- a/doc/bugs/underlaydir_file_expose.mdwn +++ b/doc/bugs/underlaydir_file_expose.mdwn @@ -1,4 +1,13 @@ If a file in the srcdir is removed, exposing a file in the underlaydir, -ikiwiki will notice the removal and delete the page from the destdir. The +ikiwiki will not notice the removal, and the page from the underlay will not be built. (However, it will be if the wiki gets rebuilt.) + +> This problem is caused by ikiwiki storing only filenames relative to +> the srcdir or underlay, and mtime comparison not handling this case. + +> A related problem occurs if changing a site's theme with the +> [[plugins/theme]] plugin. The style.css of the old and new theme +> often has the same mtime, so ikiwiki does not update it w/o a rebuild. +> This is worked around in theme.pm with a special-purpose needsbuild hook. +> --[[Joey]] diff --git a/doc/bugs/urlto_API_change_breaks_wikis_with_po_plugin.mdwn b/doc/bugs/urlto_API_change_breaks_wikis_with_po_plugin.mdwn new file mode 100644 index 000000000..4268a1390 --- /dev/null +++ b/doc/bugs/urlto_API_change_breaks_wikis_with_po_plugin.mdwn @@ -0,0 +1,98 @@ +The po plugin needs to be updated to match the urlto sub API and +signature changes. Else a wiki with the po plugin enabled cannot be +refreshed / rebuilt because of (correct) Perl errors. + +My po branch contains a fix. +--[[intrigeri]] + +> The commit looks sane to me, for what it's worth. Joey, please +> consider merging? --[[smcv]] + +>> Merged. --[[Joey]] + +Also, I fear the lack of any useful `$from` parameter might break some +l10n'd link niceness when using `po_link_to = current` but I have not +investigated this yet. +--[[intrigeri]] + +> If `urlto` is called without a second parameter, it means we need +> a URL valid from either the CGI URL or any page in the wiki, +> (so we'd previously have set the third parameter true), but we +> don't *necessarily* need an absolute URL - so return what you'd +> have returned if asked for an absolute URL, but looking like +> `/bugs/` rather than `http://ikiwiki.info/bugs/` if possible. +> +> It looks as though `beautify_urlpath` under `po_link_to = current`, +> and 3-argument `urlto`, aren't tested by `t/po.t` - perhaps you +> could add some test cases there? To test 3-argument `urlto` you'd +> need to add `$config{baseurl} = "http://example.com"` or +> something. --[[smcv]] + +>> I'm leaving this bug report open until this can be checked. --[[Joey]] + +>>> My `ready/urlto` branch improves the test coverage. The bugfix from +>>> that branch fixes most of `po` too, but leaves behind some perhaps +>>> less-than-ideal behaviour: links where the current language is unknown, +>>> with `po_link_to = current`, always go to the master language, +>>> whereas perhaps it'd be better to go to the negotiated language in +>>> this case? --[[smcv]] + +>>>> Thanks for taking care, thanks for these improvements! 
+>>>> +>>>> OTOH I consider any of these behaviours (either the brand new one +>>>> = link to master language, or the alternative one = link to +>>>> negotiated) as a regression. Any of these is contrary to what +>>>> `po_link_to = current` is supposed to do according to the +>>>> documentation. +>>>> +>>>> Let's be less technical, let me display my practical usecase +>>>> (making this possible was one of the main reasons I initially +>>>> implemented `po_link_to = current`). +>>>> +>>>> Summary: the current state of things is an annoying regression +>>>> and it needs to be fixed. +>>>> +>>>> Context: I participate in building a Live system based on Debian +>>>> Live; the project's multilingual website +>>>> ([T(A)ILS](https://amnesia.boum.org/) is built using ikiwiki. A +>>>> static / offline copy is shipped on ISO images; this is the way +>>>> end-user documentation lands on the CDs. Note that no webserver +>>>> runs on the Live system to serve this wiki, so `po_link_to = +>>>> current` is compulsory. A user can choose her preferred language +>>>> at boot time. Depending on her decision, The desktop shortcut +>>>> that points to the embedded documentation (i.e. static wiki) +>>>> links to a different entry point depending on the chosen +>>>> language. +>>>> +>>>> The previous (documented) behaviour was deadly simple: if I am +>>>> presented a page in English (master language) it means it does +>>>> not exist in my preferred language; the computer always displays +>>>> me the best available version according to my needs. The new +>>>> behaviour brings a troubling seemingly random factor into the +>>>> user navigation experience and IMHO is a mess from a web +>>>> ergonomics point of view (no content negotiation available, +>>>> remember): I sometimes am shown an English page although it is +>>>> fully translated in my language one click away, and on the +>>>> contrary I sometimes I am shown the optimal page. This, is, well, +>>>> interesting. This practically forces the non-English speaking +>>>> website visitor to check the otherlanguages list on every single +>>>> page to make sure *herself* there is nothing better available, +>>>> and sometimes click on her preferred language link to get a page +>>>> she actually can read. +>>>> +>>>> I unfortunately might not be able to dedicate the needed time to +>>>> help fix this in a timely manner, so I don't want to urge anyone. +>>>> Take care! --[[intrigeri]] + +>>>>> I can see why this is bad, but to the best of my knowledge it's +>>>>> not a regression: each of the calls to 1-argument `urlto` was +>>>>> previously a call to 3-argument `urlto`, which always produces +>>>>> a fully absolute URL, so in either case there isn't enough +>>>>> context to know the current language. Links that were previously +>>>>> 2-argument `urlto` still have a defined second argument; +>>>>> I've just edited `plugins/write` to clarify why the second +>>>>> argument should be provided whenever possible. --[[smcv]] + +>>>>>> Ok. I am sorry for the burden that arose from my +>>>>>> misunderstanding. 
No need to keep this bug open then => +>>>>>> [[done]] --[[intrigeri]] diff --git a/doc/bugs/urlto__40____34____34____44___...__44___1__41___not_absolute.mdwn b/doc/bugs/urlto__40____34____34____44___...__44___1__41___not_absolute.mdwn new file mode 100644 index 000000000..8a93848b3 --- /dev/null +++ b/doc/bugs/urlto__40____34____34____44___...__44___1__41___not_absolute.mdwn @@ -0,0 +1,9 @@ +[[!template id=gitbranch branch=smcv/ready/urlto author="[[Simon_McVittie|smcv]]"]] +[[!tag patch]] + +urlto() has a special-case for a zero-length first argument, but it +produces a relative path even if the third argument is given and true. + +My `ready/urlto` branch simplifies this special case so it works. --[[smcv]] + +[[merged|done]] --[[Joey]] diff --git a/doc/bugs/utf-8_bug_in_websetup.pm.mdwn b/doc/bugs/utf-8_bug_in_websetup.pm.mdwn new file mode 100644 index 000000000..debedb01c --- /dev/null +++ b/doc/bugs/utf-8_bug_in_websetup.pm.mdwn @@ -0,0 +1,22 @@ +[[!tag patch bugs]] + +I type chinese characters into the fields. After press "save setup" button the characters turn into gibberish. + +I submit a patch that solve the problem for me. --Lingo + +> Fully fixing it is slightly more complex, but now [[done]] --[[Joey]] + +---- + + --- websetup.pm 2009-12-02 05:07:46.000000000 +0800 + +++ /usr/share/perl5/IkiWiki/Plugin/websetup.pm 2010-01-08 22:05:16.000000000 +0800 + @@ -308,7 +308,8 @@ + $fields{$_}=$shown{$_} foreach keys %shown; + } + } + - + + + + IkiWiki::decode_form_utf8($form); + if ($form->submitted eq "Cancel") { + IkiWiki::redirect($cgi, $config{url}); + return; diff --git a/doc/bugs/web_reversion_on_ikiwiki.info.mdwn b/doc/bugs/web_reversion_on_ikiwiki.info.mdwn new file mode 100644 index 000000000..6f18cfcba --- /dev/null +++ b/doc/bugs/web_reversion_on_ikiwiki.info.mdwn @@ -0,0 +1,14 @@ +I created [[sandbox/revert me]] and then tried the revert button on +[[recentchanges]], but I was not allowed to revert it. The specific error +was + + Error: you are not allowed to change sandbox/revert_me.mdwn + +I've just tried reading through the revert code, and I haven't figured out +what permission I am lacking. Perhaps the error message could be a little +clearer on that. The error might have been thrown by git_parse_changes in +git.pm or check_canchange in IkiWiki.pm, via IkiWiki::Receive. -- Jon + +[[fixed|done]] --[[Joey]] + +: Brilliant, many thanks. -- [[Jon]] diff --git a/doc/bugs/wrapper_can__39__t_find_the_perl_modules.mdwn b/doc/bugs/wrapper_can__39__t_find_the_perl_modules.mdwn new file mode 100644 index 000000000..9804d86c5 --- /dev/null +++ b/doc/bugs/wrapper_can__39__t_find_the_perl_modules.mdwn @@ -0,0 +1,16 @@ +If i intsall perl modules in my custom directory, cgi wrapper can't find them. I found clearing enviroment variables in code of wrapper. But information about custom directories put to perl with PERL5LIB variable. + +Workaround: add newenviron variable PERL5LIB + +My additional question - what wrapper do? I'am russian hosting provider. I am interesting with ikiwiki. + +> The wrapper allows ikiwiki to run as the user who owns the wiki, which +> is generally not the same as the user that runs the web server. +> (It also handles some other things, like some locking.) +> +> As a suid program, the wrapper cannot safely let environment variables +> pass through. +> +> If you want to install ikiwiki's perl modules in a nonstandard location, +> you can set `INSTALL_BASE` when running `Makefile.PL`. ikiwiki will then +> be built to look in that location. 
--[[Joey]] [[!tag done]] diff --git a/doc/bugs/yaml_setup_file_does_not_support_UTF-8_if_XS_is_installed.mdwn b/doc/bugs/yaml_setup_file_does_not_support_UTF-8_if_XS_is_installed.mdwn new file mode 100644 index 000000000..349464844 --- /dev/null +++ b/doc/bugs/yaml_setup_file_does_not_support_UTF-8_if_XS_is_installed.mdwn @@ -0,0 +1,65 @@ +I converted an ikiwiki setup file to YAML as +[[documented|tips/yaml_setup_files]]. + +On my Debian Squeeze system, attempting to build the wiki using the +YAML setup file triggers the following error message: + + YAML::XS::Load Error: The problem: + + Invalid trailing UTF-8 octet + + was found at document: 0 + usage: ikiwiki [options] source dest + ikiwiki --setup configfile + +Indeed, my setup file contains UTF-8 characters. + +Deinstalling YAML::XS ([[!debpkg libyaml-libyaml-perl]]) resolves this +issue. According to YAML::Any's POD, YAML::Syck is used instead of +YAML::XS in this case since it's the best YAML implementaion available +on my system. + +No encoding-related setting is mentionned in YAML::XS' POD. We may +consider there is a bug in there. I'll see if it's known / fixed +somewhere as soon as I get online. + +Joey, as a (hopefully) temporary workaround, what do you think of +explicitely using YAML::Syck (or whatever other YAML implementation +that does not expose this bug) rather than letting YAML::Any pick its +preferred one? + +--[[intrigeri]] + +> Upgrading YAML::XS ([[!debpkg libyaml-libyaml-perl]]) to current sid +> version (0.34-1) fixes this bug for me. --[[intrigeri]] + +>> libyaml-syck-perl's description mentions that the module is now +>> deprecated. (I had to do some ugly workaround to make unicode work with +>> Syck earlier.) So it appears the new YAML::Xs is the +>> way to go longterm, and presumably YAML::Any will start depending on it +>> in due course? --[[Joey]] + +>>> Right. Since this bug is fixed in current testing/sid, only +>>> Squeeze needs to be taken care of. As far as Debian Squeeze is +>>> concerned, I see two ways out of the current buggy situation: +>>> +>>> 1. Add `Conflicts: libyaml-libyaml-perl (< 0.34-1~)` to the +>>> ikiwiki packages uploaded to stable and squeeze-backports. +>>> Additionally uploading the newer, fixed `libyaml-libyaml-perl` +>>> to squeeze-backports would make the resulting situation a bit +>>> easier to deal with from the Debian stable user point of view. +>>> 2. Patch the ikiwiki packages uploaded to stable and +>>> squeeze-backports: +>>> - either to workaround the bug by explicitly using YAML::Syck +>>> (yeah, it's deprecated, but it's Debian stable) +>>> - or to make the bug easier to workaround by the user, e.g. by +>>> warning her of possible problems in case YAML::Any has chosen +>>> YAML::XS as its preferred implementation (the +>>> `YAML::Any->implementation` module method can come in handy +>>> in this case). +>>> +>>> I tend to prefer the first aforementioned solution, but any of +>>> these will anyway be kinda ugly, so... + +>>>> I was wrong: I just experienced that bug with YAML::XS 0.34-1 +>>>> too. Seems like [[!cpanrt 54683]]. --[[intrigeri]] diff --git a/doc/competition.mdwn b/doc/competition.mdwn new file mode 100644 index 000000000..2c782ea92 --- /dev/null +++ b/doc/competition.mdwn @@ -0,0 +1,19 @@ +When I started ikiwiki in 2006, there were no other existing systems that +filled quite the niche of generating a static html wiki out of markdown +files stored in a [[VCS|rcs]]. 
My +[first blog about ikiwiki](http://kitenet.net/~joey/blog/entry/seeking_wiki/) +looked at some projects that were semi-close, and found them wanting. + +My hope was that besides being useful to all its [[users|ikiwikiusers]], +ikiwiki would help spread its underlying concepts. Let a thousand flowers +bloom! These are some that have sprung up since. --[[Joey]] + +* [Gitit](http://gitit.johnmacfarlane.net/) is a wiki backed by a git (or + darcs) filestore. No static rendering here; pages are generated on the fly. + It's written in Haskell and uses the amazing PanDoc to generate html + from markdown or many other formats. + +* [Markdoc](http://blog.zacharyvoase.com/post/246800035) statically builds + a wiki from markdown source (which can be in a VCS, if you check it in). + It includes a built-in webserver to ease serving the generated static + html. diff --git a/doc/css.mdwn b/doc/css.mdwn index 5b6b9e1af..bc070cb99 100644 --- a/doc/css.mdwn +++ b/doc/css.mdwn @@ -3,11 +3,22 @@ ## Using CSS with ikiwiki Ikiwiki comes with two CSS stylesheets: [[style.css]] and [[local.css]]. -The idea is to customize the second one overriding the first one and +The idea is to customize the second one, overriding the first one and defining brand new rendering rules. While ikiwiki's default use of stylesheets is intentionally quite plain and minimalistic, CSS allows creating any kind of look you can dream up. +The [[theme_plugin|plugins/theme]] provides some prepackaged [[themes]] in an +easy to use way. + The [[css_market]] page is an attempt to collect user contributed local.css files. + +## Per-page CSS + +The [[plugins/meta]] plugin can be used to add additional style sheets to a +page. + +The [[plugins/localstyle]] plugin can be used to override the toplevel +[[local.css]] for a whole section of the wiki. diff --git a/doc/css_market.mdwn b/doc/css_market.mdwn index c0e349552..3f5627028 100644 --- a/doc/css_market.mdwn +++ b/doc/css_market.mdwn @@ -4,6 +4,9 @@ User contributed stylesheet files for ikiwiki. Unless otherwise noted, these style sheets can be installed by copying them into your wiki's source dir with a filename of `local.css`. +Some of stylesheets have developed into fullfledged [[themes]] that are +included in ikiwiki for easy use. + Feel free to add your own stylesheets here. (Upload as wiki pages; wiki gnomes will convert them to css files..) @@ -15,7 +18,6 @@ gnomes will convert them to css files..) * **[[css_market/kirkambar.css]]**, contributed by [[Roktas]]. This far from perfect stylesheet follows a [Gitweb](http://www.kernel.org/git/?p=git/git.git;a=tree;f=gitweb) like theme, so it may provide a consistent look'n feel along with the [[rcs/git]] backend. ;-) - You can see it in action on [kirkambar](http://kirkambar.net/) (Turkish content). [[!meta stylesheet="kirkambar"]] * **[[css_market/embeddedmoose.css]]**, contributed by [[JoshTriplett]]. @@ -46,15 +48,12 @@ gnomes will convert them to css files..) * **[contraste.css][4]**, contributed by [[Blanko]]. Can be seen on [Contraste Demo][5]. Local.css and templates available [here][6]. -* **[[css_market/actiontabs.css]]**, contributed by [[svend]]. This style sheet displays - the action list (Edit, RecentChanges, etc.) as tabs. - [[!meta stylesheet="actiontabs"]] - -If your web browser allows selecting between multiple stylesheets, this -page can be viewed using many of the stylesheets above. For example, if -using Epiphany with the Select Stylesheet extension enabled, use View -> -Style. 
In Firefox or Iceweasel, use View -> Page Style. - +* **[wiki.css](http://cyborginstitute.net/includes/wiki.css)** by [[tychoish]]. + I typically throw this in as `local.css` in new wikis as a slightly more clear and readable + layout for wikis that need to be functional and elegant, but not necessarily uniquely designed. + Currently in use by the [the outeralliance wiki](http://oa.criticalfutures.com/). +* **[ikiwiked gray-green](https://github.com/AntPortal/ikiwiked/raw/master/theme/gray-green/local.css)**, contributed by [Danny Castonguay](https://antportal.com/). +* **[ikiwiked gray-orange](https://github.com/AntPortal/ikiwiked/raw/master/theme/gray-orange/local.css)**, contributed by [Danny Castonguay](https://antportal.com/). Can be seen in action at [antportal.com/wiki](https://antportal.com/wiki/). Feel free to modify and contribute on [Github](https://github.com/AntPortal/ikiwiked) <!-- Page links --> [1]: http://blankoworld.homelinux.com/demo/ikiwiki/blankoblues/src/local.css (Download Blankoblues CSS) diff --git a/doc/css_market/actiontabs.css b/doc/css_market/actiontabs.css deleted file mode 100644 index a1dc47e92..000000000 --- a/doc/css_market/actiontabs.css +++ /dev/null @@ -1,122 +0,0 @@ -/* ikiwiki local style sheet */ - -/* Add local styling here, instead of modifying style.css. */ - -a { - text-decoration: none; - color: #005a9c; -} - -a:hover { - text-decoration: underline; -} - - -hr { - border-style: none; - background-color: #999; - height: 1px; -} - -code, pre { - background: #eee; -} - -pre { - padding: .5em; -} - -body { - margin: 0; - padding: 0; - font-family: sans-serif; - color: black; - background: white; -} - -.pageheader { - margin: 0; - padding: 1em 2em 0 2em; - background: #eee; - border-color: #999; - border-style: none none solid none; - border-width: 1px; -} - -.header { - font-size: 100%; - font-weight: normal; -} - -.title { - display: block; - margin-top: .2em; - font: 140% sans-serif; - text-transform: capitalize; -} - -.actions { - text-align: right; - padding: 0; -} - -#content, #comments, #footer { - margin: 1em 2em; -} - -#pageinfo { - border-color: #999; -} - -.inlinepage { - margin: .4em 0; - padding: .4em 0; - border-style: none; - border-top: 1px solid #aaa; -} - -.inlineheader { - font-size: 120%; - font-weight: normal; -} - -h1 { font: 120% sans-serif } -h2 { font: bold 100% sans-serif } -h3 { font: italic 100% sans-serif } -h4, h5, h6 { font: small-caps 100% sans-serif } - -/* Smaller headings for inline pages */ -.inlinepage h1 { font-size: 110% } -.inlinepage h2 { font-size: 100% } -.inlinepage h3 { font-size: 100% } - -.pageheader .actions ul { - border-style: none -} - -.actions ul { - font-size: 75%; - padding: 0; - border-style: none; -} - -.actions ul li a { - text-decoration: none; -} - -.actions ul li { - margin: 0; - padding: .1em .5em 0 .5em; - background: white; - border-color: #999; - border-style: solid solid none solid; - border-width: 1px; -} - -div.recentchanges { - border-style: none; -} - -.pagecloud { - width: auto; -} diff --git a/doc/css_market/kirkambar.css b/doc/css_market/kirkambar.css index 76d9ba771..e756a1260 100644 --- a/doc/css_market/kirkambar.css +++ b/doc/css_market/kirkambar.css @@ -40,7 +40,7 @@ pre, tt, code { monospace; } -pre, tt, code, tr.changeinfo, #blogform { +pre, tt, code, tr.changeinfo, .blogform { color: inherit; background-color: #f6f6f0; } diff --git a/doc/download.mdwn b/doc/download.mdwn index 8f5004ca9..f1ae5ad31 100644 --- a/doc/download.mdwn +++ b/doc/download.mdwn @@ -1,16 +1,19 @@ 
-Here's how to get ikiwiki. See [[setup]] for how to use it, and be sure to -add your wiki to [[IkiwikiUsers]] if you use ikiwiki. +Here's how to get ikiwiki in source or prepackaged form. See [[setup]] for +how to use it, and be sure to add your wiki to [[IkiwikiUsers]] if you use +ikiwiki. -## tarball +## source + +Ikiwiki is developed in a [[git_repository|git]]. The best place to download a tarball of the latest release is from <http://packages.debian.org/unstable/source/ikiwiki>. -Installation steps and requirements are listed on the [[install]] page. +Manual installation steps and requirements are listed on the [[install]] page. -## packages +## Debian / Ubuntu packages -To install with apt, if using Debian or Ubuntu: +To install with [apt](http://www.debian.org/doc/manuals/debian-reference/ch02.en.html#_basic_package_management_operations), if using Debian or Ubuntu: apt-get install ikiwiki @@ -19,26 +22,31 @@ Or download the deb from <http://packages.debian.org/unstable/web/ikiwiki>. There is a backport of a recent version of ikiwiki for Debian 5.0 at <http://packages.debian.org/lenny-backports/ikiwiki>. -Fedora versions 8 and newer have RPMs of ikiwiki available. - There is also an unofficial backport of ikiwiki for Ubuntu Jaunty, provided by [[Paweł_Tęcza|users/ptecza]], at [http://gpa.net.icm.edu.pl/ubuntu/](http://gpa.net.icm.edu.pl/ubuntu/index-en.html). +## RPM packages + +Fedora versions 8 and newer have RPMs of ikiwiki available. + +Ikiwiki's source includes a RPM spec file, which you can use to build your +own RPM. + +## BSD ports + +Ikiwiki can be installed [from macports](http://www.macports.org/ports.php?by=name&substr=ikiwiki) +by running `sudo port install ikiwiki`. + NetBSD and many other platforms: pkgsrc has an [ikiwiki package](ftp://ftp.netbsd.org/pub/pkgsrc/current/pkgsrc/www/ikiwiki/README.html). FreeBSD has ikiwiki in its [ports collection](http://www.freshports.org/www/ikiwiki/). +## Other packages + Gentoo has an [ebuild](http://bugs.gentoo.org/show_bug.cgi?id=144453) in its bug database. The [openSUSE Build Service](http://software.opensuse.org/search?baseproject=ALL&p=1&q=ikiwiki) has packages for openSUSE -IkiWiki can be installed [from macports](http://www.macports.org/ports.php?by=name&substr=ikiwiki) -by running `sudo port install ikiwiki`. - A [PKGBUILD for Arch Linux](http://aur.archlinux.org/packages.php?ID=12284) is in the AUR. - -## revision control - -Ikiwiki is developed in a [[git_repository|git]]. diff --git a/doc/examples/blog.mdwn b/doc/examples/blog.mdwn index f542cad0c..5f8f6c3ce 100644 --- a/doc/examples/blog.mdwn +++ b/doc/examples/blog.mdwn @@ -5,24 +5,21 @@ Or, run this command to set up a blog with ikiwiki. % ikiwiki -setup /etc/ikiwiki/auto-blog.setup -Some additional configuration you might want to do: +Some additional configuration you might want to do, if not using +`auto-blog.setup`: * Make sure to configure ikiwiki to generate RSS or Atom feeds. -* Make sure you have the tag plugin enabled, and tag posts using it. An - example of how to tag a post is: - \[[!tag tags/life]] - -* Enable the [[sidebar|plugins/sidebar]] plugin to get a sidebar listing all - the categories you've tagged posts with. +* Make sure you have the [[tag|plugins/tag]] plugin enabled, and the + `tagbase` set to "tags". Tag pages will then automatically be created. + An example of how to tag a post is: + \[[!tag life]] * Enable the [[pagestats|plugins/pagestats]] plugin to get a tag cloud to display on the [[index]]. 
-* Enable the [[comments|plugins/comments]] plugin and configure it to - enable comments to posts to the blog: - - comments_pagespec => 'blog/posts/* and !*/Discussion', +* Enable the [[comments|plugins/comments]] plugin to + enable comments to posts to the blog. * Enable the [[calendar|plugins/calendar]] plugin and run the [[ikiwiki-calendar]] command from cron daily to get an interlinked diff --git a/doc/examples/blog/archives.mdwn b/doc/examples/blog/archives.mdwn new file mode 100644 index 000000000..d07b73b74 --- /dev/null +++ b/doc/examples/blog/archives.mdwn @@ -0,0 +1,8 @@ +[[!if test="archives/*" then=""" +Browse through blog archives by year: +[[!map pages="./archives/* and !./archives/*/* and !*/Discussion"]] +""" +else=""" +You need to use the `ikiwiki-calendar` program to generate calendar-based +archive pages. +"""]] diff --git a/doc/examples/blog/comments.mdwn b/doc/examples/blog/comments.mdwn index 4735dea08..e22b50a34 100644 --- a/doc/examples/blog/comments.mdwn +++ b/doc/examples/blog/comments.mdwn @@ -1,3 +1,10 @@ -This page will show all comments made to posts in my [[blog|index]]. +[[!sidebar content=""" +[[!inline pages="comment_pending(./posts/*)" feedfile=pendingmoderation +description="comments pending moderation" show=-1]] +Comments in the [[!commentmoderation desc="moderation queue"]]: +[[!pagecount pages="comment_pending(./posts/*)"]] +"""]] -[[!inline pages="./posts/*/Discussion or internal(./posts/*/comment_*)"]] +Recent comments on posts in the [[blog|index]]: +[[!inline pages="./posts/*/Discussion or comment(./posts/*)" +template="comment"]] diff --git a/doc/examples/blog/index.mdwn b/doc/examples/blog/index.mdwn index 01b714fcd..7914cd203 100644 --- a/doc/examples/blog/index.mdwn +++ b/doc/examples/blog/index.mdwn @@ -1,13 +1,11 @@ -[[!pagestats pages="./tags/*" among="./posts/*"]] +[[!if test="enabled(sidebar)" then=""" +[[!sidebar]] +""" else=""" +[[!inline pages=sidebar raw=yes]] +"""]] -Welcome to my blog. - -Have a look at the most recent posts below, or browse the tag cloud on the -right. Archives of all [[posts]] and all [[comments]] are also available. - -[[!inline pages="./posts/* and !*/Discussion" show="10" +[[!inline pages="page(./posts/*) and !*/Discussion" show="10" actions=yes rootpage="posts"]] ----- This blog is powered by [ikiwiki](http://ikiwiki.info). diff --git a/doc/examples/blog/posts.mdwn b/doc/examples/blog/posts.mdwn index 4b2939120..08e014838 100644 --- a/doc/examples/blog/posts.mdwn +++ b/doc/examples/blog/posts.mdwn @@ -1,3 +1,3 @@ -Here is a full list of posts to my [[blog|index]]. +Here is a full list of posts to the [[blog|index]]. -[[!inline pages="./posts/* and !*/Discussion" archive=yes feedshow=10 quick=yes]] +[[!inline pages="page(./posts/*) and !*/Discussion" archive=yes feedshow=10 quick=yes]] diff --git a/doc/examples/blog/posts/first_post.mdwn b/doc/examples/blog/posts/first_post.mdwn index d49432341..343497d18 100644 --- a/doc/examples/blog/posts/first_post.mdwn +++ b/doc/examples/blog/posts/first_post.mdwn @@ -1,4 +1,2 @@ This is the first post to this example blog. To add new posts, just add files to the posts/ subdirectory, or use the web form. 
- -[[!tag tags/tech]] diff --git a/doc/examples/blog/sidebar.mdwn b/doc/examples/blog/sidebar.mdwn index a9fac388e..e0895f63f 100644 --- a/doc/examples/blog/sidebar.mdwn +++ b/doc/examples/blog/sidebar.mdwn @@ -1,7 +1,10 @@ -Example sidebar +[[!if test="enabled(calendar)" then=""" +[[!calendar pages="page(./posts/*) and !*/Discussion"]] +"""]] -* [[Blog|index]] -* [[Archive|posts]] +[[Recent Comments|comments]] -Categories: -[[!map pages="./tags/* and !*/Discussion"]] +[[Archives]] + +[[Tags]]: +[[!pagestats style="list" pages="./tags/*" among="./posts/*"]] diff --git a/doc/examples/blog/tags/life.mdwn b/doc/examples/blog/tags/life.mdwn deleted file mode 100644 index 719f2b192..000000000 --- a/doc/examples/blog/tags/life.mdwn +++ /dev/null @@ -1,4 +0,0 @@ -This feed contains pages in the "life" category. - -[[!inline pages="link(tags/life) and !*/Discussion" -show="10" actions=yes]] diff --git a/doc/examples/blog/tags/tech.mdwn b/doc/examples/blog/tags/tech.mdwn deleted file mode 100644 index e811cac34..000000000 --- a/doc/examples/blog/tags/tech.mdwn +++ /dev/null @@ -1,3 +0,0 @@ -This feed contains pages in the "tech" category. - -[[!inline pages="link(tags/tech) and !*/Discussion" show=10 actions=yes]] diff --git a/doc/examples/softwaresite.mdwn b/doc/examples/softwaresite.mdwn index e43a9d116..99f791177 100644 --- a/doc/examples/softwaresite.mdwn +++ b/doc/examples/softwaresite.mdwn @@ -14,3 +14,6 @@ Some additional configuration you might want to do: * Read the [[tips/integrated_issue_tracking_with_ikiwiki]] article for tips about how to use ikiwiki as a BTS. + +* Read [[tips/spam_and_softwaresites]] for information on how to keep spam + and spam-fighting commits out of your main version control history. diff --git a/doc/examples/softwaresite/news.mdwn b/doc/examples/softwaresite/news.mdwn index 9b53c7d99..20efba6e0 100644 --- a/doc/examples/softwaresite/news.mdwn +++ b/doc/examples/softwaresite/news.mdwn @@ -1,4 +1,4 @@ -This is where annoucements of new releases, features, and other news is +This is where announcements of new releases, features, and other news is posted. FooBar users are recommended to subscribe to this page's RSS feed. diff --git a/doc/features.mdwn b/doc/features.mdwn index 3925d78ef..0dbdba5df 100644 --- a/doc/features.mdwn +++ b/doc/features.mdwn @@ -13,7 +13,7 @@ Instead of editing pages in a stupid web form, you can use vim and commit changes via [[Subversion|rcs/svn]], [[rcs/git]], or any of a number of other [[Revision_Control_Systems|rcs]]. -ikiwiki can be run from a [[post-commit]] hook to update your wiki +Ikiwiki can be run from a [[post-commit]] hook to update your wiki immediately whenever you commit a change using the RCS. It's even possible to securely let @@ -25,7 +25,7 @@ run a simple wiki without page history, it can do that too. ## A wiki compiler -ikiwiki is a wiki compiler; it builds a static website for your wiki, and +Ikiwiki is a wiki compiler; it builds a static website for your wiki, and updates it as pages are edited. It is fast and smart about updating a wiki, it only builds pages that have changed (and tracks things like creation of new pages and links that can indirectly cause a page to need a rebuild) @@ -45,7 +45,7 @@ easily be added by [[plugins]]. For example it also supports traditional [[plugins/HTML]], or pages written in [[reStructuredText|plugins/rst]] or [[Textile|plugins/textile]]. 
-ikiwiki also supports files of any other type, including plain text, +Ikiwiki also supports files of any other type, including plain text, images, etc. These are not converted to wiki pages, they are just copied unchanged by ikiwiki as it builds your wiki. So you can check in an image, program, or other special file and link to it from your wiki pages. @@ -70,11 +70,15 @@ you would care to syndicate. ## Valid html and [[css]] -ikiwiki aims to produce -[valid XHTML 1.0](http://validator.w3.org/check?url=referer). ikiwiki -generates html using [[templates|wikitemplates]], and uses [[css]], so you +Ikiwiki aims to produce +[valid XHTML 1.0](http://validator.w3.org/check?url=referer). +(Experimental [[tips/HTML5]] support is also available.) + +Ikiwiki generates html using [[templates]], and uses [[css]], so you can change the look and layout of all pages in any way you would like. +Ikiwiki ships with several ready to use [[themes]]. + ## [[Plugins]] Plugins can be used to add additional features to ikiwiki. The interface is @@ -163,7 +167,7 @@ Well, sorta. Rather than implementing YA history browser, it can link to ### Full text search -ikiwiki can use the xapian search engine to add powerful +Ikiwiki can use the xapian search engine to add powerful full text [[plugins/search]] capabilities to your wiki. ### Translation via po files diff --git a/doc/forum.mdwn b/doc/forum.mdwn index 729540774..19ca9ed0b 100644 --- a/doc/forum.mdwn +++ b/doc/forum.mdwn @@ -1,6 +1,8 @@ -This is a place for questions and discussions that don't have a Discussion page fitting enough. Users of ikiwiki can ask questions here. +This is a place for questions and discussions that don't have a Discussion +page fitting enough. Users of ikiwiki can ask questions here. -_This is a bold experiment by me, since I have exactly such a question. This overrides the default content/discussion dichotomy, feel free to refactor and discuss! --ulrik_ +Note that for more formal bug reports or todo items, you can also edit the +[[bugs]] and [[todo]] pages. ## Current topics ## diff --git a/doc/forum/404_-_not_found.mdwn b/doc/forum/404_-_not_found.mdwn new file mode 100644 index 000000000..dc3318901 --- /dev/null +++ b/doc/forum/404_-_not_found.mdwn @@ -0,0 +1,22 @@ +Hi, + +I've followed the tutorial to install ikiwiki. Once installed (on a Ubuntu +10.04 distro running under VirtualBox on a Windows XP, SP3 host), I can +access the **http://ubuntu1004/index.lighttpd.html** page without any +issues. + +But when I try to access the page **http://ubuntu1004/~geertvc/gwiki** (as +is mentioned at the end of the ikiwiki setup), I get the error message +"**404 - not found**". + +I've also followed the "dot-cgi" trick, but with the same negative result. +The web server I've installed, is lighttpd. + +What did I miss? + +Best rgds, + +--Geert + +> Perhaps your webserver is not exporting your `public_html` directory +> in `~geertvc`? Check its configuration. 
--[[Joey]] diff --git a/doc/forum/404_-_not_found/comment_1_3dea2600474f77fb986767da4d507d62._comment b/doc/forum/404_-_not_found/comment_1_3dea2600474f77fb986767da4d507d62._comment new file mode 100644 index 000000000..453419cf3 --- /dev/null +++ b/doc/forum/404_-_not_found/comment_1_3dea2600474f77fb986767da4d507d62._comment @@ -0,0 +1,8 @@ +[[!comment format=mdwn + username="http://jmtd.livejournal.com/" + ip="188.222.50.68" + subject="comment 1" + date="2010-09-09T21:41:07Z" + content=""" +You probably need to run \"lighttpd-enable-mod userdir\" +"""]] diff --git a/doc/forum/404_-_not_found/comment_2_948e4678be6f82d9b541132405351a2c._comment b/doc/forum/404_-_not_found/comment_2_948e4678be6f82d9b541132405351a2c._comment new file mode 100644 index 000000000..c3fb72db5 --- /dev/null +++ b/doc/forum/404_-_not_found/comment_2_948e4678be6f82d9b541132405351a2c._comment @@ -0,0 +1,31 @@ +[[!comment format=mdwn + username="https://www.google.com/accounts/o8/id?id=AItOawllEHb4oGNaUrl7vyziQGrxAlQFri_BfaE" + nickname="Geert" + subject="comment 2" + date="2010-09-12T06:45:27Z" + content=""" +After a re-installation of ikiwiki (first removed all old files), I get the following feedback: + + Successfully set up gwiki: + url: http://ubuntu1004/~geertvc/gwiki + srcdir: ~/gwiki + destdir: ~/public_html/gwiki + repository: ~/gwiki.git + To modify settings, edit ~/gwiki.setup and then run: + ikiwiki -setup ~/gwiki.setup + + +In the lighttpd config file (/etc/lighttpd/lighttpd.conf), I've now changed the item \"server.document-root\" from the default \"/var/www\" to (in my case) \"/home/geertvc/public_html/gwiki\". I've taken the destdir location (see above) as document root for lighttpd. + +When doing this, I can see the \"index.html\" page of ikiwiki (by typing the following URL in the address box of the browser: \"ubuntu1004/index.html\"). So, that seems to be the right modification, right? Or isn't it? + +Note: when I take the directory \"/home/geertvc/gwiki\" (= the URL given above), then things do not work. I can't see the content of \"index.html\", I get the error message I mentioned in my initial post (404 - not found). + +But when clicking, for instance, the \"Edit\" button, the link brings me to the following location: + + http://ubuntu1004/~geertvc/gwiki/ikiwiki.cgi?page=index&do=edit + +However, there's not at all a file called \"ikiwiki.cgi\" at that location. The location of the file \"ikiwiki.cgi\" is \"/home/geertvc/public_html/gwiki\", so why is the link \"Edit\" leading me to that (wrong?) location? + +Apparently, something is still wrong with my settings. Hope, with the above information, someone can put me on the right track... +"""]] diff --git a/doc/forum/404_-_not_found/comment_3_4c7b1fa88776815bbc6aa286606214c2._comment b/doc/forum/404_-_not_found/comment_3_4c7b1fa88776815bbc6aa286606214c2._comment new file mode 100644 index 000000000..9f606f04e --- /dev/null +++ b/doc/forum/404_-_not_found/comment_3_4c7b1fa88776815bbc6aa286606214c2._comment @@ -0,0 +1,8 @@ +[[!comment format=mdwn + username="http://jmtd.livejournal.com/" + ip="78.105.191.131" + subject="Follow instructions" + date="2010-09-12T12:26:49Z" + content=""" +Please re-read my comment. If you enable usersdirs then /~user corresponds to ~/public_html. The change you have made has / corresponding instead, which is why the links don't work. 
+"""]] diff --git a/doc/forum/Apache_XBitHack.mdwn b/doc/forum/Apache_XBitHack.mdwn new file mode 100644 index 000000000..d5da0825e --- /dev/null +++ b/doc/forum/Apache_XBitHack.mdwn @@ -0,0 +1,28 @@ +I'd like to be able to use the Apache XBitHack to enable Server Side Includes on my site. Yes, it is possible to enable SSI by setting the page extension to .shtml, and that is what I am doing at the moment. +However, the disadvantage of this approach is that the server does not give a LastModified header, which means that the content can't be cached. However, the way that I am using SSI is such that the main content of the page really is "last modified" when the page was last modified, so I'd like to be able to indicate that. And using the XBitHack - that is, setting the executable bit on the generated page - would enable me to do that. + +I gather from the [[security]] page that having the executable bit set on files is considered a security hole, but how big a hole would it be if I'm the only one editing the site? Is there a way, a somewhat safe way, of implementing XBitHack for IkiWiki? + +-- [[KathrynAndersen]] + +> The risk with execute bits on files in the generated site is that someone +> commits an executable, ikiwiki copies it as-is, and now the web browser +> can be used to run it. Obviously if you're the only committer, that is +> not much of a risk. Or you can lock down apache to not allow running +> arbitrary files. It's also pretty unlikely that a rendered mdwn file +> would result in a html page that can be run as an executable. So an +> option that makes all files rendered from mdwn or other markups +> get the x bit set would be pretty safe even with untrusted editors. --[[Joey]] + +>> So how about this: if something has a page-type (i.e. mdwn or whatever authorized page types there are) +>> then add something at the end of the process (would that be the "changes" hook?) +>> which sets the x bit on the generated page file. Would that work? + +>> Or is there a way to say "tell me all the generated files that end in .html" and use that as a list to start from? + +>> --[[KathrynAndersen]] + +>>> Yes, the `change` hook is passed the names of source files that got +>>> built. Use `pagetype` to check which got htmlized (and filter out ones +>>> that got copied), and then use `htmlpage` to get the name of the html +>>> file that was generated, and chmod it. --[[Joey]] diff --git a/doc/forum/Asciidoc_plugin.mdwn b/doc/forum/Asciidoc_plugin.mdwn new file mode 100644 index 000000000..57d6fd91e --- /dev/null +++ b/doc/forum/Asciidoc_plugin.mdwn @@ -0,0 +1,14 @@ +I have completely overhauled the Asciidoc plugin for ikiwiki that was created by [[Karl Mowson|http://www.mowson.org/karl/colophon/]]. The source can be downloaded from my [[Dropbox|http://dl.dropbox.com/u/11256359/asciidoc.pm]]. + +### Features + +* Uses a filter hook to escape WikiLinks and Directives using Asciidoc `+++` passthrough macros, to avoid them being processed by Asciidoc. This behavior is configurable in the wiki setup file. +* Adds a preprocessor directive 'asciidoc' which allows extra Asciidoc command-line options to be passed on a per-page basis. Each parameter name is the option name (the leading `--` will be inserted automatically), and the parameter value is the option value. Currently, only 'conf-file' and 'doctype' are allowed (or even useful). +* Sets the page title from the first line in the Asciidoc file using a meta directive. This behavior is configurable in the wiki setup file. 
+* Searches for an Asciidoc configuration file named the same as the wiki if none is specified in the setup file. +* Asciidoc configuration files are stored in the wiki. They should be named `._conf` to avoid publishing them. + +### Problems + +* Escaping Directives is not optimal. It prevents markup from being used in Directives, and the passthrough macros have to include extra spaces to avoid having directives that return an empty string collapse to `++++++`. In addition, I had to borrow the regexps from the Ikiwiki source code; it would be nice if this were available as part of the API. +* Handling of Asciidoc errors is suboptimal; they are simply inserted into the returned page. This could be fixed in Perl 5.12 by using the run_forked() in IPC::Cmd. diff --git a/doc/forum/Blog_posting_times_and_ikiwiki_state.mdwn b/doc/forum/Blog_posting_times_and_ikiwiki_state.mdwn new file mode 100644 index 000000000..0c1da5b97 --- /dev/null +++ b/doc/forum/Blog_posting_times_and_ikiwiki_state.mdwn @@ -0,0 +1,28 @@ +What I wanted +------------- + +I thought to myself it would be nice to see from the console the dates that my ikiwiki blog posts were published. Especially as I would like to know the order of my todo list without having to view the webpage. + +What I discovered +----------------- + +Looked at the code and saw the functions for grabbing the ctime from git but couldn't reconcile them to the "Posted" date in the RSS feed. Some more reading and I figured out that the Posted time is taken from the UNIX ctime when first uploaded into the repository and this information is stored in the page state via a Perl storable database - indexdb. (I'm sure most know this but to be clear in UNIX ctime is *not* the actual creation time of a file. UNIX has no facility for recording the actual creation time - however on first upload to the wiki it's good enough). + +Wrote a Perl script to query and sort indexdb. Now I can list my todos or blog posts in the order they appear on the web. Handy. + +However the ikiwiki state is specifically excluded via '.gitignore'. I work a lot on trains and not having this file in my cloned wiki means I can't list published posts or my todos in the proper order. I can get an approximation from git logs but, dam it, I want it the same! + +What can I do? +-------------- + +Is it a spectacularly bad idea to include the ikiwiki state file in my cloned repo (I suspect it is). What else could be done? Can I disable pagestate somehow or force ikiwiki to always use git commit times for Posted times? + +> Have you tried running ikiwiki with the `--gettime` option on your laptop, +> to force it to retrieve initial commit times from git? You should only +> need to do that once, and it'll be cached in the pagestate thereafter. +> +> Because that functionality is slow on every supported VCS except git, +> ikiwiki tries to avoid using it unless it's really needed. [[rcs]] +> lists it as "fast" for git, though, so depending how fast it really is +> and how large your wiki is, you might be able to get away with running +> ikiwiki with `--gettime` all the time? 
--[[smcv]] diff --git a/doc/forum/Blog_posting_times_and_ikiwiki_state/comment_1_87304dfa2caea7e526cdb04917524e8c._comment b/doc/forum/Blog_posting_times_and_ikiwiki_state/comment_1_87304dfa2caea7e526cdb04917524e8c._comment new file mode 100644 index 000000000..62bae02b0 --- /dev/null +++ b/doc/forum/Blog_posting_times_and_ikiwiki_state/comment_1_87304dfa2caea7e526cdb04917524e8c._comment @@ -0,0 +1,8 @@ +[[!comment format=mdwn + username="https://www.google.com/accounts/o8/id?id=AItOawmwYptyV5ptNt8LCbMYsmpcNkk9_DRt-EY" + nickname="Matt" + subject="comment 1" + date="2010-11-04T11:52:53Z" + content=""" +Perhaps I have a different setup from you but on my laptop I don't have ikiwiki installed - only a clone of the git repo. You mean to run --gettime on the post-update git hook? +"""]] diff --git a/doc/forum/Can_OpenID_users_be_adminusers__63__.mdwn b/doc/forum/Can_OpenID_users_be_adminusers__63__.mdwn index 7599e71e5..17c60c423 100644 --- a/doc/forum/Can_OpenID_users_be_adminusers__63__.mdwn +++ b/doc/forum/Can_OpenID_users_be_adminusers__63__.mdwn @@ -62,7 +62,7 @@ index 0bf100a..77b467a 100644 >>>> So you can see if the two usernames/openids match. If the end is "0", >>>> they don't match. If nothing is logged, you have not enabled the websetup plugin. ->>>> If the end if "1" you should see the "Wiki Setup" button, if not the +>>>> If the end if "1" you should see the "Setup" button, if not the >>>> problem is not in determining if you're an admin, but elsewhere.. >>>> --[[Joey]] diff --git a/doc/forum/Can__39__t_get_comments_plugin_working.mdwn b/doc/forum/Can__39__t_get_comments_plugin_working.mdwn new file mode 100644 index 000000000..f189d9b64 --- /dev/null +++ b/doc/forum/Can__39__t_get_comments_plugin_working.mdwn @@ -0,0 +1,16 @@ +I feel like I must be missing something. + +My blog is based on Ikiwiki, and uses /yyyy/mm/dd/title/ for blog posts. +Because I use the plugin that generates index pages for subdirectories, I +have to use /????/??/??/* to identify posts and avoid missing the index +pages for years, months and days. + +I've enabled the comments plugin, but no matter what I do, I can't seem to make the comment form appear on my posts. I've removed the entire site and have rebuilt. I've set the pagespec to /????/??/??/* and ./????/??/??/*, but neither seems to work. I don't see any output, or anything else to indicate that pages aren't working. + +Are there any other plugins that need to be enabled for this to work? I think I've locked things down such that anonymous users can't edit by enabling signinedit and setting a lock, but this may be blocking the ability to comment (though I don't recall seeing anything in the docs about needing additional plugins.) + +> Just use '????/??/??/*' , and it will work. +> [[Pagespecs|ikiwiki/pagespec]] are automatically matched absolute to the +> top of the site, and adding a leading "/" is not necessary and will +> make the PageSpec not match. (And the relative PageSpec with "./" is +> not right in this case either). --[[Joey]] diff --git a/doc/forum/Can__39__t_get_ikiwiki_working_again_after_reinstall.mdwn b/doc/forum/Can__39__t_get_ikiwiki_working_again_after_reinstall.mdwn new file mode 100644 index 000000000..08187e6f2 --- /dev/null +++ b/doc/forum/Can__39__t_get_ikiwiki_working_again_after_reinstall.mdwn @@ -0,0 +1,16 @@ +My server got hacked by an EXIM vulnerability, and so I reimaged the system. After installing ikiwiki I can't get it to accept my old setup file, and I'm not sure what to do. 
+ +I'm running debian stable with security updates. Running setup I get. +Can't use an undefined value as an ARRAY reference at /usr/share/perl5/IkiWiki/Setup/Standard.pm line 33. +That line in the source file has something todo with wrappers. Also since the reinstall there is no /etc/ikiwiki/auto.setup + +After futzing with it for over an hour I tried installing the debian backports version, and get a new different error. + +Can't exec "git": No such file or directory at /usr/share/perl5/IkiWiki/Plugin/git.pm line 169. +Cannot exec 'git pull origin': No such file or directory +'git pull origin' failed: at /usr/share/perl5/IkiWiki/Plugin/git.pm line 195. +Can't exec "git": No such file or directory at /usr/share/perl5/IkiWiki/Plugin/git.pm line 169. +Cannot exec 'git log --max-count=100 --pretty=raw --raw --abbrev=40 --always -c -r HEAD -- .': No such file or directory +'git log --max-count=100 --pretty=raw --raw --abbrev=40 --always -c -r HEAD -- .' failed: + +Any ideas how I can get ikiwiki working again? diff --git a/doc/forum/Can__39__t_get_ikiwiki_working_again_after_reinstall/comment_1_87a360155ff0502fe08274911cc6a53f._comment b/doc/forum/Can__39__t_get_ikiwiki_working_again_after_reinstall/comment_1_87a360155ff0502fe08274911cc6a53f._comment new file mode 100644 index 000000000..fa974765f --- /dev/null +++ b/doc/forum/Can__39__t_get_ikiwiki_working_again_after_reinstall/comment_1_87a360155ff0502fe08274911cc6a53f._comment @@ -0,0 +1,8 @@ +[[!comment format=mdwn + username="https://www.google.com/accounts/o8/id?id=AItOawkpwzlIQkUFJvJ8dF2-Y-sQklGpVB1fTzk" + nickname="Daniel" + subject="Fixed." + date="2011-01-19T10:18:16Z" + content=""" +Oops forgot to install git. Could have used a more helpful error message. +"""]] diff --git a/doc/forum/Can_custom_themes_live_somewhere_inside_srcdir__63__.mdwn b/doc/forum/Can_custom_themes_live_somewhere_inside_srcdir__63__.mdwn new file mode 100644 index 000000000..a07c31c00 --- /dev/null +++ b/doc/forum/Can_custom_themes_live_somewhere_inside_srcdir__63__.mdwn @@ -0,0 +1,8 @@ +Do custom [[themes]] have to live outside of the wiki (eg. `/usr/share/ikiwiki/themes/`) or is there a way for them to live inside of the wiki srcdir? + +I haven't been able to find a way so for now I'm just using a symlink, but that's a bit ugly. + +I ask because I do most of my ikiwiki work on my laptop and then push changes to my server. It's not a big deal but it's annoying to have to sync the themes separately and it seems like something which should be able to live inside the wiki like templates. + +Cheers, +[[AdamShand]] diff --git a/doc/forum/Can_custom_themes_live_somewhere_inside_srcdir__63__/comment_1_d1e79825dfb5213d2d1cba2ace1707b1._comment b/doc/forum/Can_custom_themes_live_somewhere_inside_srcdir__63__/comment_1_d1e79825dfb5213d2d1cba2ace1707b1._comment new file mode 100644 index 000000000..027127b41 --- /dev/null +++ b/doc/forum/Can_custom_themes_live_somewhere_inside_srcdir__63__/comment_1_d1e79825dfb5213d2d1cba2ace1707b1._comment @@ -0,0 +1,8 @@ +[[!comment format=mdwn + username="http://smcv.pseudorandom.co.uk/" + nickname="smcv" + subject="comment 1" + date="2011-01-29T18:17:40Z" + content=""" +The theme plugin is just a shortcut for adding an underlay with a style.css and maybe some images. If you want to base your design on a modified theme, copy the theme's style.css (or part of it) to the local.css in your wiki's repository; you can also copy in the images and disable the theme plugin entirely. 
+"""]] diff --git a/doc/forum/Can_custom_themes_live_somewhere_inside_srcdir__63__/comment_2_8177ede5a586b1a573a13fd26f8d3cc0._comment b/doc/forum/Can_custom_themes_live_somewhere_inside_srcdir__63__/comment_2_8177ede5a586b1a573a13fd26f8d3cc0._comment new file mode 100644 index 000000000..2b312731e --- /dev/null +++ b/doc/forum/Can_custom_themes_live_somewhere_inside_srcdir__63__/comment_2_8177ede5a586b1a573a13fd26f8d3cc0._comment @@ -0,0 +1,8 @@ +[[!comment format=mdwn + username="http://adam.shand.net/" + nickname="Adam" + subject="Doh." + date="2011-01-29T19:32:18Z" + content=""" +Ah that makes sense, thanks! +"""]] diff --git a/doc/forum/Cannot_write_to_commitlock.mdwn b/doc/forum/Cannot_write_to_commitlock.mdwn new file mode 100644 index 000000000..05490a799 --- /dev/null +++ b/doc/forum/Cannot_write_to_commitlock.mdwn @@ -0,0 +1,28 @@ +I am following the laptop wiki with git tip page. I have set up my local and remote wiki as suggested. However, when I try to push my local changes back to the server I get the following error: + +Writing objects: 100% (4/4), 359 bytes, done. +Total 4 (delta 2), reused 0 (delta 0) +cannot write to /home/ian/ianbarton/.ikiwiki/commitlock: No such file or directory +To ian@wilkesley.org:~/ikiwiki/ianbarton.git + 5cf9054..16a871d master -> master + +The relevnt bit of my setup file is: + +git_wrapper => '/home/ian/ianbarton.git/hooks/post-commit', + +Now ~/ianbarton/.ikiwiki exists and is owned and writable by me. I have tried touching commitlock and also removing lock and commitlock before pushing. Any suggestions for further trouble shooting? + +Ian. + +> I'm guessing that this is some kind of permissions problem, +> and that the error message is just being misleading. +> +> When you push the changes to the server, what user is +> git logging into the server as? If that user is different +> than `ian` (possibly due to using git-daemon?), the post-commit +> wrapper needs to be setuid to `ian`. This ensures that ikiwiki +> runs as you and can see and write to the files. --[[Joey]] + +The user is logging as ian, the same user as the laptop. I can push and pull git repos on the same server owned by the same user via ssh with no problem. I have deleted and re-started from scratch several times. However, for my use case I think it's simpler to keep the repo on my local computer and just rsync the web pages to the server. + +Ian. diff --git a/doc/forum/Debian_5.0.7:_Can__39__t_locate_IkiWiki__47__Setup__47__Automator.pm.mdwn b/doc/forum/Debian_5.0.7:_Can__39__t_locate_IkiWiki__47__Setup__47__Automator.pm.mdwn new file mode 100644 index 000000000..b501a11c8 --- /dev/null +++ b/doc/forum/Debian_5.0.7:_Can__39__t_locate_IkiWiki__47__Setup__47__Automator.pm.mdwn @@ -0,0 +1,19 @@ +Hi People, + +first thanks for this nice an usable piece of software. + +Recently i moved to a new server, reinstalled ikiwiki via aptitude and now i'm getting this error: + + ikiwiki -setup /etc/ikiwiki/auto.setup + /etc/ikiwiki/auto.setup: Can't locate IkiWiki/Setup/Automator.pm in @INC (@INC contains: /etc/perl /usr/local/lib/perl/5.10.0 /usr/local/share/perl/5.10.0 /usr/lib/perl5 /usr/share/perl5 /usr/lib/perl/5.10 /usr/share/perl/5.10 /usr/local/lib/site_perl .) at (eval 10) line 13. + +Or with an existing wiki: + + ikiwiki -setup younameit.setup + younameit.setup: Can't use an undefined value as an ARRAY reference at /usr/share/perl5/IkiWiki/Setup/Standard.pm line 33. + BEGIN failed--compilation aborted at (eval 10) line 293. + +Can you help? + +Best wishes, +Tobias. 
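One quick way to diagnose this kind of failure is to ask Perl directly whether it can locate `IkiWiki::Setup::Automator`, and if so from where. This is only a diagnostic sketch, and it assumes the module, when installed, sits on Perl's normal `@INC` path:

    perl -le '
        # can Perl find the module at all?
        if (eval { require IkiWiki::Setup::Automator }) {
            # %INC records which file the module was loaded from
            print "found: $INC{q{IkiWiki/Setup/Automator.pm}}";
        }
        else {
            # $@ holds the usual "cannot locate ... in @INC" error
            print "not found: $@";
        }'

If the module is reported as missing even though `ikiwiki` itself runs, the installed ikiwiki and the setup file were most likely built for different versions, which matches the diagnosis in the comments below.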
diff --git a/doc/forum/Debian_5.0.7:_Can__39__t_locate_IkiWiki__47__Setup__47__Automator.pm/comment_1_aec4bf4ca7d04d580d2fa83fd3f7166f._comment b/doc/forum/Debian_5.0.7:_Can__39__t_locate_IkiWiki__47__Setup__47__Automator.pm/comment_1_aec4bf4ca7d04d580d2fa83fd3f7166f._comment new file mode 100644 index 000000000..2c884e261 --- /dev/null +++ b/doc/forum/Debian_5.0.7:_Can__39__t_locate_IkiWiki__47__Setup__47__Automator.pm/comment_1_aec4bf4ca7d04d580d2fa83fd3f7166f._comment @@ -0,0 +1,8 @@ +[[!comment format=mdwn + username="http://joey.kitenet.net/" + nickname="joey" + subject="comment 1" + date="2011-01-25T19:18:21Z" + content=""" +You're using an old version of ikiwiki with setup files from a newer version. That won't work for various reasons, and the simplest fix is to upgrade your server to the version you had before. +"""]] diff --git a/doc/forum/Debian_5.0.7:_Can__39__t_locate_IkiWiki__47__Setup__47__Automator.pm/comment_4_c682ebb0e8e72088a8f92356dc31ef37._comment b/doc/forum/Debian_5.0.7:_Can__39__t_locate_IkiWiki__47__Setup__47__Automator.pm/comment_4_c682ebb0e8e72088a8f92356dc31ef37._comment new file mode 100644 index 000000000..35a258885 --- /dev/null +++ b/doc/forum/Debian_5.0.7:_Can__39__t_locate_IkiWiki__47__Setup__47__Automator.pm/comment_4_c682ebb0e8e72088a8f92356dc31ef37._comment @@ -0,0 +1,13 @@ +[[!comment format=mdwn + username="tk" + ip="79.222.20.29" + subject="comment 4" + date="2011-01-26T12:34:29Z" + content=""" +Thank you for the fast reply, Joey! +I tried it with ikiwiki from debian backports and it works as usual :) + +Bye, +Tobias. + +"""]] diff --git a/doc/forum/Different_templates_for_subdirectories__63_____40__Blogging_and_Wiki_pages__41__.mdwn b/doc/forum/Different_templates_for_subdirectories__63_____40__Blogging_and_Wiki_pages__41__.mdwn new file mode 100644 index 000000000..8d6700651 --- /dev/null +++ b/doc/forum/Different_templates_for_subdirectories__63_____40__Blogging_and_Wiki_pages__41__.mdwn @@ -0,0 +1,7 @@ +I have been mucking about with ikiwiki for two whole days now. + +I like many things about it. Even though I've been spending most of my time wrestling with css I did manage to write a whole lot of blog posts and love what ikiwiki is doing for the "revise" part of my writing cycle. And I like the idea of integrating the wiki and the blog into one unifying architecture.... + +But... I would like very much to have different page templates for blogging and wiki-ing, some way of specifying that for stuff in the "/posts" directory I'd rather use blogpost.tmpl rather than page.tmpl. I just spent a few minutes looking at the perl for this (I assume Render.pm) and my mind dumped core... 
+ +(generically, some way to specify output formatting on a subdirectory basis would be good) diff --git a/doc/forum/Different_templates_for_subdirectories__63_____40__Blogging_and_Wiki_pages__41__/comment_1_15651796492a6f04a19f4a481947c97c._comment b/doc/forum/Different_templates_for_subdirectories__63_____40__Blogging_and_Wiki_pages__41__/comment_1_15651796492a6f04a19f4a481947c97c._comment new file mode 100644 index 000000000..e92f4107d --- /dev/null +++ b/doc/forum/Different_templates_for_subdirectories__63_____40__Blogging_and_Wiki_pages__41__/comment_1_15651796492a6f04a19f4a481947c97c._comment @@ -0,0 +1,16 @@ +[[!comment format=mdwn + username="https://www.google.com/accounts/o8/id?id=AItOawlY5yDefnXSHvWGbJ9kvhnAyQZiAAttENk" + nickname="Javier" + subject="comment 1" + date="2010-10-21T15:00:50Z" + content=""" +You can do what you want with the [[ikiwiki/directive/pagetemplate]] directive, but in a slightly cumbersome way, because you have to say what template you want in every page that differs from the default. + +See also: [[templates]] + +And, a perhaps more proper solution to your problem, although I don't fully understand the way of tackling it, in [[todo/multiple_template_directories]]. + +If you could create a proper page in this wiki, centralizing all the knowledge dispersed in those pages, it would be nice ;) + +--[[jerojasro]] +"""]] diff --git a/doc/forum/Discussion_PageSpec__63__.mdwn b/doc/forum/Discussion_PageSpec__63__.mdwn new file mode 100644 index 000000000..2860d0d17 --- /dev/null +++ b/doc/forum/Discussion_PageSpec__63__.mdwn @@ -0,0 +1,3 @@ +I've looked around but haven't found it. Can you set a Discussion PageSpec so only certain pages allow discussion? + +> Not currently, sorry. --[[Joey]] diff --git a/doc/forum/Doing_related_links_based_on_tags.mdwn b/doc/forum/Doing_related_links_based_on_tags.mdwn new file mode 100644 index 000000000..9f6a1b937 --- /dev/null +++ b/doc/forum/Doing_related_links_based_on_tags.mdwn @@ -0,0 +1,31 @@ +I've been recently using a template this + + ---- + Related posts: + + \[[!inline pages="blog/posts/* + and !blog/posts/*/* + and !Discussion + and !tagged(draft) + and <TMPL_VAR raw_tagged>" + archive="yes" + quick="yes" + show="5"]] + +Which I then call by doing this at the end of my blog posts on my +ikiwiki install + + \[[!tag software linux]] + \[[!template id=related tagged="tagged(software) or tagged(linux)"]] + +It somewhat works, I was wondering if anyone else has tried to do +something like the above to get "related posts" based on tags. The way +that I have done it isn't very clever as it only links to the last 5 +most recently posted items based on my parameters. Is it possible to +"randomly" select a bunch of links from a set of user defined +pagespecs? + +I know that the [[backlinks]] plugin exists for this sort of stuff +(related links), it just lacks some user configuration options. + +> I guess what you need is an extension to [[ikiwiki/pagespec/sorting]] to support "random" as a sort method. Remember though, that the chosen few would only change when the page was regenerated, not on every page view. -- [[Jon]] diff --git a/doc/forum/Dump_plugin.mdwn b/doc/forum/Dump_plugin.mdwn new file mode 100644 index 000000000..ff3bfea90 --- /dev/null +++ b/doc/forum/Dump_plugin.mdwn @@ -0,0 +1,4 @@ +I have a second plugin that adds a directive 'dump', and dumps all sorts of information (env variables and template variables) about a page into the end of the page. 
It's cheesy, but it's available in my [[Dropbox|http://dl.dropbox.com/u/11256359/dump.pm]] as well as the Asciidoc plugin. + +### Issues +* It really ought to use some kind of template instead of HTML. In fact, it ought to embed its information in template variables of some kind rather than stuffing it into the end of the page. diff --git a/doc/forum/Dump_plugin/comment_1_bfce80b3f5be78ec28692330843d4ae1._comment b/doc/forum/Dump_plugin/comment_1_bfce80b3f5be78ec28692330843d4ae1._comment new file mode 100644 index 000000000..855b72bbb --- /dev/null +++ b/doc/forum/Dump_plugin/comment_1_bfce80b3f5be78ec28692330843d4ae1._comment @@ -0,0 +1,14 @@ +[[!comment format=mdwn + username="https://www.google.com/accounts/o8/id?id=AItOawngqGADV9fidHK5qabIzKN0bx1ZIfvaTqs" + nickname="Glenn" + subject="New dump plugin" + date="2010-10-03T00:45:47Z" + content=""" +I took my own advice and rewrote the dump plugin so that it uses a template. A sample template has been added to my [[Dropbox|http://dl.dropbox.com/u/11256359/dump.tmpl]]. + +### Issues: + +* Dumps appear at the end of the page rather than where the directive occurs. +* For some reason I haven't yet figured out, dumps don't appear in page previews. +* I haven't tested inlined content and the dump plugin. +"""]] diff --git a/doc/forum/Error:_bad_page_name.mdwn b/doc/forum/Error:_bad_page_name.mdwn new file mode 100644 index 000000000..70277a1e4 --- /dev/null +++ b/doc/forum/Error:_bad_page_name.mdwn @@ -0,0 +1,46 @@ +I'm trying to use ikiwiki for the first time. In the start, I had problems +with installing the package, because I don't have a root account on my +server. + +When I solved this, I finally set up my wiki, but whenever I try to edit a +page, I get an error: “Error: bad page name”. + +What am I doing wrong? The wiki is at +<http://atrey.karlin.mff.cuni.cz/~onderka/wiki/>, the setupfile I used at +<http://atrey.karlin.mff.cuni.cz/~onderka/wiki/ikiwiki.setup>. + +> This means that one of the checks that ikiwiki uses to prevent +> editing files with strange or insecure names has fired incorrectly. +> Your setup file seems fine. +> We can figure out what is going wrong through a series of tests: +> +> * Test if your perl has a problem with matching alphanumerics: +> `perl -le 'print int "index"=~/^([-[:alnum:]+\/.:_]+)$/'` +> * Check if something is breaking pruning of disallowed files: +> `perl -le 'use IkiWiki; %config=IkiWiki::defaultconfig(); print ! IkiWiki::file_pruned("index")'` +> --[[Joey]] + +>>Both seem to run fine: + + onderka@atrey:~$ perl -le 'print int "index"=~/^([-[:alnum:]+\/.:_]+)$/' + 1 + onderka@atrey:~$ perl -le 'use IkiWiki; %config=IkiWiki::defaultconfig(); print ! IkiWiki::file_pruned("index")' + 1 + +>>> Try installing this [instrumented +>>> version](http://kitenet.net/~joey/tmp/editpage.pm) of +>>> `IkiWiki/Plugin/editpage.pm`, which will add some debugging info +>>> to the error message. --[[Joey]] + +>>>>When I tried to `make` ikiwiki with this file, I got the error + + ../IkiWiki/Plugin/editpage.pm:101: invalid variable interpolation at "$" + +>>>>> Sorry about that, I've corrected the above file. --[[Joey]] + +>>>>>> Hmm, funny. Now that I reinstalled it with your changed file, it started working. I didn't remember how exactly did I install it the last time, so this time, it seems I did it correctly. Thank you very much for your help. 
+ +>>>>>>> Well, this makes me suspect you installed an older version of +>>>>>>> ikiwiki and my file, which is from the latest version, included a +>>>>>>> fix for whatever bug you were seeing. If I were you, I'd ensure +>>>>>>> that I have a current version of ikiwiki installed. --[[Joey]] diff --git a/doc/forum/Error_Code_1.mdwn b/doc/forum/Error_Code_1.mdwn new file mode 100644 index 000000000..3e6878bb7 --- /dev/null +++ b/doc/forum/Error_Code_1.mdwn @@ -0,0 +1,7 @@ +Hi. I'm new to ikiwiki. I typed + +"make install" + +and got "Error Code 1". + +Any help is appreciated. diff --git a/doc/forum/Error_Code_1/comment_1_0459afcc383aad382df67a19eaf2e731._comment b/doc/forum/Error_Code_1/comment_1_0459afcc383aad382df67a19eaf2e731._comment new file mode 100644 index 000000000..f4bb410eb --- /dev/null +++ b/doc/forum/Error_Code_1/comment_1_0459afcc383aad382df67a19eaf2e731._comment @@ -0,0 +1,8 @@ +[[!comment format=mdwn + username="http://joey.kitenet.net/" + nickname="joey" + subject="comment 1" + date="2011-02-25T18:57:22Z" + content=""" +Read the README file. Ikiwiki's source does not include a Makefile; you have to run ./Makefile.PL to create one. +"""]] diff --git a/doc/forum/Flowplayer.mdwn b/doc/forum/Flowplayer.mdwn new file mode 100644 index 000000000..9bf3ab3af --- /dev/null +++ b/doc/forum/Flowplayer.mdwn @@ -0,0 +1 @@ +[Flowplayer](http://flowplayer.org) is the open source flash video player plugin. [My site](http://mcfrisk.kapsi.fi) has raw html enabled to work with old content so I was able to use the raw html and javascript [examples](http://flowplayer.org/documentation/installation/index.html) in blog posts, but some of them fail when combined on the [aggregate page](http://mcfrisk.kapsi.fi/skiing/). Any hints on how to properly use Flowplayer with ikiwiki? diff --git a/doc/forum/Forward_slashes_being_escaped_as_252F.mdwn b/doc/forum/Forward_slashes_being_escaped_as_252F.mdwn new file mode 100644 index 000000000..5df81e561 --- /dev/null +++ b/doc/forum/Forward_slashes_being_escaped_as_252F.mdwn @@ -0,0 +1,33 @@ +When I try to edit a page that has a forward slash in the URL, I get "Error: +bad page name". I think the problem is because the forward slash is escaped as +`%252F` instead of `%2F`. + +For example, if I go to `http://ciffer.net/~svend/tech/hosts/` and click Edit, +I am sent to a page with the URL +`http://ciffer.net/~svend/ikiwiki.cgi?page=tech%252Fhosts&do=edit`. + +I am running ikiwiki 3.20100504~bpo50+1 on Debian Lenny. + + +> But on your page, the Edit link is escaped normally and correctly (using %2F). +> Look at the page source! +> +> The problem is that your web server is forcing a hard (302) redirect +> to the doubly-escaped url. In wireshark I see your web server send back: + + HTTP/1.1 302 Found\r\n + Apache/2.2.9 (Debian) PHP/5.2.6-1+lenny9 with Suhosin-Patch + Location: http://ciffer.net/~svend/ikiwiki.cgi?page=tech%252Fhosts&do=edit + +> You'll need to investigate why your web server is doing that... --[[Joey]] + +>> Thanks for pointing me in the right direction. I have the following redirect +>> in my Apache config. + + RewriteEngine on + RewriteCond %{HTTP_HOST} ^www\.ciffer\.net$ + RewriteRule /(.*) http://ciffer.net/$1 [L,R] + +>> and my ikiwiki url setting contained `www.ciffer.net`, which was causing the +>> redirect. Correcting the url fixed the problem. I'm still not sure why +>> Apache was mangling the URL. 
--[[Svend]] diff --git a/doc/forum/How_does_ikiwiki_remember_times__63__.mdwn b/doc/forum/How_does_ikiwiki_remember_times__63__.mdwn index 6ce576db1..6b7739fd0 100644 --- a/doc/forum/How_does_ikiwiki_remember_times__63__.mdwn +++ b/doc/forum/How_does_ikiwiki_remember_times__63__.mdwn @@ -20,15 +20,17 @@ Do I have it right? > Some VCS, like git, set the file mtimes to the current time > when making a new checkout, so they will be lost if you do that. > The creation times can be retrived using the `--getctime` option. -> I suppose it might be nice if there were a `--getmtime` that pulled -> true modification times out of the VCS, but I haven't found it a big -> deal in practice for the last modification times to be updated to the -> current time when rebuilding a wiki like this. --[[Joey]] +> --[[Joey]] > > > Thanks for the clarification. I ran some tests of my own to make sure I understand it right, and I'm satisfied > > that the order of posts in my blog can be retrieved from the VCS using the `--getctime` option, at least if I > > choose to order my posts by creation time rather than modification time. But I now know that I can't rely on > > page modification times in ikiwiki as these can be lost permanently. +> +> > > Update: It's now renamed to `--gettime`, and pulls both the creation +> > > and modification times. Also, per [[todo/auto_getctime_on_fresh_build]], +> > > this is now done automatically the first time ikiwiki builds a +> > > srcdir. So, no need to worry about this any more! --[[Joey]] > > > > I would suggest that there should at least be a `--getmtime` option like you describe, and perhaps that > > `--getctime` and `--getmtime` be _on by default_. In my opinion the creation times and modification times of @@ -91,19 +93,6 @@ Do I have it right? > A quick workaround for me to get modification times right is the following > little zsh script, which unfortunately only works for git: - #!/usr/bin/env zsh - - set +x - - for FILE in **/*(.); do - TIMES="`git log --pretty=format:%ai $FILE`" - MTIME="`echo $TIMES | head -n1`" - - if [ ! -z $MTIME ]; then - echo touch -m -d "$MTIME" $FILE - touch -m -d "$MTIME" $FILE - fi - - done +>> Elided; no longer needed since --gettime does that, and much faster! --[[Joey]] > --[[David_Riebenbauer]] diff --git a/doc/forum/How_to_list_new_pages__44___inline__63__.mdwn b/doc/forum/How_to_list_new_pages__44___inline__63__.mdwn new file mode 100644 index 000000000..f28e8b99b --- /dev/null +++ b/doc/forum/How_to_list_new_pages__44___inline__63__.mdwn @@ -0,0 +1,5 @@ +Hi, I'd love to include a "New posts" list into my front page, like at <http://danhixon.github.com/> for example. + +It should be different from recent changes in that it shouldn't show modifications of existing pages, and in that it would be inside a page with other content. + +Thanks, Thomas diff --git a/doc/forum/How_to_list_new_pages__44___inline__63__/comment_1_e989b18bade34a92a9c8fe7099036e15._comment b/doc/forum/How_to_list_new_pages__44___inline__63__/comment_1_e989b18bade34a92a9c8fe7099036e15._comment new file mode 100644 index 000000000..cf6f642d4 --- /dev/null +++ b/doc/forum/How_to_list_new_pages__44___inline__63__/comment_1_e989b18bade34a92a9c8fe7099036e15._comment @@ -0,0 +1,13 @@ +[[!comment format=mdwn + username="http://joey.kitenet.net/" + nickname="joey" + subject="use an inline directive" + date="2010-11-29T20:39:37Z" + content=""" +This is what the [[ikiwiki/directive/inline]] directive is for. It's often used, to for example, show new posts to a blog. 
If you want to show new posts to anywhere in your site, or whatever, you can configure the [[ikiwiki/PageSpec]] in it to do that, too. + +For example, you could use this: + + The most recent 3 pages added to this site: + \[[!inline pages=\"*\" archive=yes show=4]] +"""]] diff --git a/doc/forum/How_to_specify_repository_is_on_a_remote_host__63__.mdwn b/doc/forum/How_to_specify_repository_is_on_a_remote_host__63__.mdwn new file mode 100644 index 000000000..ad8f27252 --- /dev/null +++ b/doc/forum/How_to_specify_repository_is_on_a_remote_host__63__.mdwn @@ -0,0 +1,3 @@ +I'm new to ikiwiki and I'm trying to install it and set it up. I've read the documentation but I still don't understand how access to the repository works. We want ikiwiki to run on one machine but we want the repository to be on a separate machine running svn. How can I configure ikiwiki to access the repository on the remote machine? And how is authentication on the remote host handled in ikiwiki? Does there have to be a one-to-one correspondence between account names (and passwords) on the ikiwiki machine and the accounts on the svn machine? Thanks, + +Eric diff --git a/doc/forum/How_to_specify_repository_is_on_a_remote_host__63__/comment_1_0c71e17ae552cbab1056ac96fbd36c59._comment b/doc/forum/How_to_specify_repository_is_on_a_remote_host__63__/comment_1_0c71e17ae552cbab1056ac96fbd36c59._comment new file mode 100644 index 000000000..954ef0810 --- /dev/null +++ b/doc/forum/How_to_specify_repository_is_on_a_remote_host__63__/comment_1_0c71e17ae552cbab1056ac96fbd36c59._comment @@ -0,0 +1,9 @@ +[[!comment format=mdwn + username="http://adam.shand.net/" + nickname="Adam" + subject="Depending ..." + date="2011-01-25T03:00:12Z" + content=""" +... on exactly what you are trying to do, you may find some answers [[here|forum/how_to_setup_ikiwiki_on_a_remote_host/]]. + +"""]] diff --git a/doc/forum/How_to_specify_repository_is_on_a_remote_host__63__/comment_2_b309302a084fbd8bcd4cd9bd2509cf5a._comment b/doc/forum/How_to_specify_repository_is_on_a_remote_host__63__/comment_2_b309302a084fbd8bcd4cd9bd2509cf5a._comment new file mode 100644 index 000000000..4ceb69474 --- /dev/null +++ b/doc/forum/How_to_specify_repository_is_on_a_remote_host__63__/comment_2_b309302a084fbd8bcd4cd9bd2509cf5a._comment @@ -0,0 +1,10 @@ +[[!comment format=mdwn + username="http://joey.kitenet.net/" + nickname="joey" + subject="here's the scoop .. but, don't do it" + date="2011-01-25T18:30:01Z" + content=""" +To do what you describe, you would set up the svn repository on your server, and then do a regular svn checkout of it to the machine running ikiwiki, and configure ikiwiki to use that directory as its srcdir. The only unix user ikiwiki does anything as is the one you use to set it up, so it's up to you to allow that user to commit to svn without needing to enter a password. + +However, I don't recommend this configuration at all. You're adding a ssh (or webdav) connection overhead to every edit to the wiki, since ikiwiki's commit to svn will have to be pushed across the network to the server. And ikiwiki's svn support is missing many of the [[newer_ikiwiki_features|rcs]], such as including for example support for easily reverting edits. 
+"""]] diff --git a/doc/forum/Ikiwiki_CGI_not_working_on_my_server__44___and_it__39__s_a_binary_file__63__.mdwn b/doc/forum/Ikiwiki_CGI_not_working_on_my_server__44___and_it__39__s_a_binary_file__63__.mdwn new file mode 100644 index 000000000..35db20dc8 --- /dev/null +++ b/doc/forum/Ikiwiki_CGI_not_working_on_my_server__44___and_it__39__s_a_binary_file__63__.mdwn @@ -0,0 +1,33 @@ +Hey, trying to get ikiwiki working on my account on a shared webserver. Actually installing ikiwiki on the server is phase 2. For now I'm running the latest ikiwiki (from source) locally, compiling the output with the ikiwiki command, then rsyncing the output dir up to the server. This works for the static HTML files, but the CGI file doesn't work, the server redirects to an error page. The error log on the server says "Premature end of script headers: /path/to/ikiwiki.cgi" + +My first thought was that this is a Perl CGI and I would need to change the shebang to point to the unusual location of Perl on this server, it's at /usr/pkg/bin/perl. But when I looked at ikiwiki.cgi I found it was a binary file. + +Why is it a binary? And what can I do about this error? + +> It's a binary because it's SUID, so that it has permission to write to the ikiwiki repository. See [[security]], under 'suid wrappers', for more on that. +> +> As to why you get 'premature end of script headers', that suggests there is a problem running +> the script (and there is output occurring before the HTTP headers are printed). Do you have access +> to the webserver logs for your host? They might contain some clues. Are you sure that the webserver +> is setup for CGI properly? -- [[Jon]] + +> Quite likely your laptop and your server do not run the same +> OS, so the wrapper binary cannot just be copied from one +> to the other and run. Also, the wrapper is just that, a +> thin wrapper which then runs ikiwiki. As ikiwiki is not +> yet installed on your server, that's another reason what +> you're trying can't work. +> +> If installing ikiwiki on the server is not possible or +> too much work right now, you could try building your wiki +> on your laptop with cgi disabled in the setup file. +> The result would be a static website that you could deploy to +> the server this way. Of course, it wouldn't be editable +> on the server, and other features that need the CGI would +> also be disabled. --[[Joey]] + +> > Ah, ok thanks. Yes the server runs a different OS and ikiwiki +> > is not installed on it. I've got it working as a static site, +> > so if I want the CGI I'll have to install ikiwiki on the server. +> > Ok. It might not work as I don't have root access, but I might +> > give it a try. Thanks diff --git a/doc/forum/Map_Plugin__44___would_like_to_add___63__updated_to_all_links.mdwn b/doc/forum/Map_Plugin__44___would_like_to_add___63__updated_to_all_links.mdwn new file mode 100644 index 000000000..fcffe690f --- /dev/null +++ b/doc/forum/Map_Plugin__44___would_like_to_add___63__updated_to_all_links.mdwn @@ -0,0 +1,3 @@ +Map Plugin, would like to add ?updated to all links created. + +When I edit a page and then click that page in a map in a sidebar Safari always shows me a cached page. 
diff --git a/doc/forum/Map_Plugin__44___would_like_to_add___63__updated_to_all_links/comment_1_3fe4c5967e704355f9b594aed46baf67._comment b/doc/forum/Map_Plugin__44___would_like_to_add___63__updated_to_all_links/comment_1_3fe4c5967e704355f9b594aed46baf67._comment new file mode 100644 index 000000000..ce1a78584 --- /dev/null +++ b/doc/forum/Map_Plugin__44___would_like_to_add___63__updated_to_all_links/comment_1_3fe4c5967e704355f9b594aed46baf67._comment @@ -0,0 +1,13 @@ +[[!comment format=mdwn + username="justint" + ip="24.182.207.250" + subject="skip it" + date="2010-10-13T05:30:50Z" + content=""" +skip it, I added + + <meta http-equiv=\"expires\" value=\"Thu, 16 Mar 2000 11:00:00 GMT\" /> + <meta http-equiv=\"pragma\" content=\"no-cache\" /> + +to my page.tmpl and the problem went away. +"""]] diff --git a/doc/forum/Migrating_old_repository_to_new_ikiwiki_system__63__.mdwn b/doc/forum/Migrating_old_repository_to_new_ikiwiki_system__63__.mdwn index fe67e6aba..d7a33b526 100644 --- a/doc/forum/Migrating_old_repository_to_new_ikiwiki_system__63__.mdwn +++ b/doc/forum/Migrating_old_repository_to_new_ikiwiki_system__63__.mdwn @@ -20,10 +20,6 @@ How do I set up an ikiwiki system using a pre-existing repository (instead of cr > recreate the ikiwiki srcdir > 3. `git clone` from the bare git repository a second time, > to create a checkout you can manually edit (optional) -> 4. run `ikiwiki --getctime --setup your.setup` -> The getctime will ensure page creation times are accurate -> by putting the info out of the git history, -> and only needs to be done once. > > If you preserved your repository, but not the setup file, > the easiest way to make one is probably to run diff --git a/doc/forum/Moving_wiki.git_folder__63__.mdwn b/doc/forum/Moving_wiki.git_folder__63__.mdwn new file mode 100644 index 000000000..77d1da1ee --- /dev/null +++ b/doc/forum/Moving_wiki.git_folder__63__.mdwn @@ -0,0 +1,17 @@ +Hi folks, I created a simple wiki to keep notes and references for projects, it's worked quite nice so far. I decided to use git as it's what I use daily to manage code, and it's available on all my machines. + +Anyway, I wanted to move all the wiki source stuff into a subfolder so that it stops cluttering up my ~ directory. However, there seems to be a problem with moving wiki.git (I moved wiki, wiki.git and wiki.setup) and I'm not sure where to tell ikiwiki that the git directory has been moved. I changed + + srcdir => '/home/pixel/.notebook/wiki', + git_wrapper => '/home/pixel/.notebook/wiki.git/hooks/post-update', + +and that seems to be fine. However when I go to run ikiwiki --setup things go wrong: + + pixel@tosh: [~ (ruby-1.9.2-p0)] ➔ ikiwiki -setup .notebook/wiki.setup + successfully generated /home/pixel/public_html/wiki/ikiwiki.cgi + successfully generated /home/pixel/.notebook/wiki.git/hooks/post-update + fatal: '/home/pixel/wiki.git' does not appear to be a git repository + fatal: The remote end hung up unexpectedly + 'git pull origin' failed: at /usr/share/perl5/IkiWiki/Plugin/git.pm line 193. + +I've gone through wiki.setup and nothing has jumped out as the place to set this, have I missed something? 
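The `'git pull origin' failed` message suggests the origin remote recorded inside the srcdir checkout still points at the old `~/wiki.git` path; that remote lives in the checkout's own git configuration rather than in the setup file. A sketch of re-pointing it, reusing the paths from the question above:

    # run inside the srcdir checkout
    cd /home/pixel/.notebook/wiki
    # point the origin remote at the repository's new location
    git config remote.origin.url /home/pixel/.notebook/wiki.git
    # (on newer git: git remote set-url origin /home/pixel/.notebook/wiki.git)
    # verify, then test the pull that was failing
    git config remote.origin.url
    git pull origin master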
diff --git a/doc/forum/Moving_wiki.git_folder__63__/comment_1_05238461520613f4ed1b0d02ece663bd._comment b/doc/forum/Moving_wiki.git_folder__63__/comment_1_05238461520613f4ed1b0d02ece663bd._comment new file mode 100644 index 000000000..d654591c0 --- /dev/null +++ b/doc/forum/Moving_wiki.git_folder__63__/comment_1_05238461520613f4ed1b0d02ece663bd._comment @@ -0,0 +1,11 @@ +[[!comment format=mdwn + username="http://users.itk.ppke.hu/~cstamas/openid/" + ip="212.183.140.47" + subject="comment 1" + date="2010-10-27T22:45:28Z" + content=""" +I think you want to edit + + .git/config + +"""]] diff --git a/doc/forum/Moving_wiki.git_folder__63__/comment_2_72b2b842dfa0cfaf899fe7af12977519._comment b/doc/forum/Moving_wiki.git_folder__63__/comment_2_72b2b842dfa0cfaf899fe7af12977519._comment new file mode 100644 index 000000000..f2e7ece18 --- /dev/null +++ b/doc/forum/Moving_wiki.git_folder__63__/comment_2_72b2b842dfa0cfaf899fe7af12977519._comment @@ -0,0 +1,10 @@ +[[!comment format=mdwn + username="http://pixel.dreamwidth.org/" + ip="65.29.14.21" + subject="comment 2" + date="2010-10-28T02:54:15Z" + content=""" +That did it thanks! + +Should I make some sort of edit in the setup page? I've used git for a while and for whatever reason it never occurred to me that this was from git, not from ikiwiki itself. +"""]] diff --git a/doc/forum/Need_something_more_powerful_than_Exclude.mdwn b/doc/forum/Need_something_more_powerful_than_Exclude.mdwn new file mode 100644 index 000000000..5e8043258 --- /dev/null +++ b/doc/forum/Need_something_more_powerful_than_Exclude.mdwn @@ -0,0 +1,5 @@ +When I originally looked at the "exclude" option, I thought it meant that it excluded pages completely, but it apparently doesn't. What I've found in practice is that a file which matches the "exclude" regex is excluded from *processing*, but it is still copied over to the destination directory. Thus, for example, if I have "^Makefile$" as the exclude pattern, and I have a file `src/foo/Makefile`, then that file is copied unaltered into `dest/foo/Makefile`. However, what I want is for `src/foo/Makefile` to be completely ignored: that it is not only not processed, but not even *copied* into the destination directory. + +I'm not sure if the current behaviour is a bug or a feature, but I would like a "totally ignore this file" feature if it's possible to have one. + +-- [[KathrynAndersen]] diff --git a/doc/forum/Need_something_more_powerful_than_Exclude/comment_2_0019cd6b34c8d8678b2532de57a92d15._comment b/doc/forum/Need_something_more_powerful_than_Exclude/comment_2_0019cd6b34c8d8678b2532de57a92d15._comment new file mode 100644 index 000000000..7842caeac --- /dev/null +++ b/doc/forum/Need_something_more_powerful_than_Exclude/comment_2_0019cd6b34c8d8678b2532de57a92d15._comment @@ -0,0 +1,12 @@ +[[!comment format=mdwn + username="http://smcv.pseudorandom.co.uk/" + nickname="smcv" + subject="expression anchored too closely?" + date="2010-11-23T10:43:21Z" + content=""" +It looks as though you might only be excluding a top-level Makefile, and not a Makefile in subdirectories. Try excluding `(^|/)Makefile$` instead, for instance? (See `wiki_file_prune_regexps` in `IkiWiki.pm` for hints.) 
+ +The match operation in `&file_pruned` ends up a bit like this: + + \"foo/Makefile\" =~ m{…|…|…|(^|/)Makefile$} +"""]] diff --git a/doc/forum/Need_something_more_powerful_than_Exclude/comment_2_f577ab6beb9912471949d8d18c790267._comment b/doc/forum/Need_something_more_powerful_than_Exclude/comment_2_f577ab6beb9912471949d8d18c790267._comment new file mode 100644 index 000000000..bd964d540 --- /dev/null +++ b/doc/forum/Need_something_more_powerful_than_Exclude/comment_2_f577ab6beb9912471949d8d18c790267._comment @@ -0,0 +1,11 @@ +[[!comment format=mdwn + username="http://kerravonsen.dreamwidth.org/" + ip="60.241.8.244" + subject="Missed It By That Much" + date="2010-11-25T02:55:20Z" + content=""" +I discovered that I not only needed to change the regexp, but I also needed to delete .ikiwiki/indexdb because `file_pruned` only gets called for files that aren't in the `%pagesources` hash, and since the file in question was already there because it had been put there before the exclude regex was changed, it wasn't even being checked! + +[[KathrynAndersen]] + +"""]] diff --git a/doc/forum/Need_something_more_powerful_than_Exclude/comment_3_1ed260b0083a290688425a006a83f603._comment b/doc/forum/Need_something_more_powerful_than_Exclude/comment_3_1ed260b0083a290688425a006a83f603._comment new file mode 100644 index 000000000..8b93acd79 --- /dev/null +++ b/doc/forum/Need_something_more_powerful_than_Exclude/comment_3_1ed260b0083a290688425a006a83f603._comment @@ -0,0 +1,8 @@ +[[!comment format=mdwn + username="http://joey.kitenet.net/" + nickname="joey" + subject="comment 3" + date="2010-11-29T20:41:49Z" + content=""" +`%pagesources` gets nuked when you rebuild the whole wiki with eg, ikiwiki -setup or ikiwiki -rebuild. So you shouldn't normally need to remove the indexdb, just rebuild when making this sort of change that affects the whole site. +"""]] diff --git a/doc/forum/Need_something_more_powerful_than_Exclude/comment_4_c39bdaf38e1e20db74eb26f0560bd673._comment b/doc/forum/Need_something_more_powerful_than_Exclude/comment_4_c39bdaf38e1e20db74eb26f0560bd673._comment new file mode 100644 index 000000000..15f1fecb8 --- /dev/null +++ b/doc/forum/Need_something_more_powerful_than_Exclude/comment_4_c39bdaf38e1e20db74eb26f0560bd673._comment @@ -0,0 +1,10 @@ +[[!comment format=mdwn + username="http://kerravonsen.dreamwidth.org/" + ip="60.241.8.244" + subject="comment 4" + date="2010-11-30T02:35:43Z" + content=""" +One would think that would be the case, yes, but for some reason it didn't work for me. 8-( + +[[KathrynAndersen]] +"""]] diff --git a/doc/forum/News_site_where_articles_are_submitted_and_then_reviewed_before_posting.mdwn b/doc/forum/News_site_where_articles_are_submitted_and_then_reviewed_before_posting.mdwn new file mode 100644 index 000000000..8dd755274 --- /dev/null +++ b/doc/forum/News_site_where_articles_are_submitted_and_then_reviewed_before_posting.mdwn @@ -0,0 +1,23 @@ +[[!meta date="2008-04-28 14:57:25 -0400"]] + +I am considering moving a news site to Ikiwiki. I am hoping that Ikiwiki has a feature where anonymous posters can submit a form that moderators can review and then accept for it to be posted on a news webpage (like front page of the website). + +> Well, you can have one blog that contains unreviewed articles, and +> moderators can then add a tag that makes the article show up in the main +> news feed. There's nothing stopping someone submitting an article +> pre-tagged though. 
If you absolutely need to lock that down, you could +> have one blog with unreviewed articles in one subdirectory, and reviewers +> then move the file over to another subdirectory when they're ready to +> publish it. (This second subdirectory would be locked to prevent others +> from writing to it.) --[[Joey]] + +Also it would be good if the news page would keep maybe just the latest 10 entries with links to an archive that make it easy to browse to old entries by date. (Could have over a thousand news articles.) + +> The inline plugin allows setting up things like this. + +Plus users be able to post feedback to news items. If anonymous, they must be approved first. I'd prefer to not use normal "wiki" editor for feedback. + +Any thoughts or examples on this? Any links to examples of news sites or blogs with outside feedback using ikiwiki? + +Thanks --[[JeremyReed]] + diff --git a/doc/forum/PERL5LIB__44___wrappers_and_homedir_install.mdwn b/doc/forum/PERL5LIB__44___wrappers_and_homedir_install.mdwn new file mode 100644 index 000000000..fba941efc --- /dev/null +++ b/doc/forum/PERL5LIB__44___wrappers_and_homedir_install.mdwn @@ -0,0 +1,38 @@ +What is the way to tell wrappers that PERL5LIB should include ~/bin directories? + +Having this in the wiki.setup doesn't help anymore: + + # environment variables + ENV => { + PATH => '/home/user/bin/bin:/usr/local/bin:/usr/bin:/bin:/usr/games:/home/user/ikiwiki/usr/bin/:/home/user/ikiwiki/usr/sbin/:/home/user/bin/bin/:~/bin/bin/', + PERL5LIB => '/home/user/bin/share/perl/5.10.0:/home/user/bin/lib/perl/5.10.0' + }, + +Or at least I get CGI errors and running ikiwiki.cgi manually fails too: + + Use of uninitialized value $tainted in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 233. + Argument "" isn't numeric in umask at /usr/share/perl5/IkiWiki.pm line 139. + Undefined subroutine &IkiWiki::cgierror called at /home/user/bin/bin/ikiwiki line 199. + +Server has an older ikiwiki installed but I'd like to use a newer version from git, and I don't have root access. + +> You can't set `PERL5LIB` in `ENV` in a setup file, because ikiwiki is already +> running before it reads that, and so it has little effect. Your error +> messages do look like a new bin/ikiwiki is using an old version of +> `IkiWiki.pm`. +> +> The thing to do is set `INSTALL_BASE` when you're installing ikiwiki from +> source. Like so: + + cd ikiwiki + perl Makefile.PL INSTALL_BASE=$HOME PREFIX= + make install + +> Then `$HOME/bin/ikiwiki` will have hardcoded into it to look +> for ikiwiki's perl modules in `$HOME/lib/perl5/` +> (This is documented in the README file by the way.) --[[Joey]] + +>> Ok, *perl Makefile.PL INSTALL_BASE=$HOME/bin PREFIX=* finally did it for me. I tried too many things with +>> these paths so I wasn't sure which actually worked. After that I did +>> *$ ikiwiki --setup www.setup --wrappers --rebuild*. Somehow in this update mess I seem to have lost the user +>> accounts, maybe the --rebuild was too much. 
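Putting the pieces above together, a consolidated sketch of the home-directory install (the `$HOME/bin` install base and the `www.setup` name are the ones used in this thread; adjust to taste):

    # build and install ikiwiki under the home directory
    cd ikiwiki
    perl Makefile.PL INSTALL_BASE=$HOME/bin PREFIX=
    make
    make install
    # make sure the freshly installed binary is found before any system-wide one
    export PATH="$HOME/bin/bin:$PATH"
    which ikiwiki
    # regenerate the wrappers and rebuild the wiki with the new install
    ikiwiki --setup www.setup --wrappers --rebuild

Regenerating with `--wrappers` matters because the wrappers are generated binaries that bake in their configuration when they are built; changing `PERL5LIB` in the environment afterwards does not reach them.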
diff --git a/doc/forum/PageSpec_results_from_independent_checkout.mdwn b/doc/forum/PageSpec_results_from_independent_checkout.mdwn new file mode 100644 index 000000000..693287d2b --- /dev/null +++ b/doc/forum/PageSpec_results_from_independent_checkout.mdwn @@ -0,0 +1,8 @@ +I'd like to be able to do PageSpec matches independent of the Ikiwiki checkout, but at best I'm currently restricted to copying over and using whatever is in the indexdb with this approach: + + perl -MIkiWiki -le '$config{wikistatedir}=".ikiwiki"; IkiWiki::loadindex(); print foreach pagespec_match_list("", shift)' "bugs/*" + +I get the impression there's a way to build up enough state to run pagespec matches without doing any rendering, but I don't know how. Any ideas? -- JoeRayhawk + +> It's not possible to build up enough state without at a minimum +> performing the scan pass of rendering on every page. --[[Joey]] diff --git a/doc/forum/Possible_to_use_meta_variables_in_templates__63__.mdwn b/doc/forum/Possible_to_use_meta_variables_in_templates__63__.mdwn new file mode 100644 index 000000000..3c214d457 --- /dev/null +++ b/doc/forum/Possible_to_use_meta_variables_in_templates__63__.mdwn @@ -0,0 +1,11 @@ +I'm trying to create a [[!iki plugins/template desc=template]] which references variables from the [[!iki plugins/meta desc=meta]] plugin, but either it's not supported or I'm doing something wrong. This is what my template looks like: + + <div class="attributionbox"> + <p><b>Written by:</b> <a href="<TMPL_VAR AUTHORURL>"><TMPL_VAR AUTHOR></a></p> + <p><TMPL_VAR text></b></p> + </div> + +The template is working because I get the content, but all the places where I reference meta variables are blank. Is this supposed to work or am I trying to do something unsupported? Many thanks for any pointers. + +Cheers, +[[AdamShand]] diff --git a/doc/forum/Possible_to_use_meta_variables_in_templates__63__/comment_1_556078a24041289d8f0b7ee756664690._comment b/doc/forum/Possible_to_use_meta_variables_in_templates__63__/comment_1_556078a24041289d8f0b7ee756664690._comment new file mode 100644 index 000000000..3aeeec793 --- /dev/null +++ b/doc/forum/Possible_to_use_meta_variables_in_templates__63__/comment_1_556078a24041289d8f0b7ee756664690._comment @@ -0,0 +1,20 @@ +[[!comment format=mdwn + username="http://smcv.pseudorandom.co.uk/" + nickname="smcv" + subject="not supported at the moment" + date="2011-01-24T15:17:59Z" + content=""" +This isn't supported, because [[ikiwiki/directive/template]] templates +don't run `pagetemplate` hooks (which is how information gets from +[[ikiwiki/directive/meta]] into, for instance, `page.tmpl`). The only +inputs to the `HTML::Template` are the parameters passed to the +directive, plus the `raw_`-prefixed versions of those, plus the extra +parameters passed to every `preprocess` hook (currently `page`, `destpage` +and `preview`). + +I think having `pagetemplate` hooks run for this sort of template +by default would be rather astonishing, but perhaps some sort of +opt-in while defining the template would be reasonable? One problem +with that is that the templates used by [[ikiwiki/directive/template]] +are just wiki pages, and don't really have any special syntax support. 
+"""]] diff --git a/doc/forum/Possible_to_use_meta_variables_in_templates__63__/comment_2_e7e954218d39bc310015b95aa1a5212c._comment b/doc/forum/Possible_to_use_meta_variables_in_templates__63__/comment_2_e7e954218d39bc310015b95aa1a5212c._comment new file mode 100644 index 000000000..b53188128 --- /dev/null +++ b/doc/forum/Possible_to_use_meta_variables_in_templates__63__/comment_2_e7e954218d39bc310015b95aa1a5212c._comment @@ -0,0 +1,10 @@ +[[!comment format=mdwn + username="http://adam.shand.net/" + nickname="Adam" + subject="Bummer." + date="2011-01-24T15:26:33Z" + content=""" +Thanks for the quick response! I'm trying to figure out some way that I can reference meta variables inside of a page. Specifically I'm trying to create an attribution box which lists all of the information I have about who wrote the page, where the original can be found etc. I can just pass the values to the template, but it would be really nice not have to put this information in for the meta plugin and my attribution box! + +The changes you suggest sound wonderful but are beyond my abilities right row. Any ideas how I might accomplish this in the mean time? +"""]] diff --git a/doc/forum/Possible_to_use_meta_variables_in_templates__63__/comment_3_8b16c563c89eb6980ad6a5539d934d7a._comment b/doc/forum/Possible_to_use_meta_variables_in_templates__63__/comment_3_8b16c563c89eb6980ad6a5539d934d7a._comment new file mode 100644 index 000000000..a20f8f5c6 --- /dev/null +++ b/doc/forum/Possible_to_use_meta_variables_in_templates__63__/comment_3_8b16c563c89eb6980ad6a5539d934d7a._comment @@ -0,0 +1,8 @@ +[[!comment format=mdwn + username="http://smcv.pseudorandom.co.uk/" + nickname="smcv" + subject="comment 3" + date="2011-01-24T20:58:52Z" + content=""" +I usually just have a template that contains a suitable `\[[!meta]]` directive. +"""]] diff --git a/doc/forum/Possible_to_use_meta_variables_in_templates__63__/comment_4_76eadf93cce4e2168960131d4677c5fc._comment b/doc/forum/Possible_to_use_meta_variables_in_templates__63__/comment_4_76eadf93cce4e2168960131d4677c5fc._comment new file mode 100644 index 000000000..b5c626130 --- /dev/null +++ b/doc/forum/Possible_to_use_meta_variables_in_templates__63__/comment_4_76eadf93cce4e2168960131d4677c5fc._comment @@ -0,0 +1,8 @@ +[[!comment format=mdwn + username="http://kerravonsen.dreamwidth.org/" + ip="202.173.183.92" + subject="contrib plugins can do this" + date="2011-01-24T23:11:40Z" + content=""" +You can do this by using the [[plugins/contrib/field]] plugin with the [[plugins/contrib/ftemplate]] plugin. +"""]] diff --git a/doc/forum/Possible_to_use_meta_variables_in_templates__63__/comment_5_ddabe4a005042d19c7669038b49275c1._comment b/doc/forum/Possible_to_use_meta_variables_in_templates__63__/comment_5_ddabe4a005042d19c7669038b49275c1._comment new file mode 100644 index 000000000..6279b20ba --- /dev/null +++ b/doc/forum/Possible_to_use_meta_variables_in_templates__63__/comment_5_ddabe4a005042d19c7669038b49275c1._comment @@ -0,0 +1,12 @@ +[[!comment format=mdwn + username="http://adam.shand.net/" + nickname="Adam" + subject="Thanks!" + date="2011-01-25T02:51:35Z" + content=""" +smcv, sorry I don't understand? How are you getting the \[[!meta] to work on a template page, I thought that's what you said didn't work? Do you mean a pagetemplate? + +kerravonsen, thanks for the pointer I'll check those out. + +I realised last night that I think I could also do this with a pagetemplate, since I should be able to access meta variables there. 
A little clumsy for what I want to do but should hopefully work fine. Would be really neat with the [section template](http://ikiwiki.info/todo/Set_templates_for_whole_sections_of_the_site/) plugin, I'll have to look at that. +"""]] diff --git a/doc/forum/Processing_non-pages.mdwn b/doc/forum/Processing_non-pages.mdwn new file mode 100644 index 000000000..23af417a4 --- /dev/null +++ b/doc/forum/Processing_non-pages.mdwn @@ -0,0 +1,7 @@ +I'd like to be able to write a plugin that minifies CSS pages, but the whole plugin mechanism appears to be oriented towards generating HTML pages. That is, all files appear to be split into "pages with page types" and "pages without page types". Pages without page types are copied from the source to the destination directory and that's all. Pages *with* page-types go through the whole gamut: scan, filter, preprocess, linkify, htmlize, sanitize, format, and then they're written as "foo.html". + +I could be mistaken, but I don't think registering "css" as a page-type would work. Sure, I could then process the content to my heart's content, but at the end, my foo.css file would be saved as foo.html, which is NOT what I want. + +What I would like would be something in-between, where one could take `foo.css`, process it (in this case, run a minify over it) and output it as `foo.css`. + +How? diff --git a/doc/forum/Regex_for_Valid_Characters_in_Filenames.mdwn b/doc/forum/Regex_for_Valid_Characters_in_Filenames.mdwn new file mode 100644 index 000000000..618576f81 --- /dev/null +++ b/doc/forum/Regex_for_Valid_Characters_in_Filenames.mdwn @@ -0,0 +1,19 @@ +I'm sure that this is documented somewhere but I've ransacked the wiki and I can't find it. :-( What are the allowed characters in an ikiwiki page name? I'm writing a simple script to make updating my blog easier and need to filter invalid characters (so far I've found that # and , aren't allowed ;-)). Thanks for any pointers. -- [[AdamShand]] + +> The default `wiki_file_regexp` matches filenames containing only +> `[-[:alnum:]_.:/+]` +> +> The titlepage() function will convert freeform text to a valid +> page name. See [[todo/should_use_a_standard_encoding_for_utf_chars_in_filenames]] +> for an example. --[[Joey]] + +>> Perfect, thanks! +>> +>> In the end I decided that I didn't need any special characters in filenames and replaced everything but alphanumeric characters with underscores. In addition to replacing bad characters I also collapse multiple underscores into a single one, and strip off trailing and leading underscores to make tidy filenames. If it's useful to anybody else here's a sed example: +>> +>> # echo "++ Bad: ~@#$%^&*()_=}{[];,? Iki: +_-:./ Num: 65.5 ++" | sed -e 's/[^A-Za-z0-9_]/_/g' -e 's/__*/_/g' -e 's/^_//g' -e 's/_$//g' +>> Bad_Iki_Num_65_5 +>> +>>--[[AdamShand]] + +[[!meta date="2008-01-18 23:40:02 -0500"]] diff --git a/doc/forum/Render_more_than_one_dest_page_from_same_source_page.mdwn b/doc/forum/Render_more_than_one_dest_page_from_same_source_page.mdwn new file mode 100644 index 000000000..e7362c903 --- /dev/null +++ b/doc/forum/Render_more_than_one_dest_page_from_same_source_page.mdwn @@ -0,0 +1,51 @@ +Is it possible to render more than one destination page from the same source page? +That is, same source, slightly different presentation at the other end, needing a different output file. + +> It's possible to render more than one output _file_ from a given source +> page. See, for example, the inline plugin's generation of rss files. 
+> This is done by calling `will_render()` and using `writefile()` to +> generate the additional files. Probably in a format hook if you want +> to generate html files. + +>> Thanks for the tip, I'll take a look at that. -- [[KathrynAndersen]] + +> It's not possible for one source file to represent multiple wiki pages. +> There is a 1:1 mapping between source filenames and page names. The +> difference between wiki pages and output files is that you can use +> wikilinks to link to wiki pages, etc. --[[Joey]] + +I have two problems that would be solved by being able to do this. + +[[!toc startlevel=2]] + +##"full" and "print" versions of a page. + +One has a page "foo", which is rendered into foo.html. +One also wants a foo-print.html page, which uses "page-print.tmpl" rather than "page.tmpl" as its template. + +I want to do this for every page on the site, automatically, so it isn't feasible to do it by hand. + +> Did you know that ikiwiki's `style.css` arranges for pages to display +> differently when printed out? Things like the Action bar are hidden in +> printouts (search for `@media print`). So I don't see a reason to need +> whole files for printing when you can use these style sheet tricks. +> --[[Joey]] + +>>Fair enough. --[[KathrynAndersen]] + +##"en" and "en-us" versions of a page. + +My site is in non-US English. However, I want US-English people to find my site when they search for it when they use US spelling on certain search terms (such as "optimise" versus "optimize"). This requires a (crude) US-English version of the site where the spellings are changed automatically, and the LANG is "en-us" rather than "en". (No, don't tell me to use keywords; Google ignores keywords and has for a number of years). + +So I want the page "foo" to render to "foo.en.html" and "foo.en-us.html" where the content is the same, just some automated word-substitution applied before foo.en-us.html is written. And do this for every page on the site. + +I can't do this with the "po" plugin, as it considers "en-us" not to be a valid language. And the "po" plugin is probably overkill for what I want anyway. + +But I'm not sure how to achieve the result I need. + +-- [[KathrynAndersen]] + +> Sounds like this could be considered a single page that generates two +> html files, so could be handled per above. --[[Joey]] + +>>Thanks! --[[KathrynAndersen]] diff --git a/doc/forum/Setting_up_a_development_environment.mdwn b/doc/forum/Setting_up_a_development_environment.mdwn new file mode 100644 index 000000000..0b4e555c1 --- /dev/null +++ b/doc/forum/Setting_up_a_development_environment.mdwn @@ -0,0 +1,32 @@ +Hi, + +I'm trying to setup a development environment to hack on the comments plugin and I'm having problems getting my Ikiwiki CGI to use my git checkout as the libdir and templatedir instead of the system one. + +My <tt>.setup</tt> contains: + + srcdir => '/home/francois/wiki/testblog', + destdir => '/var/www/testblog', + url => 'http://localhost/testblog', + cgiurl => 'http://localhost/testblog/ikiwiki.cgi', + cgi_wrapper => '/var/www/testblog/ikiwiki.cgi', + templatedir => '/home/francois/devel/remote/ikiwiki/templates', + underlaydir => '/home/francois/devel/remote/ikiwiki/doc', + libdir => '/home/francois/devel/remote/ikiwiki', + ENV => {}, + git_wrapper => '/home/francois/wiki/testblog.git/hooks/post-update', + +Now, if I modify <tt>~/devel/remote/ikiwiki/templates/comment.tmpl</tt>, my changes don't appear when I add a comment to a blog post. 
On the other hand, if I hack <tt>/usr/share/ikiwiki/templates/comment.tmpl</tt> and cause the page to be rebuilt by adding a new comment then that does have an effect. + +The same is true for <tt>~/devel/remote/ikiwiki/Ikiwiki/Plugin/comments.pm</tt> (doesn't appear to be used) and <tt>/usr/share/perl5/Ikiwiki/Plugin/comments.pm</tt> (my hacks affect pages as they are recompiled). + +I must be missing something obvious, but the [[ikiwiki development environment tips]] didn't help me... + +Cheers, + +[[Francois|fmarier]] + +> I updated the [[ikiwiki development environment tips]] page with my +> approach to running ikiwiki from the git checkout (with changes). For +> the templates, also make sure that you do not have custom templates in +> your src dir as they will be used instead of those from the template +> dir if found. --GB diff --git a/doc/forum/Should_not_create_an_existing_page.mdwn b/doc/forum/Should_not_create_an_existing_page.mdwn new file mode 100644 index 000000000..b9500757f --- /dev/null +++ b/doc/forum/Should_not_create_an_existing_page.mdwn @@ -0,0 +1,15 @@ +[[!meta date="2007-01-08 14:55:31 +0000"]] + +This might be a bug, but will discuss it here first. +Clicking on an old "?" or going to a create link but new Markdown content exists, should not go into "create" mode, but should do a regular "edit". + +> I belive that currently it does a redirect to the new static web page. +> At least that's the intent of the code. --[[Joey]] + +>> Try at your site: `?page=discussion&from=index&do=create` +>> It brings up an empty textarea to start a new webpage -- even though it already exists here. --reed + +>>> Ah, right. Notice that the resulting form allows saving the page as +>>> discussion, or users/discussion, but not index/discussion, since this +>>> page already exists. If all the pages existed, it would do the redirect +>>> thing. --[[Joey]] diff --git a/doc/forum/Spaces_in_wikilinks.mdwn b/doc/forum/Spaces_in_wikilinks.mdwn new file mode 100644 index 000000000..9326ac448 --- /dev/null +++ b/doc/forum/Spaces_in_wikilinks.mdwn @@ -0,0 +1,104 @@ +[[!meta date="2007-07-02 13:21:29 +0000"]] + +# Spaces in WikiLinks? + +Hello Joey, + +I've just switched from ikiwiki 2.0 to ikiwiki 2.2 and I'm really surprised +that I can't use the spaces in WikiLinks. Could you please tell me why the spaces +aren't allowed in WikiLinks now? + +My best regards, + +--[[PaweB|ptecza]] + +> See [[bugs/Spaces_in_link_text_for_ikiwiki_links]] + +---- + +# Build in OpenSolaris? + +Moved to [[bugs/build_in_opensolaris]] --[[Joey]] + +---- + +# Various ways to use Subversion with ikiwiki + +I'm playing around with various ways that I can use subversion with ikiwiki. + +* Is it possible to have ikiwiki point to a subversion repository which is on a different server? The basic checkin/checkout functionality seems to work but there doesn't seem to be any way to make the post-commit hook work for a non-local server? + +> This is difficult to do since ikiwiki's post-commit wrapper expects to +> run on a machine that contains both the svn repository and the .ikiwiki +> state directory. However, with recent versions of ikiwiki, you can get +> away without running the post-commit wrapper on commit, and all you lose +> is the ability to send commit notification emails. + +> (And now that [[recentchanges]] includes rss, you can just subscribe to +> that, no need to worry about commit notification emails anymore.) 
+ +* Is it possible / sensible to have ikiwiki share a subversion repository with other data (either completely unrelated files or another ikiwiki instance)? This works in part but again the post-commit hook seems problematic. + +--[[AdamShand]] + +> Sure, see ikiwiki's subversion repository for example of non-wiki files +> in the same repo. If you have two wikis in one repository, you will need +> to write a post-commit script that calls the post-commit wrappers for each +> wiki. + +---- + +# Regex for Valid Characters in Filenames + +I'm sure that this is documented somewhere but I've ransacked the wiki and I can't find it. :-( What are the allowed characters in an ikiwiki page name? I'm writing a simple script to make updating my blog easier and need to filter invalid characters (so far I've found that # and , aren't allowed ;-)). Thanks for any pointers. -- [[AdamShand]] + +> The default `wiki_file_regexp` matches filenames containing only +> `[-[:alnum:]_.:/+]` +> +> The titlepage() function will convert freeform text to a valid +> page name. See [[todo/should_use_a_standard_encoding_for_utf_chars_in_filenames]] +> for an example. --[[Joey]] + +>> Perfect, thanks! +>> +>> In the end I decided that I didn't need any special characters in filenames and replaced everything but alphanumeric characters with underscores. In addition to replacing bad characters I also collapse multiple underscores into a single one, and strip off trailing and leading underscores to make tidy filenames. If it's useful to anybody else here's a sed example: +>> +>> # echo "++ Bad: ~@#$%^&*()_=}{[];,? Iki: +_-:./ Num: 65.5 ++" | sed -e 's/[^A-Za-z0-9_]/_/g' -e 's/__*/_/g' -e 's/^_//g' -e 's/_$//g' +>> Bad_Iki_Num_65_5 +>> +>>--[[AdamShand]] + +# Upgrade steps from RecentChanges CGI to static page? + +Where are the upgrade steps for RecentChanges change from CGI to static feed? +I run multiple ikiwiki-powered sites on multiple servers, but today I just upgraded one to 2.32.3. +Please have a look at +<http://bsdwiki.reedmedia.net/wiki/recentchanges.html> +Any suggestions? + +> There are no upgrade steps required. It does look like you need to enable +> the meta plugin to get a good recentchanges page though.. --[[Joey]] + +# News site where articles are submitted and then reviewed before posting? + +I am considering moving a news site to Ikiwiki. I am hoping that Ikiwiki has a feature where anonymous posters can submit a form that moderators can review and then accept for it to be posted on a news webpage (like front page of the website). + +> Well, you can have one blog that contains unreviewed articles, and +> moderators can then add a tag that makes the article show up in the main +> news feed. There's nothing stopping someone submitting an article +> pre-tagged though. If you absolutely need to lock that down, you could +> have one blog with unreviewed articles in one subdirectory, and reviewers +> then move the file over to another subdirectory when they're ready to +> publish it. (This second subdirectory would be locked to prevent others +> from writing to it.) --[[Joey]] + +Also it would be good if the news page would keep maybe just the latest 10 entries with links to an archive that make it easy to browse to old entries by date. (Could have over a thousand news articles.) + +> The inline plugin allows setting up things like this. + +Plus users be able to post feedback to news items. If anonymous, they must be approved first. I'd prefer to not use normal "wiki" editor for feedback. 
+ +Any thoughts or examples on this? Any links to examples of news sites or blogs with outside feedback using ikiwiki? + +Thanks --[[JeremyReed]] + diff --git a/doc/forum/TMPL__95__VAR_IS__95__ADMIN.mdwn b/doc/forum/TMPL__95__VAR_IS__95__ADMIN.mdwn new file mode 100644 index 000000000..c8eec0bc9 --- /dev/null +++ b/doc/forum/TMPL__95__VAR_IS__95__ADMIN.mdwn @@ -0,0 +1 @@ +Could someone point out where I could implement a template variable? I would like an IS_ADMIN. I'm pretty sure I could do it but I'm sure someone has an opinion on where this might belong. I want to hide the edit links on my blog for non-admin users. diff --git a/doc/forum/TMPL__95__VAR_IS__95__ADMIN/comment_1_3172568473e9b79ad7ab623afd19411a._comment b/doc/forum/TMPL__95__VAR_IS__95__ADMIN/comment_1_3172568473e9b79ad7ab623afd19411a._comment new file mode 100644 index 000000000..7967ea6f0 --- /dev/null +++ b/doc/forum/TMPL__95__VAR_IS__95__ADMIN/comment_1_3172568473e9b79ad7ab623afd19411a._comment @@ -0,0 +1,13 @@ +[[!comment format=mdwn + username="http://smcv.pseudorandom.co.uk/" + nickname="smcv" + subject="comment 1" + date="2011-02-23T11:14:14Z" + content=""" +This isn't possible without out-of-band mechanisms (Javascript or something). +ikiwiki produces static HTML; template variables are evaluated when the HTML +is compiled, and admins and non-admins see the exact same file. + +(More precisely, URLs containing `/ikiwiki.cgi/` are dynamically-generated +pages; everything else is static.) +"""]] diff --git a/doc/forum/TMPL__95__VAR_IS__95__ADMIN/comment_2_4302d56a6fe68d17cc42d26e6f3566c2._comment b/doc/forum/TMPL__95__VAR_IS__95__ADMIN/comment_2_4302d56a6fe68d17cc42d26e6f3566c2._comment new file mode 100644 index 000000000..5e34ab4f8 --- /dev/null +++ b/doc/forum/TMPL__95__VAR_IS__95__ADMIN/comment_2_4302d56a6fe68d17cc42d26e6f3566c2._comment @@ -0,0 +1,8 @@ +[[!comment format=mdwn + username="http://smcv.pseudorandom.co.uk/" + nickname="smcv" + subject="comment 2" + date="2011-02-23T11:16:56Z" + content=""" +See [[bugs/logout in ikiwiki]] for discussion of a similar issue. +"""]] diff --git a/doc/forum/TMPL__95__VAR_IS__95__ADMIN/comment_3_4cc44e61b9c28a2d524fa874f115041a._comment b/doc/forum/TMPL__95__VAR_IS__95__ADMIN/comment_3_4cc44e61b9c28a2d524fa874f115041a._comment new file mode 100644 index 000000000..e152091bc --- /dev/null +++ b/doc/forum/TMPL__95__VAR_IS__95__ADMIN/comment_3_4cc44e61b9c28a2d524fa874f115041a._comment @@ -0,0 +1,14 @@ +[[!comment format=mdwn + username="justint" + ip="24.182.207.250" + subject="IS_ADMIN template wouldn't work" + date="2011-02-23T16:03:31Z" + content=""" +Ok, I think I get it. The template is used when IkiWiki is compiling the static pages, then users access the static pages. So at the time the templates are compiled there isn't a concept of who is accessing the page or what their session may be like (be it admin or anon or whatever). + +Is there a simple way to serve a different static page based on session information? Off the top of my head I would say no but I thought I would ask. I suppose I could try to compile two static sites, one for me and one for the world. That would solve my IS_ADMIN problem, but I don't think its a solution for similar types of things. + +I like ikiwiki for what it is, I get the feeling I may be asking it to do something it wasn't meant to do. If so I'd appreciate it if someone told me to stop trying. 
[[users/justint]] + + +"""]] diff --git a/doc/forum/TMPL__95__VAR_IS__95__ADMIN/comment_4_33143bad68f3f6beae963a3d0ec5d0bd._comment b/doc/forum/TMPL__95__VAR_IS__95__ADMIN/comment_4_33143bad68f3f6beae963a3d0ec5d0bd._comment new file mode 100644 index 000000000..ab7370c5a --- /dev/null +++ b/doc/forum/TMPL__95__VAR_IS__95__ADMIN/comment_4_33143bad68f3f6beae963a3d0ec5d0bd._comment @@ -0,0 +1,53 @@ +[[!comment format=mdwn + username="http://smcv.pseudorandom.co.uk/" + nickname="smcv" + subject="comment 4" + date="2011-02-23T16:19:28Z" + content=""" +> ... at the time the templates are compiled there isn't a concept of who is accessing the page + +Yes, this is the problem with what you're asking for. + +> Is there a simple way to serve a different static page based on session information? + +No, the thing serving the static pages is your web server; IkiWiki isn't involved +at all. + +> I suppose I could try to compile two static sites, one for me and one for the world + +I've done similar in the past with two setup files, under the same user ID, running +different checkouts of the same git repository - one for me, on https with +[[plugins/httpauth]], and one for the world, with only [[plugins/openid]]. You have +to make them write their git wrappers to different filenames, and make the real +git hook be a shell script that runs one wiki's wrapper, then the other, to refresh +both wikis when something gets committed. + +It's a bit fiddly to admin (you have to duplicate most setup changes in the two +setup files), but can be made to work. I've given up on that in favour of having +a single wiki reachable from both http and https, with [[plugins/httpauth]] +only working over https. + +> I get the feeling I may be asking it to do something it wasn't meant to do. + +Pretty much, yes. + +> If so I'd appreciate it if someone told me to stop trying. + +I can help! \"Stop trying.\" :-) + +But, if you want this functionality badly enough, one way you could get +it would be to have all the links on all the pages (for the benefit of +`NoScript` users), use Javascript to make an XMLHTTPRequest (or something) +to to a CGI action provided by a [[plugin|plugins/write]] +(`ikiwiki.cgi?do=amiadminornot` or something), and if that says the user +isn't an admin, hide some of the links to not confuse them. + +That would break the normal way that people log in to ikiwiki (by trying +to do something that needs them logged-in, like editing), so you'd also +want to add a \"Log In\" button or link (or just remember that editing your +Preferences has the side-effect of logging you in). + +Note that hiding the links isn't useful for security, only for +usability - the actual edit obviously needs to check whether the +user is a logged-in admin, and it already does. +"""]] diff --git a/doc/forum/TMPL__95__VAR_IS__95__ADMIN/comment_5_ef790766456d723670f52cc9e3955e90._comment b/doc/forum/TMPL__95__VAR_IS__95__ADMIN/comment_5_ef790766456d723670f52cc9e3955e90._comment new file mode 100644 index 000000000..5cbc0d206 --- /dev/null +++ b/doc/forum/TMPL__95__VAR_IS__95__ADMIN/comment_5_ef790766456d723670f52cc9e3955e90._comment @@ -0,0 +1,12 @@ +[[!comment format=mdwn + username="justint" + ip="24.182.207.250" + subject="easy money" + date="2011-02-23T17:47:35Z" + content=""" +I've done a plugin, but I haven't done a CGI one yet. I can probably handle it though. I have a little javascript, I could probably do that too. 
+ +I'm fuzzy on the log in bit, I don't know how to bring up a log in page in IkiWiki (that would just return to the calling page and not an edit page). + +If I were going to do this I'd want to have a log out button appear when the user is logged in. Is it possible to add a log out function to the same plugin? +"""]] diff --git a/doc/forum/TMPL__95__VAR_IS__95__ADMIN/comment_6_3db50264e01c8fad2e5567b5a9c7b6dc._comment b/doc/forum/TMPL__95__VAR_IS__95__ADMIN/comment_6_3db50264e01c8fad2e5567b5a9c7b6dc._comment new file mode 100644 index 000000000..4d1c224b7 --- /dev/null +++ b/doc/forum/TMPL__95__VAR_IS__95__ADMIN/comment_6_3db50264e01c8fad2e5567b5a9c7b6dc._comment @@ -0,0 +1,23 @@ +[[!comment format=mdwn + username="http://smcv.pseudorandom.co.uk/" + nickname="smcv" + subject="comment 6" + date="2011-02-23T18:08:37Z" + content=""" +> I'm fuzzy on the log in bit, I don't know how to bring up a log in page in IkiWiki + +Cheap hack: make a link to `cgiurl(do => prefs)` and the user will have +to press Back a couple of times when they've logged in :-) + +Less-cheap hack: have a CGI plugin that responds to `do=login` by doing +basically the same thing as `IkiWiki::needsignin`, but instead of +returning to the `QUERY_STRING`, return to the HTTP referer, or +a page whose name is passed in the query string, or some such. + +> If I were going to do this I'd want to have a log out button appear +> when the user is logged in. Is it possible to add a log out function to the same plugin? + +I don't see why not; you could create it from Javascript for logged-in +users only. That'd close the bug [[bugs/logout in ikiwiki]] (see that +bug for related ideas). +"""]] diff --git a/doc/forum/TMPL__95__VAR_IS__95__ADMIN/comment_7_bdc5c96022fdb8826b57d68a41ef6ca0._comment b/doc/forum/TMPL__95__VAR_IS__95__ADMIN/comment_7_bdc5c96022fdb8826b57d68a41ef6ca0._comment new file mode 100644 index 000000000..79fd8516f --- /dev/null +++ b/doc/forum/TMPL__95__VAR_IS__95__ADMIN/comment_7_bdc5c96022fdb8826b57d68a41ef6ca0._comment @@ -0,0 +1,8 @@ +[[!comment format=mdwn + username="justint" + ip="24.182.207.250" + subject="Continuing discussion..." + date="2011-02-24T02:59:04Z" + content=""" +Ok, I'll go over to the [[bugs/logout_in_ikiwiki]] page. Thank you for your help. +"""]] diff --git a/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server..mdwn b/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server..mdwn new file mode 100644 index 000000000..4061a7348 --- /dev/null +++ b/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server..mdwn @@ -0,0 +1,11 @@ +Hi, +Installed ikiwiki on my ubuntu (04.10) box; after creating a blog according to your setup instructions I cannot edit files on the web interface, and I get this errer «The requested URL /~jean/blog/ikiwiki.cgi was not found on this server.» +I have no idea what to do (sorry for my ignorance) + +tia, + +> Make sure you have a `~/public_html/ikiwiki.cgi`. Your setup +> file should generate that via the `cgi_wrapper` option. +> +> Maybe you need to follow the [[tips/dot_cgi]] tip to make apache see it. 
+> --[[Joey]] diff --git a/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_1_d36ce6fab90e0a086ac84369af38d205._comment b/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_1_d36ce6fab90e0a086ac84369af38d205._comment new file mode 100644 index 000000000..f95972c4f --- /dev/null +++ b/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_1_d36ce6fab90e0a086ac84369af38d205._comment @@ -0,0 +1,16 @@ +[[!comment format=mdwn + username="jeanm" + subject="comment 1" + date="2010-06-19T13:35:37Z" + content=""" +OK, I followed the dot cgi tip and this error diappears, thanks a lot! So ubuntu doesn't provide a \"working out of the box\" ikiwiki. + +But I get a new error message now when trying to edit a page: + +Error: \"do\" parameter missing + +My plugins now: +add_plugins => [qw{goodstuff websetup comments blogspam 404 muse}], + + +"""]] diff --git a/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_2_5836bba08172d2ddf6a43da87ebb0332._comment b/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_2_5836bba08172d2ddf6a43da87ebb0332._comment new file mode 100644 index 000000000..0a544eeb1 --- /dev/null +++ b/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_2_5836bba08172d2ddf6a43da87ebb0332._comment @@ -0,0 +1,7 @@ +[[!comment format=mdwn + username="http://joey.kitenet.net/" + subject="do parameter missing" + date="2010-06-23T17:03:12Z" + content=""" +That's an unusual problem. Normally the url or form that calls ikiwiki.cgi includes a \"do\" parameter, like \"do=edit\". I'd have to see the site to debug why it is missing for you. +"""]] diff --git a/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_3_4eec15c8c383275db5401c8e3c2d9242._comment b/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_3_4eec15c8c383275db5401c8e3c2d9242._comment new file mode 100644 index 000000000..faf3ad31b --- /dev/null +++ b/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_3_4eec15c8c383275db5401c8e3c2d9242._comment @@ -0,0 +1,9 @@ +[[!comment format=mdwn + username="jeanm" + ip="81.56.145.104" + subject="do parameter missing" + date="2010-06-30T07:30:08Z" + content=""" +the site address is piaffer.org, with a link to blog just over the picture. +tia, +"""]] diff --git a/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_4_43ac867621efb68affa6ae2b92740cad._comment b/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_4_43ac867621efb68affa6ae2b92740cad._comment new file mode 100644 index 000000000..d8b516f5f --- /dev/null +++ b/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_4_43ac867621efb68affa6ae2b92740cad._comment @@ -0,0 +1,10 @@ +[[!comment format=mdwn + username="http://joey.kitenet.net/" + nickname="joey" + subject="comment 4" + date="2010-07-04T18:16:26Z" + content=""" +What is the muse plugin that you have enabled? I am not familiar with it. 
+ +Apparently your ikiwiki is not seeing cgi parameters that should be passed to it. This appears to be some kind of web server misconfiguration, or possibly a broken ikiwiki wrapper or broken CGI.pm. +"""]] diff --git a/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_5_e098723bb12adfb91ab561cae21b492b._comment b/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_5_e098723bb12adfb91ab561cae21b492b._comment new file mode 100644 index 000000000..b832d64f4 --- /dev/null +++ b/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_5_e098723bb12adfb91ab561cae21b492b._comment @@ -0,0 +1,8 @@ +[[!comment format=mdwn + username="http://joey.kitenet.net/" + nickname="joey" + subject="do parameter missing" + date="2010-07-08T06:04:44Z" + content=""" +I just debugged this problem with someone else who was using ngix-fcgi. There was a problem with it not passing CGI environment variables properly. If you're using that, it might explain your problem. +"""]] diff --git a/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_6_101183817ca4394890bd56a7694bedd9._comment b/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_6_101183817ca4394890bd56a7694bedd9._comment new file mode 100644 index 000000000..25a4e8bae --- /dev/null +++ b/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_6_101183817ca4394890bd56a7694bedd9._comment @@ -0,0 +1,10 @@ +[[!comment format=mdwn + username="jeanm" + ip="81.56.145.104" + subject="comment 6" + date="2010-07-12T17:43:31Z" + content=""" +I'm using apache and mostly firefox. +I've tried some changes in my config but still the same problem, then I fell ill and unable to try anything more. Now I seem to be better and I will go back to the problem soon. +Thx +"""]] diff --git a/doc/forum/Upgrade_steps_from_RecentChanges_CGI_to_static_page.mdwn b/doc/forum/Upgrade_steps_from_RecentChanges_CGI_to_static_page.mdwn new file mode 100644 index 000000000..298ff49f1 --- /dev/null +++ b/doc/forum/Upgrade_steps_from_RecentChanges_CGI_to_static_page.mdwn @@ -0,0 +1,10 @@ +Where are the upgrade steps for RecentChanges change from CGI to static feed? +I run multiple ikiwiki-powered sites on multiple servers, but today I just upgraded one to 2.32.3. +Please have a look at +<http://bsdwiki.reedmedia.net/wiki/recentchanges.html> +Any suggestions? + +> There are no upgrade steps required. It does look like you need to enable +> the meta plugin to get a good recentchanges page though.. --[[Joey]] + +[[!meta date="2008-02-23 21:10:42 -0500"]] diff --git a/doc/forum/Various_ways_to_use_Subversion_with_ikiwiki.mdwn b/doc/forum/Various_ways_to_use_Subversion_with_ikiwiki.mdwn new file mode 100644 index 000000000..8eed30cd8 --- /dev/null +++ b/doc/forum/Various_ways_to_use_Subversion_with_ikiwiki.mdwn @@ -0,0 +1,23 @@ +[[!meta date="2007-08-17 03:54:10 +0000"]] + +I'm playing around with various ways that I can use subversion with ikiwiki. + +* Is it possible to have ikiwiki point to a subversion repository which is on a different server? The basic checkin/checkout functionality seems to work but there doesn't seem to be any way to make the post-commit hook work for a non-local server? 
+ +> This is difficult to do since ikiwiki's post-commit wrapper expects to +> run on a machine that contains both the svn repository and the .ikiwiki +> state directory. However, with recent versions of ikiwiki, you can get +> away without running the post-commit wrapper on commit, and all you lose +> is the ability to send commit notification emails. + +> (And now that [[recentchanges]] includes rss, you can just subscribe to +> that, no need to worry about commit notification emails anymore.) + +* Is it possible / sensible to have ikiwiki share a subversion repository with other data (either completely unrelated files or another ikiwiki instance)? This works in part but again the post-commit hook seems problematic. + +--[[AdamShand]] + +> Sure, see ikiwiki's subversion repository for example of non-wiki files +> in the same repo. If you have two wikis in one repository, you will need +> to write a post-commit script that calls the post-commit wrappers for each +> wiki. --[[Joey]] diff --git a/doc/forum/an_alternative_approach_to_structured_data.mdwn b/doc/forum/an_alternative_approach_to_structured_data.mdwn new file mode 100644 index 000000000..6e6af8adb --- /dev/null +++ b/doc/forum/an_alternative_approach_to_structured_data.mdwn @@ -0,0 +1,63 @@ +## First Pass + +Looking at the discussion about [[todo/structured_page_data]], it looks a bit like folks are bogged down in figuring out what *markup* to use for structured page data, something I doubt that people will really agree on. And thus, little progress is made. + +I propose that, rather than worry about what the data looks like, that we take a similar approach +to the way Revision Control Systems are used in ikiwiki: a front-end + back-end approach. +The front-end would be a common interface, where queries are made about the structured data, +and there would be any number of back-ends, which could use whatever markup or format that they desired. + +To that purpose, I've written the [[plugins/contrib/field]] plugin for a possible front-end. +I called it "field" because each page could be considered a "record" where one could request the values of "fields" of that record. +The idea is that back-end plugins would register functions which can be called when the value of a field is desired. + +This is gone into in more depth on the plugin page itself, but I would appreciate feedback and improvements on the approach. +I think it could be really powerful and useful, especially if it becomes part of ikiwiki proper. + +--[[KathrynAndersen]] + +> It looks like an interesting idea. I don't have time right now to look at it in depth, but it looks interesting. -- [[Will]] + +> I agree such a separation makes some sense. But note that the discussion on [[todo/structured_page_data]] +> talks about associating data types with fields for a good reason: It's hard to later develop a good UI for +> querying or modifying a page's data if all the data has an implicit type "string". --[[Joey]] + +>> I'm not sure that having an implicit type of "string" is really such a bad thing. After all, Perl itself manages with just string and number, and easily converts from one to the other. Strong typing is generally used to (a) restrict what can be done with the data and/or (b) restrict how the data is input. The latter could be done with some sort of validated form, but that, too, could be decoupled from looking up and returning the value of a field. 
--[[KathrynAndersen]] + +## Second Pass + +I have written additional plugins which integrate with the [[plugins/contrib/field]] plugin to both set and get structured page data. + +* [[plugins/contrib/getfield]] - query field values inside a page using {{$*fieldname*}} markup +* [[plugins/contrib/ftemplate]] - like [[plugins/template]] but uses "field" data as well as passed-in data +* [[plugins/contrib/ymlfront]] - looks for YAML-format data at the front of a page; this is just one possible back-end for the structured data + +--[[KathrynAndersen]] + +> I'm not an IkiWiki committer ([[Joey]] is the only one I think) +> but I really like the look of this scheme. In particular, +> having `getfield` interop with `field` without being *part of* +> `field` makes me happy, since I'm not very keen on `getfield`'s +> syntax (i.e. "ugh, yet another mini-markup-language without a +> proper escaping mechanism"), but this way people can experiment +> with different syntaxes while keeping `field` for the +> behind-the-scenes bits. +> +>> I've started using `field` on a private site and it's working +>> well for me; I'll try to do some code review on its +>> [[plugins/contrib/field/discussion]] page. --s +> +> My [[plugins/contrib/album]] plugin could benefit from +> integration with `field` for photos' captions and so on, +> probably... I'll try to work on that at some point. +> +> [[plugins/contrib/report]] may be doing too much, though: +> it seems to be an variation on `\[[inline archive="yes"]]`, +> with an enhanced version of sorting, a mini version of +> [[todo/wikitrails]], and some other misc. I suspect it could +> usefully be divided up into discrete features? One good way +> to do that might be to shuffle bits of its functionality into +> the IkiWiki distribution and/or separate plugins, until there's +> nothing left in `report` itself and it can just go away. +> +> --[[smcv]] diff --git a/doc/forum/cleaning_up_discussion_pages_and_the_like.mdwn b/doc/forum/cleaning_up_discussion_pages_and_the_like.mdwn new file mode 100644 index 000000000..35ceae59b --- /dev/null +++ b/doc/forum/cleaning_up_discussion_pages_and_the_like.mdwn @@ -0,0 +1,11 @@ +For example in [[forum/ikiwiki__39__s_notion_of_time]], should one remove the +text about the implementation bug that has been fixed, or should it stay there, +for reference? --[[tschwinge]] + +> I have no problem with cleaning up obsolete stuff in the forum, tips, etc. +> --[[Joey]] + +That's also what I think: such discussions or comments on [[forum]] discussion +pages, or generally on all pages' [[Discussion]] subpages, can be removed if +either they're simply not valid / interesting / ... anymore, or if they've been +used to improve the *real* documentation. --[[tschwinge]] diff --git a/doc/forum/cutpaste.pm_not_only_file-local.mdwn b/doc/forum/cutpaste.pm_not_only_file-local.mdwn new file mode 100644 index 000000000..0c5221cc9 --- /dev/null +++ b/doc/forum/cutpaste.pm_not_only_file-local.mdwn @@ -0,0 +1,14 @@ +I'd like to use the cutpaste plugin, but not only on a file-local basis: fileA +has \[[!cut id=foo text="foo"]], and fileB does \[[!absorb pagenames=fileA]], +and can then use \[[!paste id=foo]]. + +Therefore, I've written an [*absorb* directive / +plugin](http://schwinge.homeip.net/~thomas/tmp/absorb.pm), which is meant to +absorb pages in order to get hold of their *cut* and *copy* directives' +contents. This does work as expected. But it also absorbs page fileA's *meta* +values, like a *meta title*, etc. How to avoid / solve this? 
+ +Alternatively, do you have a better suggestion about how to achieve what I +described in the first paragraph? + +--[[tschwinge]] diff --git a/doc/forum/cutpaste.pm_not_only_file-local/comment_1_497c62f21fd1b87625b806407c72dbad._comment b/doc/forum/cutpaste.pm_not_only_file-local/comment_1_497c62f21fd1b87625b806407c72dbad._comment new file mode 100644 index 000000000..8cc724a72 --- /dev/null +++ b/doc/forum/cutpaste.pm_not_only_file-local/comment_1_497c62f21fd1b87625b806407c72dbad._comment @@ -0,0 +1,8 @@ +[[!comment format=mdwn + username="http://kerravonsen.dreamwidth.org/" + ip="60.241.8.244" + subject="field and getfield and ymlfront" + date="2010-08-12T02:33:54Z" + content=""" +Have you considered trying the [[plugins/contrib/field]] plugin, and its associated plugins? [[plugins/contrib/ymlfront]] can give you the source (\"cut\") and [[plugins/contrib/getfield]] and/or [[plugins/contrib/report]] can get you the value (\"paste\") including the values from other pages. +"""]] diff --git a/doc/forum/debian_backports_update_someone_please.mdwn b/doc/forum/debian_backports_update_someone_please.mdwn new file mode 100644 index 000000000..7102d12a1 --- /dev/null +++ b/doc/forum/debian_backports_update_someone_please.mdwn @@ -0,0 +1,18 @@ +I'm just in the process of deploying ikiwiki and I'd love to use it in the html5 mode instead of in XHTML. Any chance that the ikiwiki's .deb in debian backports will be updated any time soon? + +> Formerer does a good job keeping the backport up-to-date with whatever is in Debian testing. +> Which is the policy of what Backports should contain. So, I just need to stop releasing ikiwiki +> for 2 weeks. :) --[[Joey]] + +>> And are there any chances you doing it... or rather not doing it? + +>>> Sure, I'm busily not doing it right now. Should reach testing in 3 +>>> days. I generally schedule things so a new ikiwiki reaches testing +>>> every 2 weeks to month. Getting important new features and bugfixes out +>>> can take priority though. --[[Joey]] + +>>>> Great! Thanks. + +>>>> Still not available in the backports; did you break the silence on the wire and got back to work [[Joey]]? + +>>>> I was blinded by my stupidity... thanks! diff --git a/doc/forum/double_forward_slash___39____47____47____39___in_the_address_bar.mdwn b/doc/forum/double_forward_slash___39____47____47____39___in_the_address_bar.mdwn new file mode 100644 index 000000000..b5fb2aa18 --- /dev/null +++ b/doc/forum/double_forward_slash___39____47____47____39___in_the_address_bar.mdwn @@ -0,0 +1,3 @@ +Why do all the pages are preceded with a double forward slash in the address bar... ie. http://example.org//ikiwiki/pagespec/ .. maybe someone knows? + +> Sorted, base url in the .setup file had the unnecessary '/' suffix. diff --git a/doc/forum/editing_a_comment.mdwn b/doc/forum/editing_a_comment.mdwn new file mode 100644 index 000000000..eb534365e --- /dev/null +++ b/doc/forum/editing_a_comment.mdwn @@ -0,0 +1,11 @@ +Is it possible to edit a comment? I did not find any button for it. + +> It was a design decision to not allow editing comments via the web +> interface. The thinking being that comments on blogs tend to not allow +> editing, and allowing wiki-style editing by anyone would sort of defeat +> the purpose of comments. +> +> I do think there is room to support more forum-style comments in ikiwiki. +> As long as the comment is not posted by an anonymous user, it would be +> possible to open up editing to the original commenter. One day, perhaps.. 
+> --[[Joey]] diff --git a/doc/forum/editing_the_style_sheet.mdwn b/doc/forum/editing_the_style_sheet.mdwn new file mode 100644 index 000000000..b4aa8c89b --- /dev/null +++ b/doc/forum/editing_the_style_sheet.mdwn @@ -0,0 +1,18 @@ +[[!meta date="2006-12-29 04:19:51 +0000"]] + +It would be nice to be able to edit the stylesheet by means of the cgi. Or is this possible? I wasn't able to achieve it. +Ok, that's my last 2 cents for a while. --[Mazirian](http://mazirian.com) + +> I don't support editing it, but if/when ikiwiki gets [[todo/fileupload]] support, +> it'll be possible to upload a style sheet. (If .css is in the allowed +> extensions list.. no idea how safe that would be, a style sheet is +> probably a great place to put XSS attacks and evil javascript that would +> be filtered out of any regular page in ikiwiki). --[[Joey]] + +>> I hadn't thought of that at all. It's a common feature and one I've +>> relied on safely, because the wikis I am maintaining at the moment +>> are all private and restricted to trusted users. Given that the whole +>> point of ikiwiki is to be able to access and edit via the shell as +>> well as the web, I suppose the features doesn't add a lot. By the +>> way, the w3m mode is brilliant. I haven't tried it yet, but the idea +>> is great. diff --git a/doc/forum/ever-growing_list_of_pages.mdwn b/doc/forum/ever-growing_list_of_pages.mdwn new file mode 100644 index 000000000..9920e34bb --- /dev/null +++ b/doc/forum/ever-growing_list_of_pages.mdwn @@ -0,0 +1,29 @@ +What is overyone's idea about the ever-growing list of pages in bugs/ etc.? +Once linked to `done`, they're removed from the rendered [[bugs]] page -- but +they're still present in the repository. + +Shouldn't there be some clean-up at some point for those that have been +resolved? Or should all of them be kept online forever? + +--[[tschwinge]] + +> To answer a question with a question, what harm does having the done bugs +> around cause? At some point in the future perhaps the number of done pages +> will be large enough to be a time or space concern. Do you think we've +> reached a point now? One advantage of having them around is that people +> running older versions of the Ikiwiki software may find the page explaining +> that the bug is fixed if they perform a search. -- [[Jon]] + +> I like to keep old bugs around. --[[Joey]] + +So, I guess it depends on whether you want to represent the development of the +software (meaning: which bugs are open, which are fixed) *(a)* in a snapshot of +the repository (a checkout; that is, what you see rendered on +<http://ikiwiki.info/>), or *(b)* if that information is to be contained in the +backing repository's revision history only. Both approaches are valid. For +people used to using Git for accessing a project's history, *(b)* is what +they're used to, but for those poor souls ;-) that only use a web browser to +access this database, *(a)* is the more useful approach indeed. For me, using +Git, it is a bit of a hindrance, as, when doing a full-text search for a +keyword on a checkout, I'd frequently hit pages that reported a bug, but are +tagged `done` by now. 
--[[tschwinge]]
diff --git a/doc/forum/field_and_forms.mdwn b/doc/forum/field_and_forms.mdwn
new file mode 100644
index 000000000..97fda1856
--- /dev/null
+++ b/doc/forum/field_and_forms.mdwn
@@ -0,0 +1,13 @@
+Dear ikiwiki users, and especially [[users/KathrynAndersen]] ([[users/rubykat]]):
+have you considered some way of extending ikiwiki to allow some kind of
+on-the-fly generation of web forms to create new pages? These web forms should
+offer as many fields as one has defined in some [[page
+template|plugins/contrib/ftemplate]], and, once POSTed, should create a page
+using that template, with those fields already filled with the values the user
+provided.
+
+I see this as a generalization of the `postform` option of the
+[[ikiwiki/directive/inline]] directive. That option tells ikiwiki to create a
+form with one field already filled (title).
+
+What are your ideas about this?
diff --git a/doc/forum/field_and_forms/comment_1_a0e976cb79f03dcff5e9a4511b90d160._comment b/doc/forum/field_and_forms/comment_1_a0e976cb79f03dcff5e9a4511b90d160._comment
new file mode 100644
index 000000000..3e10dbbd9
--- /dev/null
+++ b/doc/forum/field_and_forms/comment_1_a0e976cb79f03dcff5e9a4511b90d160._comment
@@ -0,0 +1,19 @@
+[[!comment format=mdwn
+ username="http://kerravonsen.dreamwidth.org/"
+ ip="60.241.8.244"
+ subject="Limitations"
+ date="2010-11-23T02:18:52Z"
+ content="""
+I'd already had a look at this idea before you posted this suggestion, and I ran into difficulties.
+So far as I can see, it makes most sense to use the mechanisms already in place for editing pages, and enhance them.
+Unfortunately, the whole edit-page setup expects a template file (by default, when editing pages, editpage.tmpl)
+and anything apart from \"submit\" buttons must have a placeholder in the template file, or it doesn't get displayed at all in the form.
+At least, that's what I've found - I could be mistaken.
+
+But if it's true, that rather puts the kybosh on dynamically generated forms, so far as I can see.
+I mean, if you knew beforehand what all your fields were going to be, you could make a copy of editpage.tmpl, add in your fields where you want, and then make a plugin that uses that template instead of editpage.tmpl, but that's very limited.
+
+If someone could come up with a way of making dynamic forms, that would solve the problem, but I've come up against a brick wall myself. Joey? Anyone?
+
+-- [[KathrynAndersen]]
+"""]]
diff --git a/doc/forum/formating:_how_to_align_text_to_the_right.mdwn b/doc/forum/formating:_how_to_align_text_to_the_right.mdwn
new file mode 100644
index 000000000..2b56bd70b
--- /dev/null
+++ b/doc/forum/formating:_how_to_align_text_to_the_right.mdwn
@@ -0,0 +1,15 @@
+As in the title: how do I align text to the right?
+
+> Add to your local.css a class that aligns text to the right:
+
+    .alignright { text-align: right; }
+
+> And then you just use `<span class="alignright">` around
+> other html.
+>
+> You can refine that, and allow right-aligning markdowned text
+> by using the [[ikiwiki/directive/template]]
+> directive, with a template that contains the html. The
+> [[templates/note]] template does something similar. --[[Joey]]
+
+>> Thanks!
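+
+For reference, here is a minimal, untested sketch of the template approach
+described above. It assumes a template named `alignright.tmpl` in your
+`templatedir` (or a `templates/alignright` page in the wiki) containing
+something like:
+
+    <!-- hypothetical alignright template, used here only for illustration -->
+    <div class="alignright">
+    <TMPL_VAR text>
+    </div>
+
+A page could then right-align a block of markdown with:
+
+    \[[!template id=alignright text="""Some *markdown* text, aligned right."""]]
+
+As with the [[templates/note]] template, the `text` parameter should be
+htmlized when it is filled into the template, so markdown inside it keeps
+working.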
diff --git a/doc/forum/google_openid_broken__63__.mdwn b/doc/forum/google_openid_broken__63__.mdwn index 1be9d0487..96ba2d791 100644 --- a/doc/forum/google_openid_broken__63__.mdwn +++ b/doc/forum/google_openid_broken__63__.mdwn @@ -1,3 +1,11 @@ +Now that google supports using thier profiles as OpenIDs, that can be used +directly to sign into ikiwiki. Just use, for example, +<http://www.google.com/profiles/joeyhess> . Tested and it works. --[[Joey]] + +> This seems to work fine if you use the profile directly as an OpenID. It doesn't seem to work with delegation. From that I can see, this is a deliberate decision by Google for security reasons. See the response [here](http://groups.google.com/group/google-federated-login-api/browse_thread/thread/825067789537568c/23451a68c8b8b057?show_docid=23451a68c8b8b057). -- [[Will]] + +## historical discussion + when I login via to this wiki (or ours) via Google's OpenID, I get this error: Error: OpenID failure: no_identity_server: The provided URL doesn't declare its OpenID identity server. @@ -28,3 +36,44 @@ http://openid-provider.appspot.com/larrylud >> URL directly, it doesn't work. I think there is something weird with re-direction. I hope this >> isn't a more general security hole. >> -- [[Will]] + +---- + +So, while the above bug will probably get fixed sooner or later, +the best approach for those of you needing a google openid now is +to use gmail. + + +Just a note that someone has apparently figured out how to use a google +openid, and not a third-party provider either, to edit this site. +The openid is +<https://www.google.com/accounts/o8/id?id=AItOawltlTwUCL_Fr1siQn94GV65-XwQH5XSku4> +(what a mouthfull!), and I don't know who that is or how to use it since it +points to a fairly useless xml document, rather than a web page. --[[Joey]] + +> That string is what's received via the discovery protocol. The user logging in with a Google account is not supposed to write that when logging in, but rather <https://www.google.com/accounts/o8/id>. The OpenID client library will accept that and redirect the user to a sign in page, which will return that string as the OpenID. It's not really usable as an identifier for edits and whatnots, but an alternative would be to use the attribute exchange extension to get the email address and display that. See <http://code.google.com/apis/accounts/docs/OpenID.html#Parameters>. + +> Yahoo's OpenID implementation works alike, but I haven't looked at it as much. It uses <https://me.yahoo.com/> to receive the endpoint. + +> I've added buttons that submit the two above URLs for logging in with a Google and Yahoo OpenID, respectively, to my locally changed OpenID login plugin. + +> Using the Google profile page as the OpenID is really orthogonal to the above. --[[kaol]] + +>> First, I don't accept that the openid google returns from their +>> generic signin url *has* to be so freaking ugly. For contrast, +>> look at the openid you log in as if you use the yahoo url. +>> <https://me.yahoo.com/joeyhess#35f22>. Nice and clean, now +>> munged by ikiwiki to "joeyhess [me.yahoo.com]". +>> +>> Displaying email addresses is not really an option, because ikiwiki +>> can't leak user email addresses like that. Displaying nicknames or +>> usernames is, see [[todo/Separate_OpenIDs_and_usernames]]. +>> +>> It would probably be good if the openid plugin could be configured with +>> a list of generic openid urls, so it can add quick login buttons using +>> those urls. 
+>> +>> The ugly google url will still be exposed here and there where +>> a unique user id is needed. That can be avoided by not using the generic +>> <https://www.google.com/accounts/o8/id>, but instead your own profile +>> like <http://www.google.com/profiles/joeyhess>. --[[Joey]] diff --git a/doc/forum/how_can_I_use___39____47____39___as_tagbase__63__.mdwn b/doc/forum/how_can_I_use___39____47____39___as_tagbase__63__.mdwn new file mode 100644 index 000000000..8a24152dc --- /dev/null +++ b/doc/forum/how_can_I_use___39____47____39___as_tagbase__63__.mdwn @@ -0,0 +1,13 @@ +I'd like tags to be top-level pages, like /some-tag. + +I achieve this most of the time by *not* defining `tagbase`. + +However, this goes wrong if the name of a tag matches the name of a page further down a tree. + +Example: + + * tag scm, corresponding page /scm + * a page /log/scm tagged 'scm' does not link to /scm + * a page /log/puppet tagged 'scm' links to /log/scm in the Tags: section + +Is this possible, or am I pushing tags too far (again)? -- [[Jon]] diff --git a/doc/forum/how_can_I_use___39____47____39___as_tagbase__63__/comment_1_e7897651ba8d9156526d36d6b7744eae._comment b/doc/forum/how_can_I_use___39____47____39___as_tagbase__63__/comment_1_e7897651ba8d9156526d36d6b7744eae._comment new file mode 100644 index 000000000..361c51b09 --- /dev/null +++ b/doc/forum/how_can_I_use___39____47____39___as_tagbase__63__/comment_1_e7897651ba8d9156526d36d6b7744eae._comment @@ -0,0 +1,8 @@ +[[!comment format=mdwn + username="http://joey.kitenet.net/" + nickname="joey" + subject="comment 1" + date="2010-12-05T20:15:28Z" + content=""" +From the code, it seems to me like setting tagbase to \"/\" would actually do what you want. Does it not work? +"""]] diff --git a/doc/forum/how_do_I_revert_edits_in_the_web_mode__63__.mdwn b/doc/forum/how_do_I_revert_edits_in_the_web_mode__63__.mdwn new file mode 100644 index 000000000..d69b3801b --- /dev/null +++ b/doc/forum/how_do_I_revert_edits_in_the_web_mode__63__.mdwn @@ -0,0 +1,46 @@ +Puzzled a bit :-/ + +> There is no explicit interface for reverting edits. Most of us use `git revert`. --[[Joey]] + +>> That's a blow; I was planning on appointing no techies to keep law and order on our pages :-/ Is there a plugin or at least a plan to add such a 'in demand' feature? + +>>> A lot of things complicate adding that feature to the web interface. +>>> +>>> First, ikiwiki happily uses whatever the VCS's best of breed web +>>> history interface is. (ie, viewvcs, gitweb). To allow reverting +>>> past the bottom of the RecentChanges page, it would need to have its +>>> own history browser. Not sure I want to go there. +>>> +>>> And the mechanics of handling reverting can quickly get complex. +>>> Web reverting should only allow users to revert things they can edit, +>>> but reverting a whole commit in git might touch multiple files. +>>> Some files may not be editable over the web at all. (The +>>> [[tips/untrusted_git_push]] also has to deal with those issues.) +>>> Finally, a revert can fail with a conflict. The revert could touch +>>> multiple files, and multiple ones could conflict. The conflict may +>>> involve non-page files that can't be diffed. So an interface for +>>> resolving such a conflict could be hard. +>>> +>>> Probably web-based reverting would need to be limited to reverting +>>> single file changes, not whole commits, and not having very good +>>> conflict handling. And maybe only being accessible for changes +>>> still visible on RecentChanges. 
With those limitations, it's certianly +>>> doable (as a plugin even), but given how excellent `git revert` is in +>>> comparison, I have not had a real desire to do so. --[[Joey]] + +>>>> Web edits are single-file anyway, so I wouldn't expect web reverts +>>>> to handle the multi-file case. OTOH, I've sometimes wished ikiwiki +>>>> had its own history browser (somewhere down my todo list). --[[schmonz]] + +>>>> Yup, having a possibility to revert a single file would suffice. + +--- + +Perer Gammie and I are working on reversion over at [[todo/web_reversion]]. +--[[Joey]] + +Update: Web reversion is now supported by ikiwiki. Only changes committed +to your wiki after you upgrade to the version of ikiwiki that supports it +will get revert buttons on the RecentChanges page. If you want to force +adding buttons for older changes, you can delete `recentchanges/*._change` +from your srcdir, and rebuild the wiki. --[[Joey]] diff --git a/doc/forum/how_do_I_revert_edits_in_the_web_mode__63__/comment_1_e4720e8e4fe74bd6cba746e8259832e6._comment b/doc/forum/how_do_I_revert_edits_in_the_web_mode__63__/comment_1_e4720e8e4fe74bd6cba746e8259832e6._comment new file mode 100644 index 000000000..597cab2e4 --- /dev/null +++ b/doc/forum/how_do_I_revert_edits_in_the_web_mode__63__/comment_1_e4720e8e4fe74bd6cba746e8259832e6._comment @@ -0,0 +1,8 @@ +[[!comment format=mdwn + username="https://www.google.com/accounts/o8/id?id=AItOawmlZJCPogIE74m6GSCmkbJoMZiWNOlXcjI" + nickname="Ian" + subject="comment 1" + date="2010-09-24T19:01:08Z" + content=""" ++1 for a \"revert\" web plugin which at least handles the simple cases. -- Ian Osgood, The TOVA Company +"""]] diff --git a/doc/forum/how_do_I_translate_a_TWiki_site.mdwn b/doc/forum/how_do_I_translate_a_TWiki_site.mdwn new file mode 100644 index 000000000..5bfdbb86c --- /dev/null +++ b/doc/forum/how_do_I_translate_a_TWiki_site.mdwn @@ -0,0 +1,44 @@ +[[!meta date="2006-12-19 09:56:21 +0000"]] + +# Excellent - how do I translate a TWiki site? + +I just discovered ikiwiki quite by chance, I was looking for a console/terminal +menu system and found pdmenu. So pdmenu brought me to here and I've found ikiwiki! +It looks as if it's just what I've been wanting for a long time. I wanted something +to create mostly text web pages which, as far as possible, have source which is human +readable or at least in a standard format. ikiwiki does this twice over by using +markdown for the source and producing static HTML from it. + +I'm currently using TWiki and have a fair number of pages in that format, does +anyone have any bright ideas for translating? I can knock up awk scripts fairly +easily, perl is possible (but I'm not strong in perl). + +> Let us know if you come up with something to transition from the other +> format. Another option would be writing a ikiwiki plugin to support the +> TWiki format. --[[Joey]] + +> Jamey Sharp and I have a set of scripts in progress to convert other wikis to ikiwiki, including history, so that we can migrate a few of our wikis. We already have support for migrating MoinMoin wikis to ikiwiki, including conversion of the entire history to Git. We used this to convert the [XCB wiki](http://xcb.freedesktop.org/wiki/) to ikiwiki; until we finalize the conversion and put the new wiki in place of the old one, you can browse the converted result at <http://xcb.freedesktop.org/ikiwiki>. 
We already plan to add support for TWiki (including history, since you can just run parsecvs on the TWiki RCS files to get Git), so that we can convert the [Portland State Aerospace Society wiki](http://psas.pdx.edu) (currently in Moin, but with much of its history in TWiki, and with many of its pages still in TWiki format using Jamey's TWiki format for MoinMoin). +> +> Our scripts convert by way of HTML, using portions of the source wiki's code to render as HTML (with some additional code to do things like translate MoinMoin's `\[[TableOfContents]]` to ikiwiki's `\[[!toc ]]`), and then using a modified [[!cpan HTML::WikiConverter]] to turn this into markdown and ikiwiki. This produces quite satisfactory results, apart from things that don't have any markdown equivalent and thus remain HTML, such as tables and definition lists. Conversion of the history occurs by first using another script we wrote to translate MoinMoin history to Git, then using our git-map script to map a transformation over the Git history. +> +> We will post the scripts as soon as we have them complete enough to convert our wikis. +> +> -- [[JoshTriplett]] + +>> Thanks for an excellent Xmas present, I will appreciate the additional +>> users this will help switch to ikiwiki! --[[Joey]] + + +>> Sounds great indeed. Learning from [here](http://www.bddebian.com/~wiki/AboutTheTWikiToIkiwikiConversion/) that HTML::WikiConverter needed for your conversion was not up-to-date on Debian I have now done an unofficial package, including your proposed Markdown patches, apt-get'able at <pre>deb http://debian.jones.dk/ sid wikitools</pre> +>> -- [[JonasSmedegaard]] + + +>>I see the "We will post the scripts ...." was committed about a year ago. A current site search for "Moin" does not turn them up. Any chance of an appearance in the near (end of year) future? +>> +>> -- [[MichaelRasmussen]] + +>>> It appears the scripts were never posted? I recently imported my Mediawiki site into Iki. If it helps, my notes are here: <http://iki.u32.net/Mediawiki_Conversion> --[[sabr]] + +>>>>> The scripts have been posted now, see [[joshtriplett]]'s user page, +>>>>> and I've pulled together all ways I can find to [[convert]] other +>>>>> systems into ikiwiki. --[[Joey]] diff --git a/doc/forum/how_to_add_post_titles_in_ikiwiki_blog__63__.mdwn b/doc/forum/how_to_add_post_titles_in_ikiwiki_blog__63__.mdwn new file mode 100644 index 000000000..68eb06c4c --- /dev/null +++ b/doc/forum/how_to_add_post_titles_in_ikiwiki_blog__63__.mdwn @@ -0,0 +1,28 @@ +Look at these two blogs: + +1) http://ciffer.net/~svend/blog/ + +2) http://upsilon.cc/~zack/blog/ + +Well, i set up successfully my blog (i am using inline function in a wiki page) but i have manually to insert blog pos titles and the result is that of blog #2. +Instead i would like to have blog post titles automatically inserted like blog #1 (and they are links too! I want them that way). +I looked in git repo of the two blogs but i couldn't find the answer. +Any help would be really appreciated. + +Thanks! + +Raf + +> Either name the blog post files with the full title you want them to +> have, or use [[ikiwiki/directive/meta]] title to set the title of a blog post. +> +> \[[!meta title="this is my blog post"]] +> +> Either way, the title will automatically be displayed, clickable, at the top. +> (zack has hacked his templates not to do that). 
--[[Joey]] + +>> Thanks for your answer.<br/> +>> I looked in the [templates](http://git.upsilon.cc/cgi-bin/gitweb.cgi?p=zack-homepage.git;a=tree;f=templates;h=824100e62a06cee41b582ba84fcb9cdd982fe4be;hb=HEAD) folder of zack but couldn't see any hack of that kind.<br/> +>> Anyway, I didn't hack my template...<br/> +>> I will follow your suggestion of using \[[ikiwiki/directive/meta]] title to set titles.<br/> +>> Thanks a lot. --Raf diff --git a/doc/forum/how_to_enable_multimarkdown__63__.mdwn b/doc/forum/how_to_enable_multimarkdown__63__.mdwn new file mode 100644 index 000000000..208aadcb0 --- /dev/null +++ b/doc/forum/how_to_enable_multimarkdown__63__.mdwn @@ -0,0 +1,9 @@ +I enabled multimarkdown in my setup file but I get this message 'remote: multimarkdown is enabled, but Text::MultiMarkdown is not installed'. +I also installed multimarkdown-git for my distro (archlinux), which should take care of installing all required perl modules, I believe. +What am I missing? + +Thanks. + +> You are apparently still missing the [[!cpan Text::MultiMarkdown]] +> perl module. Not being familiar with arch linux, I don't know what +> multimarkdown-git is, so I can't say more than that.. --[[Joey]] diff --git a/doc/forum/how_to_enable_multimarkdown__63__/comment_1_037f858c4d0bcbb708c3efd264379500._comment b/doc/forum/how_to_enable_multimarkdown__63__/comment_1_037f858c4d0bcbb708c3efd264379500._comment new file mode 100644 index 000000000..6045a4a3f --- /dev/null +++ b/doc/forum/how_to_enable_multimarkdown__63__/comment_1_037f858c4d0bcbb708c3efd264379500._comment @@ -0,0 +1,14 @@ +[[!comment format=mdwn + username="https://www.google.com/accounts/o8/id?id=AItOawlzADDUvepOXauF4Aq1VZ4rJaW_Dwrl6xE" + nickname="Dário" + subject="comment 1" + date="2010-07-15T15:37:31Z" + content=""" +multimarkdown-git is a package build that fetches the git version of multimarkdown. +It should install Text::Markdown I believe. +I tried to install it by hand on the cpan command line but it didn't work either: +perl -MCPAN -e shell +install Text::MultiMarkdown + +says couldn't run make file or something. +"""]] diff --git a/doc/forum/how_to_enable_multimarkdown__63__/comment_2_b7d512a535490dabf8d6ce55439741c7._comment b/doc/forum/how_to_enable_multimarkdown__63__/comment_2_b7d512a535490dabf8d6ce55439741c7._comment new file mode 100644 index 000000000..804b71c67 --- /dev/null +++ b/doc/forum/how_to_enable_multimarkdown__63__/comment_2_b7d512a535490dabf8d6ce55439741c7._comment @@ -0,0 +1,8 @@ +[[!comment format=mdwn + username="http://joey.kitenet.net/" + nickname="joey" + subject="comment 2" + date="2010-07-16T19:44:55Z" + content=""" +All I can tell you is that, if multimarkdown is correctly installed (ie, if `perl -e 'use Text::MultiMarkdown'` runs successfully), ikiwiki can use it. +"""]] diff --git a/doc/forum/how_to_load_an_external_page_and_still_have_it_under_ikiwiki_template.mdwn b/doc/forum/how_to_load_an_external_page_and_still_have_it_under_ikiwiki_template.mdwn new file mode 100644 index 000000000..a747911a5 --- /dev/null +++ b/doc/forum/how_to_load_an_external_page_and_still_have_it_under_ikiwiki_template.mdwn @@ -0,0 +1,3 @@ +OK, probably title is bit confusing. Basically I'd like to be able to keep my left hand side menu, which is part of the template, and at the same time load let's say forum on the right hand side, which sits on a separate domain. Is it possible then to construct template that for some special links it runs as lets say in *frameset* mode? 
+
+> I think I'll have to use [[ikiwiki/directive/pagetemplate]] and this <http://stackoverflow.com/questions/153152/resizing-an-iframe-based-on-content> solution.
diff --git a/doc/forum/how_to_login_as_admin.mdwn b/doc/forum/how_to_login_as_admin.mdwn
new file mode 100644
index 000000000..807f82501
--- /dev/null
+++ b/doc/forum/how_to_login_as_admin.mdwn
@@ -0,0 +1,18 @@
+I even managed to set up ikiwiki so it works fine with git; but how on earth do
+I log in as an administrator? In the .setup file the admin user is set to
+'zimek', but when I go and register 'zimek' on the web it appears as a normal user,
+not the administrator. What am I missing?
+
+> That's really all there is to it. The [[automatic_setup|setup]] script
+> registers the admin user for you before the wiki goes live. If you didn't
+> use it, registering the right account name will get you the admin account.
+>
+> The name is case sensitive, perhaps you really spelled one of them `Zimek`?
+>
+> Or maybe you're the admin, and don't know it? Everything looks the same for the admin,
+> except they can edit even locked pages, and can access the websetup interface from their
+> Preferences page, if you have that plugin enabled. --[[Joey]]
+
+>> Maybe I am, indeed. I know that I've disabled all the plugins while installing ikiwiki. Checking it now ;-)
+
+>> Yup, I'm the God of my ikiwiki. (Thanks)
diff --git a/doc/forum/how_to_setup_ikiwiki_on_a_remote_host.mdwn b/doc/forum/how_to_setup_ikiwiki_on_a_remote_host.mdwn
new file mode 100644
index 000000000..1c0f8f561
--- /dev/null
+++ b/doc/forum/how_to_setup_ikiwiki_on_a_remote_host.mdwn
@@ -0,0 +1,35 @@
+Hi all!
+I really like ikiwiki and I tested it on my local machine, but I have one question that I can't answer by reading the documentation (my fault, of course)...
+I have an account and some space on a free hosting service.
+Now, I want to put my ikiwiki on this remote web space so that I can browse it from wherever I want.
+I have my source dir and my git dir on my local machine.
+How can I upload my ikiwiki to the remote host and manage it via git, as I do when I test it locally?
+Where is this specified? Where can I find documentation about it?
+
+Thanks in advance!
+
+Pab
+
+> There are several ways to accomplish this, depending on what you really
+> want to do.
+>
+> If your goal is to continue generating the site locally, but then
+> transfer it to the remote host for serving, you could use the
+> [[plugins/rsync]] plugin.
+>
+> If your goal is to install and run the ikiwiki software on the remote host,
+> then you would follow a similar path to the ones described in these tips:
+> [[tips/nearlyfreespeech]] [[tips/DreamHost]]. Or even [[install]] ikiwiki
+> from a regular package if you have that kind of access. Then you could
+> push changes from your local git to git on the remote host to update the
+> wiki. [[tips/Laptop_wiki_with_git]] explains one way to do that.
+> --[[Joey]]
+
+Thanks a lot for your answer.
+The rsync plugin would be perfect, but... how would I manage blog posts?
+I mean... is it possible to manage an ikiwiki blog too with the rsync plugin, in the way you told me? --Pab
+
+> If you want to allow people to make comments on your blog, no, the rsync plugin will not help, since it will upload a completely static site where nobody can make comments. Comments require a full IkiWiki setup with CGI enabled, so that people add content (comments) from the web. --[[KathrynAndersen]]
+
+Ok, I understand, thanks.
+Is there any hosting service that permits to have a full installation of iwkiwiki or i am forced to get a vps or to mantain a personal server for that? --Pab diff --git a/doc/forum/html_source_pages_in_version_3.20100704.mdwn b/doc/forum/html_source_pages_in_version_3.20100704.mdwn new file mode 100644 index 000000000..7a620fd57 --- /dev/null +++ b/doc/forum/html_source_pages_in_version_3.20100704.mdwn @@ -0,0 +1,8 @@ +Is this different from using the html/rawhtml plugins? + +> I suppose you're talking about this: + + * po: Added support for .html source pages. (intrigeri) + +> That means the [[plugins/po]] plugin is able to translate html pages +> used by the [[plugins/html]] plugin. --[[Joey]] diff --git a/doc/forum/ikiwiki_+_mathjax.mdwn b/doc/forum/ikiwiki_+_mathjax.mdwn new file mode 100644 index 000000000..1279a2c80 --- /dev/null +++ b/doc/forum/ikiwiki_+_mathjax.mdwn @@ -0,0 +1 @@ +Is it possible to use [mathjax](http://www.mathjax.org/) in ikiwiki to typeset math formulas? If so, is this compatible with the [wmd](http://ikiwiki.info/plugins/wmd/) plugin? diff --git a/doc/forum/ikiwiki_+_mathjax/comment_1_8426a985ecfbb02d364116503ef3a0d4._comment b/doc/forum/ikiwiki_+_mathjax/comment_1_8426a985ecfbb02d364116503ef3a0d4._comment new file mode 100644 index 000000000..f5849e7bf --- /dev/null +++ b/doc/forum/ikiwiki_+_mathjax/comment_1_8426a985ecfbb02d364116503ef3a0d4._comment @@ -0,0 +1,8 @@ +[[!comment format=mdwn + username="https://www.google.com/accounts/o8/id?id=AItOawkIfDOOUJ0h_niLRZZL5HsHHOuQfUrVcQo" + nickname="Carl" + subject="Works with mathjax, with a little help" + date="2011-01-02T22:59:27Z" + content=""" +Yes, mathjax works with ikiwiki. The main trouble I ran into was markdown trying to parse the math. For example markdown and tex both use underscores. I wrote a quick plugin to replace all the TeX with strip markers in the 'filter' phase and put it back in the 'sanitize' phase (just replacing all the TeX content with its Base64 representation temporarily) and that seems to have fixed the problem well enough. +"""]] diff --git a/doc/forum/ikiwiki_+_mathjax/comment_2_ddb7a4d59bbe7145167d122a146e8f65._comment b/doc/forum/ikiwiki_+_mathjax/comment_2_ddb7a4d59bbe7145167d122a146e8f65._comment new file mode 100644 index 000000000..af15e0875 --- /dev/null +++ b/doc/forum/ikiwiki_+_mathjax/comment_2_ddb7a4d59bbe7145167d122a146e8f65._comment @@ -0,0 +1,11 @@ +[[!comment format=mdwn + username="https://www.google.com/accounts/o8/id?id=AItOawnl3JHr3pFPOZMsKgx11_mLCbic1Rb3y8s" + nickname="patrick" + subject="comment 2" + date="2011-01-09T09:48:23Z" + content=""" +Hi Carl, +That's great news, I've been looking for a solution like this for some time. +would you mind sharing your patch or write up a small howto? +Thanks +"""]] diff --git a/doc/forum/ikiwiki_and_big_files.mdwn b/doc/forum/ikiwiki_and_big_files.mdwn new file mode 100644 index 000000000..cd41d9fce --- /dev/null +++ b/doc/forum/ikiwiki_and_big_files.mdwn @@ -0,0 +1,102 @@ +My website has 214 hand written html, 1500 of pictures and a few, err sorry, 114 +video files. All this takes around 1.5 GB of disk space at the moment. +Plain html files take 1.7 MB and fit naturally into git. + +But what about the picture and video files? + +Pictures are mostly static and rarely need to be edited after first upload, +wasting a megabyte or two after an edit while having them in git doesn't really matter. +Videos on the other hand are quite large from megabytes to hundreds. 
Sometimes +I re-encode them from the original source with better codec parameters and just +replace the files under html root so they are accessible from the same URL. +So having a way to delete a 200 MB file and upload a new one with same name and access URL +is what I need. And it appears git has trouble erasing commits from history, or requires +some serious gitfoo and good backups of the original repository. + +So which ikiwiki backend could handle piles of large binary files? Or should I go for a separate +data/binary blob directory next to ikiwiki content? + +Further complication is my intention to keep URL compatibility with old handwritten and ikiwiki +based site. Sigh, tough job but luckily just a hobby. + +[-Mikko](http://mcfrisk.kapsi.fi) + +ps. here's how to calculate space taken by html, picture and video files: + + ~/www$ unset sum; for size in $( for ext in htm html txt xml log; \ + do find . -iname "*$ext" -exec stat -c "%s" \{\} \; ; done | xargs ); \ + do sum=$(( $sum + $size )); done ; echo $sum + 1720696 + ~/www$ unset sum; for size in $( for ext in jpg gif jpeg png; \ + do find . -iname "*$ext" -exec stat -c "%s" \{\} \; ; done | xargs ); \ + do sum=$(( $sum + $size )); done ; echo $sum + 46032184 + ~/www$ unset sum; for size in $( for ext in avi dv mpeg mp4; \ + do find . -iname "*$ext" -exec stat -c "%s" \{\} \; ; done | xargs ); \ + do sum=$(( $sum + $size )); done ; echo $sum + 1351890888 + +> One approach is to use the [[plugins/underlay]] plugin to +> configure a separate underlay directory, and put the large +> files in there. Those files will then be copied to the generated +> wiki, but need not be kept in revision control. (Or could be +> revision controlled in a separate repository -- perhaps one using +> a version control system that handles large files better than git; +> or perhaps one that you periodically blow away the old history to +> in order to save space.) +> +> BTW, the `hardlink` setting is a good thing to enable if you +> have large files, as it saves both disk space and copying time. +> --[[Joey]] + +Can underlay plugin handle the case that source and destination directories +are the same? I'd rather have just one copy of these underlay files on the server. + +> No, but enabling hardlinks accomplishes the same effect. --[[Joey]] + +And did I goof in the setup file since I got this: + + $ ikiwiki -setup blog.setup -rebuild --verbose + Can't use string ("/home/users/mcfrisk/www/blog/med") as an ARRAY ref while + "strict refs" in use at + /home/users/mcfrisk/bin/share/perl/5.10.0/IkiWiki/Plugin/underlay.pm line 41. + $ grep underlay blog.setup + add_plugins => [qw{goodstuff websetup comments blogspam html sidebar underlay}], + underlaydir => '/home/users/mcfrisk/bin/share/ikiwiki/basewiki', + # underlay plugin + # extra underlay directories to add + add_underlays => '/home/users/mcfrisk/www/blog/media', + $ egrep "(srcdir|destdir)" blog.setup + srcdir => '/home/users/mcfrisk/blog', + destdir => '/home/users/mcfrisk/www/blog', + # allow symlinks in the path leading to the srcdir (potentially insecure) + allow_symlinks_before_srcdir => 1, + # directory in srcdir that contains directive descriptions + +-Mikko + +> The plugin seems to present a bad default value in the setup file. +> (Fixed in git.) 
A correct configuration would be: + + add_underlays => ['/home/users/mcfrisk/www/blog/media'], + +Umm, doesn't quite fix this yet: + + $ ikiwiki -setup blog.setup -v + Can't use an undefined value as an ARRAY reference at /home/users/mcfrisk/bin/share/perl/5.10.0/IkiWiki + /Plugin/underlay.pm line 44. + $ grep underlay blog.setup + add_plugins => [qw{goodstuff websetup comments blogspam html sidebar underlay}], + underlaydir => '/home/users/mcfrisk/bin/share/ikiwiki/basewiki', + # underlay plugin + # extra underlay directories to add + add_underlays => ['/home/users/mcfrisk/www/blog/media'], + $ ikiwiki --version + ikiwiki version 3.20091032 + +-Mikko + +> Yeah, I've fixed that in git, but you can work around it with this: +> --[[Joey]] + + templatedirs => [], diff --git a/doc/forum/ikiwiki_development_environment_tips.mdwn b/doc/forum/ikiwiki_development_environment_tips.mdwn new file mode 100644 index 000000000..f9c584159 --- /dev/null +++ b/doc/forum/ikiwiki_development_environment_tips.mdwn @@ -0,0 +1,68 @@ +I haven't settled on a comfortable/flexible/quick development environment for hacking on ikiwiki. The VM I host my web pages on is not fast enough to use for RAD and ikiwiki. For developing plugins, it seems a bit heavy-weight to clone the entire ikiwiki repository. I haven't managed to get into a habit of running a cloned version of ikiwiki from it's own dir, rather than installing it (If that's even possible). The ikiwiki site source (source ./doc) is quite large and not a great testbed for hacking (e.g. if you are working on a plugin you need a tailored test suite for that plugin). + +Does anyone have a comfortable setup or tips they would like to share? -- [[Jon]] + +> I've just been setting `libdir` in an existing wiki's setup file. When the plugin's in a decent state, I copy it over to a git checkout and commit. For the plugins I've been working on (auth and VCS), this has been just fine. Are you looking for something more? --[[schmonz]] + +>> I think this suffers from two problems. Firstly, unless you are tracking git +>> master in your existing wiki, there's the possibility that your plugin will +>> not work with a more modern version of ikiwiki (or that it would benefit +>> from using a newly added utility subroutine or similar). + +>>> Unlikely. I don't make changes to the plugin interface that break +>>> existing plugins. (Might change non-exported `IkiWiki::` things +>>> from time to time.) --[[Joey]] + +>> Second, sometimes I +>> find that even writing a plugin can involve making minor changes outside of +>> the plugin code (bug fixes, or moving functionality about). So, I think +>> having some kind of environment built around a git checkout is best. +>> +>> However, this does not address the issue of the tedium writing/maintaining a +>> setup file for testing things. +>> +>> I think I might personally benefit from a more consistent environment (I +>> move from machine-to-machine frequently). -- [[Jon]] + +> If you set `libdir` to point to a checkout of ikiwiki's git repository, +> it will override use of the installed version of ikiwiki, so ikiwiki will +> immediatly use any changed or new `.pm` files (with the exception of +> IkiWiki.pm), and you can use git to manage it all without an installation +> step. If I am modifying IkiWiki.pm, I generally symlink it from +> `/usr/share/perl5/IkiWiki.pm` to my git reporisitory. Granted, not ideal. +> +> I often use my laptop's local version of my personal wiki for testing. 
+> It has enough stuff that I can easily test most things, and if I need +> a test page I just dump test cases on the sandbox. I can make +> any changes necessary during testing and then `git reset --hard +> origin/master` to avoid publishing them. +> +> If the thing I'm testing involves templates, or underlays, +> I will instead use ikiwiki's `docwiki.setup` for testing, modifying it as +> needed, since it is preconfigured to use the templates and underlays +> from ikiwiki's source repository. +> --[[Joey]] + +> I work with Ikiwiki from the git checkout directory the following way. +> +> * instead of running ikiwiki, I wrote the following `mykiwiki` shell script, +> that also allows me to use my custom lib-ifited multimarkdown: + + #!/bin/sh + + MMDSRC="$HOME/src/multimarkdown/lib" + IKIWIKISRC="$HOME/src/ikiwiki" + PLUGINS="$HOME/src/ikiplugins" + + /usr/bin/perl -I"$MMDSRC" -I"$IKIWIKISRC/blib/lib" -I"$PLUGINS" "$IKIWIKISRC/ikiwiki.out" -libdir "$IKIWIKISRC" "$@" + +> * I also have an installed ikiwiki from Debian unstable, from which I only use the base wiki, so my `.setup` has the following configs: + + # additional directory to search for template files + templatedir => '/home/oblomov/src/ikiwiki/templates', + # base wiki source location + underlaydir => '/usr/share/ikiwiki/basewiki', + # extra library and plugin directory + libdir => '/home/oblomov/src/ikiwiki', + +> Hope that helps --GB diff --git a/doc/forum/ikiwiki_vim_integration.mdwn b/doc/forum/ikiwiki_vim_integration.mdwn new file mode 100644 index 000000000..4724807e8 --- /dev/null +++ b/doc/forum/ikiwiki_vim_integration.mdwn @@ -0,0 +1,17 @@ +Hi all. I upgraded the [ikiwiki-nav plugin](http://www.vim.org/scripts/script.php?script_id=2968) +so that now it supports: + + * Jumping to the file corresponding to the wikilink under the cursor. + * Creating the file corresponding to the wikilink under the cursor (including + directories if necessary.) + * Jumping to the previous/next wikilink in the current file. + * Autocomplete link names. + +Download it from [here](http://www.vim.org/scripts/script.php?script_id=2968) + +I've also created a new page unifying all the hints available here to use vim +with ikiwiki files, in [[tips/vim_and_ikiwiki]] + + +--[[jerojasro]] + diff --git a/doc/forum/ikiwiki_vim_syntaxfile.mdwn b/doc/forum/ikiwiki_vim_syntaxfile.mdwn new file mode 100644 index 000000000..e6942cd2d --- /dev/null +++ b/doc/forum/ikiwiki_vim_syntaxfile.mdwn @@ -0,0 +1,26 @@ +See the new syntax file [[here|tips/vim_syntax_highlighting]]. It fixes both of +the problems reported below. + +---- + +Hi all, + +I'm teaching myself how to write syntax files for vim by fixing several issues +(and up to certain extent, taking over the maintenance) of the vim syntax +(highlighting) file for ikiwiki. + +I'd like you to document here which problems you have found, so I can hunt them +and see if I can fix them. + +## Problems Found + + * Arguments of directives with a value of length 1 cause the following text to + be highlighted incorrectly. Example: + + [[!directive param1="val1" param2="1"]] more text ... + + * A named wikilink in a line, followed by text, and then another wikilink, + makes the text in between the links to be incorrectly highlighted. Example: + + \[[a link|alink]] text that appears incorrectly .... 
\[[link]] + diff --git a/doc/forum/installation_and_setup_questions.mdwn b/doc/forum/installation_and_setup_questions.mdwn new file mode 100644 index 000000000..50633ec3f --- /dev/null +++ b/doc/forum/installation_and_setup_questions.mdwn @@ -0,0 +1,52 @@ +[[!meta date="2007-03-02 00:57:08 +0000"]] + +Ikiwiki creates a .ikiwiki directory in my wikiwc working directory. Should I +"svn add .ikiwiki" or add it to svn:ignore? + +> `.ikiwiki` is used by ikiwiki to store internal state. You can add it to +> svn:ignore. --[[Joey]] +> > Thanks a lot. + +Is there an easy way to log via e-mail to some webmaster address, instead +of via syslog? + +> Not sure why you'd want to do that, but couldn't you use a tool like +> logwatch to mail selected lines from the syslog? --[[Joey]] + +> > The reason is that I'm not logged in on the web server regularly to +> > check the log files. I'll see whether I can install a logwatch instance. + +I'm trying to install from scratch on a CentOS 4.6 system. I installed perl 5.8.8 from source and then added all the required modules via CPAN. When I build ikiwiki from the tarball, I get this message: + + rendering todo/calendar_--_archive_browsing_via_a_calendar_frontend.mdwn + *** glibc detected *** double free or corruption (!prev): 0x0922e478 *** + make: *** [extra_build] Aborted + +I'm kind of at a loss how to track this down or work around it. Any suggestions? --Monty + +> All I can tell you is that it looks like a problem with your C library or +> perl. Little perl programs like ikiwiki should only be able to trigger +> such bugs, not contain them. :-) Sorry I can't be of more help. +> --[[Joey]] + +> I had a similar problem after upgrading to the latest version of +> Text::Markdown from CPAN. You might try either looking for a Markdown +> package for CentOS or using the latest version of John Gruber's +> Markdown.pl: +> <http://daringfireball.net/projects/downloads/Markdown_1.0.2b8.tbz> +> --[[JasonBlevins]], April 1, 2008 18:22 EDT + +>> Unfortunately I couldn't find a CentOS package for markdown, and I +>> couldn't quite figure out how to use John Gruber's version instead. +>> I tried copying it to site_perl, etc., but the build doesn't pick +>> it up. For now I can just play with it on my Ubuntu laptop for which +>> the debian package installed flawlessly. I'll probably wait for an +>> updated version of Markdown to see if this is fixed in the future. +>> --Monty + +>I suggest that you pull an older version of Text::Markdown from CPAN. I am using <http://backpan.perl.org/authors/id/B/BO/BOBTFISH/Text-Markdown-1.0.5.tar.gz> and that works just fine. +>There is a step change in version and size between this version (dated 11Jan2008) and the next version (1.0.12 dated 18Feb2008). I shall have a little look to see why, in due course. +>Ubuntu Hardy Heron has a debian package now, but that does not work either. +> --Dirk 22Apr2008 + +> This might be related to [Text::Markdown bug #37297](http://rt.cpan.org/Public/Bug/Display.html?id=37297).--ChapmanFlack 9Jul2008 diff --git a/doc/forum/installation_as_non-root_user.mdwn b/doc/forum/installation_as_non-root_user.mdwn new file mode 100644 index 000000000..4997af2ac --- /dev/null +++ b/doc/forum/installation_as_non-root_user.mdwn @@ -0,0 +1,7 @@ +I'd like to install ikiwiki as a non-root user. 
I can plow through getting all the +perl dependencies installed because that's well documented in the perl world, +but I don't know how to tell ikiwiki to install somewhere other than / --BrianWilson + +> Checkout the tips section for [[tips/DreamHost]]. It should do the trick. --MattReynolds + +[[!meta date="2008-01-13 16:02:52 -0500"]] diff --git a/doc/forum/installation_of_selected_docs.mdwn b/doc/forum/installation_of_selected_docs.mdwn new file mode 100644 index 000000000..81dd1ee00 --- /dev/null +++ b/doc/forum/installation_of_selected_docs.mdwn @@ -0,0 +1,29 @@ +[[!meta date="2007-09-06 19:47:23 +0000"]] + +# Installation of selected docs (html) + +The latest release has around 560 files (over 2MB) in html. + +Any suggestions or ideas on limiting what html is installed? + +For example, I don't see value in every ikiwiki install out there to also install personal "users" ikiwiki pages. + +For now I copy ikiwiki.setup. And then use pax with -L switch to copy the targets of the symlinks of the basewiki. + +I was thinking of making a list of desired documents from the html directory to install. + +--JeremyReed + +> You don't need any of them, unless you want to read ikiwiki's docs locally. +> +> I don't understand why you're installing the basewiki files manually; +> ikiwiki has a Makefile that will do this for you. --[[Joey]] + +>> The Makefile's install doesn't do what I want so I use different installer for it. +>> It assumes wrong location for man pages for me. (And it should consider using INSTALLVENDORMAN1DIR and +>> MAN1EXT but I don't know about section 8 since I don't know of perl value for that.) +>> I don't want w3m cgi installed; it is optional for my package. +>> I will just patch for that instead of using my own installer. +>> Note: I am working on the pkgsrc package build specification for this. This is for creating +>> packages for NetBSD, DragonFly and other systems that use pkgsrc package system. +>> --JeremyReed diff --git a/doc/forum/link_autocompletion_in_vim.mdwn b/doc/forum/link_autocompletion_in_vim.mdwn new file mode 100644 index 000000000..a46c7e4c1 --- /dev/null +++ b/doc/forum/link_autocompletion_in_vim.mdwn @@ -0,0 +1,22 @@ +This page is deprecated. See [[tips/vim_and_ikiwiki]] for the most up to date +content. + +------ + +I extended the functionality of the [ikiwiki-nav plugin](http://www.vim.org/scripts/script.php?script_id=2968) +(see [[here|tips/vim_ikiwiki_ftplugin]]) to allow completion of +wikilinks from inside vim, through the omnicompletion mechanism. + +It still has some bugs, but is usable, and will not destroy your data. It can +only complete links whose definition (text) is on a single line, and still can't +handle "named links" (`\[\[text|link\]\]`). + +I'd love to hear suggestions for improvement for it, and bug reports ;) For +example, regarding how are sorted and presented the available completions +(dates, alphabetically, etc). + +You can find a tarball for it +[here](http://devnull.li/~jerojasro/ikiwiki-nav-dev.tar.gz). To install it, +extract the tarball contents in your `.vim` directory. 
+ +--[[jerojasro]] diff --git a/doc/forum/link_to_an_image_inside_the_wiki_without_inlining_it.mdwn b/doc/forum/link_to_an_image_inside_the_wiki_without_inlining_it.mdwn new file mode 100644 index 000000000..3f2713678 --- /dev/null +++ b/doc/forum/link_to_an_image_inside_the_wiki_without_inlining_it.mdwn @@ -0,0 +1,72 @@ +[[!template id=gitbranch branch=wtk/linktoimageonly author="[[wtk]]"]] + +how can I create a link to an image which is part of the wiki, without having it inserted in my page? + +I tought this: + + \[[look at this|img/lolcat.png]] + +would work, but it doesn't. + +Any hints? --[[jerojasro]] + +> Well, currently the syntax above will display the image +> inline with the specified link text used as an alt attribute. Although +> that does not seem to be documented anywhere. +> +> A few places that use that (found with `git grep '\[\[' | egrep 'png|gif|jpeg|jpg' |grep \|`): +> +> * [[logo]] uses it to provide useful alt texts for the logos. (This +> could easily be changed to use [[ikiwiki/directive/img]] though.) +> * The `change.tmpl` template uses it to display +> the [[diff|wikiicons/diff.png]] with a very useful "diff" alt text. +> Using [[ikiwiki/directive/img]] here would mean that the +> [[plugins/recentchanges]] plugin would depend upon the img +> plugin. +> +> I do like your suggestion, it makes more sense than the current behavior. +> I'm not sure the transition pain to get from here to there is worth it, +> though. +> +> More broadly, if I were writing ikiwiki now, I might choose to leave out the +> auto-inlining of images altogether. In practice, it has added a certian level +> of complexity to ikiwiki, with numerous plugins needing to specify +> `noimageinline` to avoid accidentially inlining an image. And there has not +> been a lot of payoff from having the auto-inlining feature implicitly +> available most places. And the img directive allows much needed control over +> display, so it would be better for users to not have to worry about its +> lesser cousin. But the transition from here to *there* would be another order +> of pain. +> +> Anyway, the cheap and simple answer to your question is to use html +> or markdown instead of a [[ikiwiki/wikilink]]. Ie, +> `[look at this](img/lolcat.jpg)`. --[[Joey]] + +> > thanks a lot, that's a quite straightforward solution. I actually wrote a +> > broken plugin to do that, and now I can ditch it --[[jerojasro]] + +>>> The plugin approach is not a bad idea if you want either the ability +>>> to: +>>> +>>> * Have things that are wikilink-aware (like [[plugins/brokenlinks]] +>>> treat your link to the image as a wikilink. +>>> * Use standard wikilink path stuff (and not have to worry about +>>> a relative html link breaking if the page it's on is inlined, for +>>> example). +>>> +>>> I can help you bang that plugin into shape if need be. --[[Joey]] + +>>>> both my plugin and your suggestion yield broken html links when inlining the page (although probably that's what is expected from your suggestion (`[]()`)) +>>>> +>>>> I thought using the `bestlink` function would take care of that, but alas, it doesn't. 
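(Aside, and only my reading of the API: `bestlink` resolves a link to a page name, nothing more, so on its own it can't produce hrefs that survive inlining. A minimal fragment, assuming it sits inside the hypothetical plugin's preprocess function and reuses the `$params{"img"}` parameter name from the snippet below:)

    # bestlink() resolves the link relative to the page it appears on,
    # but only returns a page name -- it produces no <a> element and
    # knows nothing about $params{destpage}
    my $target = bestlink($params{page}, $params{"img"});
    # e.g. $target might now be "img/lolcat" -- still not a usable href
    # once the page gets inlined somewhere else
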
+>>>> Get the "plugin" [here](http://devnull.li/~jerojasro/files/linktoimgonly.pm), see the broken +>>>> links generated [here](http://devnull.li/~jerojasro/blog/posts/job_offers/) and the source +>>>> file for that page [here](http://git.devnull.li/cgi-bin/gitweb.cgi?p=blog-jerojasro.git;a=blob;f=posts/job_offers.mdwn;hb=HEAD) --[[jerojasro]] + +>>>>> Use this --[[Joey]] + + return htmllink($params{page}, $params{destpage}, $params{"img"}, + linktext => $params{text}, + noimageinline => 1); + +> [[patch]]: I've updated this plugin for the current ikiwiki. --[[wtk]] diff --git a/doc/forum/multi_domain_setup_possible__63__.mdwn b/doc/forum/multi_domain_setup_possible__63__.mdwn new file mode 100644 index 000000000..01b31aafb --- /dev/null +++ b/doc/forum/multi_domain_setup_possible__63__.mdwn @@ -0,0 +1,15 @@ +Hi! I am searching for a replacement of my blog and webpages made off static HTML with just some custom PHP around it for years already. ikiwiki seems to be one of the hot candidates, since it uses a RCS. + +I would like to have a multi domain setup like this: + +- myname.private.de => more of a personal page +- professional.de => more of my professional work related page +- and possibly others + +Now when I write a blog entry about some Linux, Debian or KDE stuff, I possibly would like to have it shown on my private and my professional domain. + +And I might like to use some kind of inter wiki links now and then. + +Is such a setup possible? I thought about have a big wiki with Apache serving sub directories from it under different domains, but then wiki links like would not work. + +Maybe having the same blog entry, same content on several domains is not such a hot idea, but as long as I do not see a problem with it, I'd like to do it. diff --git a/doc/forum/multi_domain_setup_possible__63__/comment_1_43f5df30d09046ccc4f7c44703979a11._comment b/doc/forum/multi_domain_setup_possible__63__/comment_1_43f5df30d09046ccc4f7c44703979a11._comment new file mode 100644 index 000000000..5b00272b3 --- /dev/null +++ b/doc/forum/multi_domain_setup_possible__63__/comment_1_43f5df30d09046ccc4f7c44703979a11._comment @@ -0,0 +1,17 @@ +[[!comment format=mdwn + username="http://joey.kitenet.net/" + nickname="joey" + subject="branches" + date="2010-08-15T16:06:43Z" + content=""" +This is where the git backend (or bzr if you prefer) shines. Make a site, and then branch it to a second site, and put your personal type stuff on the branch. cherry-pick or merge changes from one branch to another. + +The possibility to do this kind of thing is why our recently launched Ikiwiki hosting service is called +[Branchable.com](http://branchable.com). It makes it easy to create branches of a Ikiwiki site hosted +there: <http://www.branchable.com/tips/branching_an_existing_site/> +(Merging between branches need manual git, for now.) + +BTW, for links between the branched wikis you can just use the [[plugins/shortcut]] plugin. 
+ +--[[Joey]] +"""]] diff --git a/doc/forum/multi_domain_setup_possible__63__/comment_2_75d6581f81b71fb8acbe3561047ea759._comment b/doc/forum/multi_domain_setup_possible__63__/comment_2_75d6581f81b71fb8acbe3561047ea759._comment new file mode 100644 index 000000000..473f52f5c --- /dev/null +++ b/doc/forum/multi_domain_setup_possible__63__/comment_2_75d6581f81b71fb8acbe3561047ea759._comment @@ -0,0 +1,16 @@ +[[!comment format=mdwn + username="http://claimid.com/helios" + nickname="helios" + subject="branches" + date="2010-08-15T16:18:35Z" + content=""" +So I I just put a blog entry, which is just a file on both branches. Seems I have to learn cherry-picking and merging only some changes. + +Still I am duplicating files then and when I edit one file I have to think to also edit the other one or merge the change to it. I thought of a way to tag a blog entry on which site it should appear. And then I just have to edit one file and contents changes on all sites that share it. + +But then I possibly can do some master blog / shared content branch, so that shared content is only stored once. Then I need to find a way to automatically replicate the changes there to all sites it belongs too. But how do I store it. + +I also thought about just using symlinks for files. Can I have two sites in one repository and symlink shared files stuff around? I know bzr can version control symlinks. + +Hmmm, I think I better read more about branching, cherry-picking and merging before I proceed. I used bzr and git, but from the user interface side of things prefer bzr, which should be fast enough for this use case. +"""]] diff --git a/doc/forum/navigation_of_wiki_pages_on_local_filesystem_with_vim.mdwn b/doc/forum/navigation_of_wiki_pages_on_local_filesystem_with_vim.mdwn new file mode 100644 index 000000000..7bfcf3088 --- /dev/null +++ b/doc/forum/navigation_of_wiki_pages_on_local_filesystem_with_vim.mdwn @@ -0,0 +1,130 @@ +**UPDATE** I have created a [[page|tips/follow_wikilinks_from_inside_vim]] in +the tips section about the plugin, how to get it, install it and use it. Check +that out. --[[jerojasro]] + +I wrote a vim function to help me navigate the wiki when I'm editing it. It extends the 'gf' (goto file) functionality. Once installed, you place the cursor on a wiki page name and press 'gf' (without the quotes); if the file exists, it gets loaded. + +This function takes into account the ikiwiki linking rules when deciding which file to go to. + +> 'gf' gets in the way when there are directories with the same name of a wiki page. The +> function below doesn't implement the linking rules properly (test the link (ignoring case), +> if there is no match ascend the dir. hierarchy and start over, until we reach the root of +> the wiki). I'm rewriting it to follow these rules properly +> +> I think the page for [[LinkingRules|ikiwiki/subpage/linkingrules]] should say that ikiwiki **ascends** +> the dir. hierarchy when looking for a wikilink, not that it **descends** it. Am I correct? --[[jerojasro]] + +>> Conventionally, the root directory is considered to be lower than other +>> directories, so I think the current wording is correct. 
--[[Joey]] + +let me know what you think + +> " NOTE: the root of the wiki is considered the first directory that contains a +> " .ikiwiki folder, except $HOME/.ikiwiki (the usual ikiwiki libdir) +> +> That's not going to work in all situations; for example, with an ikiwiki which uses git as the backend, the normal setup is that one has +> +> * a bare git repository +> * a git repository which ikiwiki builds the wiki from (which has a .ikiwiki directory in it) +> * an *additional* git repository cloned from the bare repository, which is used for making changes from the command-line rather than the web. It is this repository in which one would be editing files with vim, and *this* repository does not have a .ikiwiki directory in it. It does have a .git directory in the root, however, so I suppose you could use that as a method of detection of a root directory, but of course that would only work for git repositories. +> +> -- [[KathrynAndersen]] +> +>> You are completely right; all of my wikis are compiled both locally and +>> remotely, and so the local repo also has a `.ikiwiki` folder. And that's not the +>> "usual" setup. +>> +>> checking for a `.git` dir would not work when the wiki's source files aren't +>> located at the root of the repo. +>> +>> So, besides of doing a `touch .ikiwiki` at the root of the wiki in your local +>> repo, do you see any alternative? +>> +>> -- [[jerojasro]] + +well. I've rewritten the whole thing, to take into account: + + * file matching ignoring case (MyPage matches mypage.mdwn) + * checking all the way down (up) to the root of the wiki (if there is a link `\[[foo]]` on `a/b/page`), + try `a/b/page/foo`, then `a/b/foo`, and so on, up to `foo` + * the alternate name for a page: when looking for the file for `\[[foo]]`, try both `foo.mdwn` and `foo/index.mdwn` + +you can find the file [here](http://git.devnull.li/cgi-bin/gitweb.cgi?p=vim-jerojasro.git;a=blob;f=.vim/ftplugin/ikiwiki_nav.vim;hb=HEAD). To use it, place it in `$HOME/.vim/ftplugin`. After that, hitting `<CR>` (Enter) in normal mode over a wikilink will take you to that page, if it exists. + +the plugin has, as of now, two problems: + + * doesn't work with wikilinks that take more than one line (though this isn't really that bad) + * it assumes that the root of the wiki is the first directory down the filesystem hierarchy that + has a `.ikiwiki` folder on it. If your copy of the wiki doesn't have it, you must create it for + the plugin to work + +-- [[jerojasro]] + +> Interesting. I was at one point looking at "potwiki.vim", which implements a local wiki and follows CamelCase links, creating new files where necessary etc., to see if it could be adapted for ikiwiki (See [[tips/vim syntax highlighting/discussion]]). I didn't get anywhere. -- [[Jon]] + +>> when I wrote the plugin I also considered the possibility of creating files (and their dirs, if necessary) +>> from new wikilinks; the changes needed to get that working are fairly small -- [[jerojasro]] + +> Seems about ready for me to think about pulling it into ikiwiki +> alongside [[tips/vim_syntax_highlighting/ikiwiki.vim]]. If you'll +> please slap a license on it. :) --[[Joey]] +> +>> GPL version 2 or later (if that doesn't cause any problems here). I'll add it +>> to the file --[[jerojasro]] +>> +>>> I see you've put the plugin on vim.org. Do you think it makes sense to +>>> also include a copy in ikiwiki? --[[Joey]] +>>> +>>>> mmm, no. There would be two copies of it, and the git repo. 
I'd rather have +>>>> a unique place for the "official" version (vim.org), and another for the dev +>>>> version (its git repo). +>>>> +>>>> actually, I would also suggest to upload the [[`ikiwiki.vim`|tips/vim_syntax_highlighting]] file to vim.org --[[jerojasro]] +>>>>> +>>>>> If you have any interest in maintaining the syntax highlighting +>>>>> plugin and putting it there, I'd be fine with that. I think it needs +>>>>> some slight work to catch up with changes to ikiwiki's directives +>>>>> (!-prefixed now), and wikilinks (able to have spaces now). --[[Joey]] + +<a id='syn-maintenance'> + +>>>>> +>>>>>> I don't really know too much about syntax definitions in vim. But I'll give it a stab. I know it fails when there are 2 \[[my text|link]] wikilinks in the same page. +>>>>>> I'm not promising anything, though ;) --[[jerojasro]] +> +> Also, I have a possible other approach for finding ikiwiki's root. One +> could consider that any subdirectory of an ikiwiki wiki is itself +> a standalone wiki, though probably one missing a toplevel index page. +> The relative wikilinks work such that this assumption makes sense; +> you can build any subdirectory with ikiwiki and probably get something +> reasonable with links that work, etc. +> +> So, if that's the case, then one could say that the directory that the +> user considers to be the toplevel of their wiki is really also a subwiki, +> enclosed in a succession of parents that go all the way down to the root +> directory (or alternatively, to the user's home directory). I think that +> logically makes some sense. +> +> And if that's the case, you can resolve an absolute link by looking for +> the page closest to the root that matches the link. +> +>> I like your idea; it doesn't alter the matching of the relative links, and +>> should work fine with absolute links too. I'll implement it, though I see +>> some potential (but small) issues with it --[[jerojasro]] +> +> It may even make sense to change ikiwiki's own handling of "absolute" +> links to work that way. But even without changing ikiwiki, I think it +> would be a reasonable thing for vim to do. It would only fail in two +> unusual circumstances: +> +> 1. There is a file further down, outside what the user considers +> the wiki, that matches. Say a `$HOME/index.mdwn` +> 2. An absolute link is broken in that the page linked to does +> not exist in the root of the wiki. But it does exist in a subdir, +> and vim would go to that file. +> +> --[[Joey]] +> +>> your approach will add more noise when the plugin grows the page-creation +>> feature, since there will be no real root to limit the possible locations for +>> the new page. But it is far better than demanding for a `.ikiwiki` dir --[[jerojasro]] diff --git a/doc/forum/postsignin_redirect_not_working.mdwn b/doc/forum/postsignin_redirect_not_working.mdwn new file mode 100644 index 000000000..bc8855b7b --- /dev/null +++ b/doc/forum/postsignin_redirect_not_working.mdwn @@ -0,0 +1,30 @@ +I'm confused. I got a plugin working that allows a button to call up a login screen but I can't seem to get it to return to the calling page. I end up on the prefs page. + +When the plugin first runs it puts the http_referer into a param: + + $session->param("postsignin" => $ENV{HTTP_REFERER} ); + +Then when it runs for postsignin its supposed to pull it out and send the user to the original page: + + my $page=$q->param("postsignin"); + ... + IkiWiki::redirect($q, $page); + exit; + +Full code is available on the plugin page: [[plugins/contrib/justlogin]]. 
+ +I searched the site and there's very little info available for postsignin or redirect. Perhaps I'm using the wrong function? + +> I don't know why you end up on the prefs page. Have you tried +> looking inside the session database to see what postsignin +> parameter is stored? +> +> But, `cgi_postsignin()` assumes it can directly pass the postsignin cgi +> parameter into `cgi()`. You're expecting it to redirect to an url, and it +> just doesn't do that. Although I have considered adding a redirect +> there, just so that openid login info doesn't appear in the url after +> signin (which breaks eg, reload). That would likely still not make your +> code work, since the value of postsignin is a url query string, not a +> full url. +> +> I'd suggest you put a do=goto redirect into postsignin. --[[Joey]] diff --git a/doc/forum/recentchanges_dir_should_be_under_control_of_RCS__63__.mdwn b/doc/forum/recentchanges_dir_should_be_under_control_of_RCS__63__.mdwn new file mode 100644 index 000000000..2fe97366b --- /dev/null +++ b/doc/forum/recentchanges_dir_should_be_under_control_of_RCS__63__.mdwn @@ -0,0 +1,105 @@ +Hello Joey, + +I noticed that my Ikiwiki started to rebuild pages very slowly after my last changes +when I upgraded Ikiwiki to version 3.20100623. Now I have the latest release 3.20100704, +but it doesn't help me. + +I started to debug the problem and I found that I can see a lot of messages +like below when I try to rebuild my wiki manually: + + svn: '/path/to/ikiwiki/trunk/pages/ostatnie_zmiany' is not a working copy + svn: Can't open file '/path/to/ikiwiki/trunk/pages/ostatnie_zmiany/.svn/entries': No such file or directory + svn log exited 256 + +"ostatnie_zmiany" is a value of `recentchangespage` parameter in my +`ikiwiki.setup` file. It is not under control Subversion I use for Ikiwiki: + + $ svn status pages/ostatnie_zmiany + ? pages/ostatnie_zmiany + + $ ls pages/ostatnie_zmiany/*._change |wc -l + 100 + +`recentchangesnum` parameter has value 100 for me and I noticed that my Ikiwiki +takes a lot of time to parse all `._change` files. Finally it doesn't refresh +/ostatnie_zmiany.html page. + +Do you think I should add `ostatnie_zmiany` directory under control of my +Subversion repo? If it's not necessary, could you please give me any hint +to find a reason of problem with my Ikiwiki? + +My best regards, +Pawel + +> No, the recentchanges pages are automatically generated and should not +> themselves be in revision control. +> +> Ikiwiki has recently started automatically enabing `--gettime`, but +> it should not do it every time, but only on the initial build +> of a wiki. It will print "querying svn for file creation and modification +> times.." when it does this. If it's doing it every time, something +> is wrong. (Specifically, `.ikiwiki/indexdb` must be missing somehow.) +> +> The support for svn with --gettime is rather poor. (While with git it is +> quite fast.) But as it's only supposed to happen on the first build, +> I haven't tried to speed it up. It would be hard to do it fast with svn. +> It would be possible to avoid the warning message above, or even skip +> processing files in directories not checked into svn -- but I'd much +> rather understand why you are seeing this happen on repeated builds. +> --[[Joey]] + +>> Thanks a lot for your reply! I've just checked my `rebuild-pages.sh` +>> script and discovered that it contains +>> `/usr/bin/ikiwiki --setup ikiwiki.setup --gettime` command... :D +>> The warnings disappeared when I removed `--gettime` parameter. 
+>> Sorry for confusing! :) +>> +>> I have `.ikiwiki/indexdb` file here, but I noticed that it has been +>> modified about 1 minute **after** last Subversion commit: +>> +>> $ LANG=C svn up +>> At revision 5951. +>> +>> $ LANG=C svn log -r 5951 +>> ------------------------------------------------------------------------ +>> r5951 | svn | 2010-07-06 09:02:30 +0200 (Tue, 06 Jul 2010) | 1 line +>> +>> web commit by xahil +>> ------------------------------------------------------------------------ +>> +>> $ LANG=C stat pages/.ikiwiki/indexdb +>> File: `pages/.ikiwiki/indexdb' +>> Size: 184520 Blocks: 368 IO Block: 131072 regular file +>> Device: 2bh/43d Inode: 1931145 Links: 1 +>> Access: (0644/-rw-r--r--) Uid: ( 1005/ svn) Gid: ( 1005/ svn) +>> Access: 2010-07-06 12:06:24.000000000 +0200 +>> Modify: 2010-07-06 09:03:38.000000000 +0200 +>> Change: 2010-07-06 09:03:38.000000000 +0200 +>> +>> I believe it's the time I have to wait to see that my wiki page has been rebuilt. +>> Do you have any idea how to find a reason of that delay? --[[Paweł|ptecza]] + +>>> Well, I hope that your svn post-commit hook is not running your +>>> `rebuild-pages.sh`. That script rebuilds everything, rather than just +>>> refreshing what's been changed. +>>> +>>> Using subversion is not asking for speed. Especially if your svn +>>> repository is on a remote host. You might try disabling +>>> recentchanges and see if that speeds up the refreshes (it will avoid +>>> one `svn log`). +>>> +>>> Otherwise, take a look at [[tips/optimising_ikiwiki]] +>>> for some advice on things that can make ikiwiki run slowly. --[[Joey]] + +>>>> Thanks for the hints! I don't understand it, but it seems that refreshing +>>>> all pages has resolved the problem and now my wiki works well again :) +>>>> +>>>> No, I use `rebuild-pages.sh` script only when I want to rebuild +>>>> my wiki manually, for example when you release new Ikiwiki version +>>>> then I need to update my templates. Some of them have been translated +>>>> to Polish by me. +>>>> +>>>> Fortunately my wiki and its Subversion repo are located on the same host. +>>>> We have a lot of Subversion repos for our projects and I don't want to +>>>> change only wiki repo for better performance. I'm rather satisfied with +>>>> its speed. --[[Paweł|ptecza]] diff --git a/doc/forum/recovering_original_title_with_meta_directive.mdwn b/doc/forum/recovering_original_title_with_meta_directive.mdwn new file mode 100644 index 000000000..ad0b02a9e --- /dev/null +++ b/doc/forum/recovering_original_title_with_meta_directive.mdwn @@ -0,0 +1 @@ +When using the \[[!meta title=""]] directive, the documentation states that a template variable is set when the title is overridden. However, how does one recover the original page title? --[[Glenn|geychaner@mac.com]] diff --git a/doc/forum/remove_css__63__.mdwn b/doc/forum/remove_css__63__.mdwn new file mode 100644 index 000000000..80da621d6 --- /dev/null +++ b/doc/forum/remove_css__63__.mdwn @@ -0,0 +1,5 @@ +I removed a local.css file and pushed the changes to git but the 'compiled' wiki still shows the same css. +Is this a bug or you are supposed to remove the css by hand? +ikiwiki version 3.20100705 + +> It's a [[bug|bugs/underlaydir_file_expose]]. 
--[[Joey]] diff --git a/doc/forum/report_pagination.mdwn b/doc/forum/report_pagination.mdwn new file mode 100644 index 000000000..03a77b16d --- /dev/null +++ b/doc/forum/report_pagination.mdwn @@ -0,0 +1,18 @@ +I am thinking of adding pagination to the [[plugins/contrib/report]] plugin, but I'm not sure which is the best approach to take. (By "pagination" I mean breaking up a report into multiple pages with N entries per page.) + +Approaches: + +1. generate additional HTML files on the fly which are placed in the sub-directory for the page the report is on. These are not "pages", they are not under revision control, they aren't in the %pagesources hash etc. But using the `will_render` mechanism assures that they will be removed when they are no longer needed. + +2. create new pages which each have a report directive which shows a subset of the full result; add them to revision control, treat them as full pages. Problems with this are: (a) trying to figure out when to create these new pages and when not to, (b) whether or not these pages can be deleted automatically. + +3. some other approach I haven't thought of. + +I'm afraid that whatever approach I take, it will end up being a kludge. + +> Well, it should be perfectly fine, and non-kludgy for a single page to +> generate multiple html files. Just make sure that html files have names +> that won't conflict with the html files generated by some other page. +> +> Of course the caveat is that since such files are not pages, you won't +> be able to use wikilinks to link directly to them, etc. --[[Joey]] diff --git a/doc/forum/screenplay_plugin.mdwn b/doc/forum/screenplay_plugin.mdwn new file mode 100644 index 000000000..5891532f0 --- /dev/null +++ b/doc/forum/screenplay_plugin.mdwn @@ -0,0 +1 @@ +I have a wordstar-style (dot command) screenplay plugin working. How do I know if its good enough or interesting enough to submit to ikiwiki? diff --git a/doc/forum/screenplay_plugin/comment_1_6c353acfc80b972ee3a34c8bb09dede3._comment b/doc/forum/screenplay_plugin/comment_1_6c353acfc80b972ee3a34c8bb09dede3._comment new file mode 100644 index 000000000..fa0762e59 --- /dev/null +++ b/doc/forum/screenplay_plugin/comment_1_6c353acfc80b972ee3a34c8bb09dede3._comment @@ -0,0 +1,8 @@ +[[!comment format=mdwn + username="http://joey.kitenet.net/" + nickname="joey" + subject="comment 1" + date="2011-02-25T18:58:38Z" + content=""" +You can list it on [[plugins/contrib]] right away, and see if people seem interested in it. +"""]] diff --git a/doc/forum/screenplay_plugin/comment_2_1868aeebebefae80531f2031ffba35d3._comment b/doc/forum/screenplay_plugin/comment_2_1868aeebebefae80531f2031ffba35d3._comment new file mode 100644 index 000000000..7168b890f --- /dev/null +++ b/doc/forum/screenplay_plugin/comment_2_1868aeebebefae80531f2031ffba35d3._comment @@ -0,0 +1,8 @@ +[[!comment format=mdwn + username="justint" + ip="24.182.207.250" + subject="thank you" + date="2011-02-27T05:21:54Z" + content=""" +Thank you Joey, I put it up there. Hopefully someone will enjoy it. +"""]] diff --git a/doc/forum/speeding_up_ikiwiki.mdwn b/doc/forum/speeding_up_ikiwiki.mdwn index 0b2164238..799186cf8 100644 --- a/doc/forum/speeding_up_ikiwiki.mdwn +++ b/doc/forum/speeding_up_ikiwiki.mdwn @@ -56,7 +56,7 @@ number is still too large to really visualize: the graphviz PNG and PDF output engines segfault for me, the PS one works but I can't get any PS software to render it without exploding. -Now, the relations in the links hash are not the same thing as IkiWiki's notion of dependencies. 
Can anyone point me at that data structure / where I might be able to add some debugging foo to generate a graph of it? +Now, the relations in the links hash are not the same thing as Ikiwiki's notion of dependencies. Can anyone point me at that data structure / where I might be able to add some debugging foo to generate a graph of it? Once I've figured out that I might be able to optimize some pagespecs. I understand pagespecs are essentially translated into sequential perl code. I @@ -77,15 +77,14 @@ wrapper. > Dependencies go in the `%IkiWiki::depends` hash, which is not exported. It > can also be dumped out as part of the wiki state - see [[tips/inside_dot_ikiwiki]]. > -> It's a map from page name to increasingly complex pagespec, although -> the `optimize-depends` branch in my git repository changes that to a -> map from a page name to a *list* of pagespecs which are automatically -> or'd together for use (this at least means duplicates can be weeded out). -> -> See [[todo/should_optimise_pagespecs]] for more on that. +> Nowadays, it's a hash of pagespecs, and there +> is also a `IkiWiki::depends_simple` hash of simple page names. > > I've been hoping to speed up IkiWiki too - making a lot of photo albums > with my [[plugins/contrib/album]] plugin makes it pretty slow. > > One thing that I found was a big improvement was to use `quick=yes` on all > my `archive=yes` [[ikiwiki/directive/inline]]s. --[[smcv]] + +> Take a look at [[tips/optimising_ikiwiki]] for lots of helpful advice. +> --[[Joey]] diff --git a/doc/forum/square_brackets_inside_backticks_generates_incorrect_html___40__interpreted_as_wikilinks__41____63__.html b/doc/forum/square_brackets_inside_backticks_generates_incorrect_html___40__interpreted_as_wikilinks__41____63__.html new file mode 100644 index 000000000..fe20a05b1 --- /dev/null +++ b/doc/forum/square_brackets_inside_backticks_generates_incorrect_html___40__interpreted_as_wikilinks__41____63__.html @@ -0,0 +1,19 @@ +the following markdown code +`foo \[["bar"]]` generates the following output: + +foo <span class="createlink"><a href="http://euler/~dabd/wiki/ikiwiki.cgi?page=__34__bar__34__&from=foo&do=create" rel="nofollow">?</a>"bar"</span> + +Perhaps this is a bug in the markdown processor? + +<blockquote> +This has nothing to do with markdown; wikilinks and directives +are not part of markdown, and just get expanded to html before +markdown processing. + +There is a bug open about this: +[[bugs/wiki_links_still_processed_inside_code_blocks]] + +Note that escaping the wikilink with a slash will avoid it being expanded +to html. +--[[Joey]] +</blockquote> diff --git a/doc/forum/suppressing_output_of_pages_included_only_for_their_side_effects.mdwn b/doc/forum/suppressing_output_of_pages_included_only_for_their_side_effects.mdwn new file mode 100644 index 000000000..99784cdd2 --- /dev/null +++ b/doc/forum/suppressing_output_of_pages_included_only_for_their_side_effects.mdwn @@ -0,0 +1,16 @@ +In particular, it's kind of annoying that using the sidebar plugin results in the creation of a free-standing sidebar.html (which in the simplest case of course includes a copy of *its own content* as a sidebar). It would be nice if there were a way to tell Ikiwiki to treat a file like sidebar.mdwn as "inline only": allow its content to be inlined but not to render it separately nor allow linking to it. + +In reading through the code and associated docs, it appears that the ideal method is for the file to be removed from the $changed array by plugin's "needsbuild" hook. 
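As a rough, untested sketch (the plugin name and the hard-coded page are made up purely for illustration), such a needsbuild hook might look like this:

    package IkiWiki::Plugin::inlineonly;

    use warnings;
    use strict;
    use IkiWiki 3.00;

    sub import {
        hook(type => "needsbuild", id => "inlineonly", call => \&needsbuild);
    }

    # Drop sidebar.mdwn from the list of changed files ikiwiki is about to
    # render, so no standalone sidebar/index.html gets generated on this pass.
    sub needsbuild {
        my $needsbuild = shift;
        @$needsbuild = grep { $_ ne "sidebar.mdwn" } @$needsbuild;
        return $needsbuild;
    }

    1;

Whether dropping the file here is enough to keep its content usable for inlining is exactly the part I'm unsure about.
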
Either the sidebar plugin could define such a hook, or perhaps a more general solution is the creation of a meta variable or config file regexp that would handle it according to the user's wishes. + +I'm about ready to code up such a change but want to find out if I'm thinking along the right lines. --[[blipvert]] + +> Internal pages should be able to be used for this, as they are used for +> comments. So you'd have +> `sidebar._mdwn`. However, mwdn would need to be changed to register a +> htmlize hook for the `_mdwn` extension for that to really work. +> +> But, if there's no rendered sidebar page, how can users easily edit the page +> in the web interface? In the specific case of the sidebar, It seems +> better to have the page display something different when built standalone +> than when built as the sidebar. +> --[[Joey]] diff --git a/doc/forum/tag_plugin:_rebuilding_autocreated_pages.mdwn b/doc/forum/tag_plugin:_rebuilding_autocreated_pages.mdwn new file mode 100644 index 000000000..114837031 --- /dev/null +++ b/doc/forum/tag_plugin:_rebuilding_autocreated_pages.mdwn @@ -0,0 +1,11 @@ +I have a bunch of tag pages autogenerated by the tag plugin. As part of a redesign for my wiki, I have changed the `autotag.tmpl` template, but IkiWiki refuses to rebuild existing tag pages using the updated template. I understand that existing tag pages are not rebuilt because they have been marked as "created" in `.ikiwiki/indexdb`. Is there a way to purge all tag pages from `indexdb`? --dkobozev + +> Well, you can delete the indexdb and rebuild, and that will do it. +> The tag plugin is careful not to replace existing pages or even recreate +> tag pages if you delete them, which does cause a problem if you need to +> update them. --[[Joey]] + +>> Thanks. I thought about deleting `indexdb`, but hesitated to do that. According to [[tips/inside dot ikiwiki]], `indexdb` stores "all persistent state about pages". I didn't know if it's harmless to lose all persistent state. --dkobozev + +>>> The persistant state is best thought of as a cache, +>>> so it's safe to delete it. --[[Joey]] diff --git a/doc/forum/tags_acting_strangely:_not_picking_up_all_pages__44___not_populating_feeds.mdwn b/doc/forum/tags_acting_strangely:_not_picking_up_all_pages__44___not_populating_feeds.mdwn new file mode 100644 index 000000000..f52486341 --- /dev/null +++ b/doc/forum/tags_acting_strangely:_not_picking_up_all_pages__44___not_populating_feeds.mdwn @@ -0,0 +1,28 @@ +Ok, I'm confused. See http://lovesgoodfood.com/jason/tags/napowrimo/ and +http://lovesgoodfood.com/jason/tags/NaPoWriMo/ for two examples of not +picking up pages quite right. I didn't realize that tags are randomly +case-sensitive while still capitalized in the output title? See the list +of backlinks on each. Also, the only pages actually being ''listed'' are +from a year ago, but the backlinks include current pages. The posts +''are'' being included on http://lovesgoodfood.com/jason/tags/poetry/ . +The feeds are populated on my host, but not on my laptop (Debian +unstable-ish, as opposed to a git pull on my host). + +Halp? I've blown away the old (including .ikiwiki) and rebuilt to no +effect. The tag pages are meant to be transients (loaded by default, +according to the docs?), but they're still being created. Nothing seems +quite correct. 
+ +-- JasonRiedy + +> What's going on with the case sensativity is that ikiwiki is +> case-insensative, but in the edge case where there are somehow two pages +> that vary only in case, it makes at least a token (partial, probably +> incomplete and buggy) gesture at having the case of links to them +> influence which one is linked to. +> +> Possibly this is interacting badly with tag page autocreation when +> different cases are used for a tag? +> +> I don't know why new posts are not showing up in the tags. Can I download +> the source from somewhere? --[[Joey]] diff --git a/doc/forum/tags_acting_strangely:_not_picking_up_all_pages__44___not_populating_feeds/comment_1_ec4ffab10e60510b53660b70908d1bd8._comment b/doc/forum/tags_acting_strangely:_not_picking_up_all_pages__44___not_populating_feeds/comment_1_ec4ffab10e60510b53660b70908d1bd8._comment new file mode 100644 index 000000000..77db9c615 --- /dev/null +++ b/doc/forum/tags_acting_strangely:_not_picking_up_all_pages__44___not_populating_feeds/comment_1_ec4ffab10e60510b53660b70908d1bd8._comment @@ -0,0 +1,14 @@ +[[!comment format=mdwn + username="http://lovesgoodfood.com/jason/" + nickname="Jason Riedy" + subject="oh no..." + date="2011-04-20T17:56:08Z" + content=""" +I just realized I blew away my outward-facing git repos and setup when I blew away the site. augh. It'll take more time than I have to fix that right now. + +Current git repo is over dumb http at http://lovesgoodfood.com/jason.git until I can fix the rest. And two necessary extra plugins are at http://lovesgoodfood.com/htmlpage.pm and http://lovesgoodfood.com/imgcss.pm . Haven't cleaned / documented them enough to contribute. They shouldn't interfere with the tag plugin. I'll be up your way this weekend, except on a super-slow satellite link so won't be able to play much on-line. Might be able to debug locally. + +If you get a change to poke, I'd be grateful, but there's plenty else to do. ;) Morels should be up... + +-- JasonRiedy +"""]] diff --git a/doc/forum/tags_acting_strangely:_not_picking_up_all_pages__44___not_populating_feeds/comment_2_a47884ffd749df980cd62f4c1e3167ce._comment b/doc/forum/tags_acting_strangely:_not_picking_up_all_pages__44___not_populating_feeds/comment_2_a47884ffd749df980cd62f4c1e3167ce._comment new file mode 100644 index 000000000..683c6a59e --- /dev/null +++ b/doc/forum/tags_acting_strangely:_not_picking_up_all_pages__44___not_populating_feeds/comment_2_a47884ffd749df980cd62f4c1e3167ce._comment @@ -0,0 +1,10 @@ +[[!comment format=mdwn + username="http://lovesgoodfood.com/jason/" + nickname="Jason Riedy" + subject="Less painful to clone." 
+ date="2011-04-21T02:34:15Z" + content=""" +http://lovesgoodfood.com/jason/git/JasonsChatter.git + +(and maybe this time I'll remember to save my setup) +"""]] diff --git a/doc/forum/tags_acting_strangely:_not_picking_up_all_pages__44___not_populating_feeds/comment_3_6c4affdbc637946506d0c28a8648dc6e._comment b/doc/forum/tags_acting_strangely:_not_picking_up_all_pages__44___not_populating_feeds/comment_3_6c4affdbc637946506d0c28a8648dc6e._comment new file mode 100644 index 000000000..3f02528b3 --- /dev/null +++ b/doc/forum/tags_acting_strangely:_not_picking_up_all_pages__44___not_populating_feeds/comment_3_6c4affdbc637946506d0c28a8648dc6e._comment @@ -0,0 +1,32 @@ +[[!comment format=mdwn + username="http://joey.kitenet.net/" + nickname="joey" + subject="comment 3" + date="2011-04-30T20:33:59Z" + content=""" +So, I don't see the issue of only one of the two capitalizations of a tag being updated when a page is added. + +<pre> +joey@wren:~/tmp>ikiwiki -plugin html -plugin inline -tagbase tags -plugin goodstuff --refresh -v JasonsChatter JasonsChatter.html +refreshing wiki.. +scanning posts/foo.html +building posts/foo.html +building tags.mdwn, which depends on posts/foo +building sidebar.mdwn, which depends on posts/foo +building posts.mdwn, which depends on posts/foo +building index.mdwn, which depends on sidebar +building archives/2010/04.mdwn, which depends on posts/foo +building archives.mdwn, which depends on archives/2010/04 +building tags/NaPoWriMo.mdwn, which depends on posts/foo +building tags/poetry.mdwn, which depends on posts/foo +building tags/rwp.mdwn, which depends on posts/foo +building tags/napowrimo.mdwn, which depends on posts/foo +done +</pre> + +Both caps of the tag were updated there. I do see some evidence of your site being updated by ikiwiki running with possibly different configuration. Compare date formats used on <http://lovesgoodfood.com/jason/tags/NaPoWriMo/> and <http://lovesgoodfood.com/jason/tags/napowrimo/>. Now, that could just be a different LANG setting, but if the configuration different goes deeper, it could point toward an explanation of the inconsistency of only one case of a tag being updated to list a page.. possibly. + +I can reproduce tag autocreation creating multiple tag pages that differ only in case. That's a bug, fixed in bad5072c02d506b5b5a0d74cd60639f7f62cc7d3. + +AFAICS, you don't have `tag_autocreate_commit` set to false, so transient tags are not being used. +"""]] diff --git a/doc/forum/transition_from_handwritten_html_to_ikiwiki.mdwn b/doc/forum/transition_from_handwritten_html_to_ikiwiki.mdwn new file mode 100644 index 000000000..a8d04a0ad --- /dev/null +++ b/doc/forum/transition_from_handwritten_html_to_ikiwiki.mdwn @@ -0,0 +1,90 @@ +I'm trying to convert hand written html site to ikiwiki and maintain url compatibility. html plugin with indexpages=1 converts all dir_name/index.html correctly to dir_name urls with wiki/css based content, but somedir/somefile.html files are only accessible as somedir/somefile/. Non .html files seem to accessible with their full paths, for example somedir/pic.jpg from hand written html can be accessed by same path under ikiwiki. + +How to make somedir/somefile.html accessible as somedir/somefile.html under ikiwiki? + +Thanks, + +-Mikko + +> Hello! The options you need to investigate are `--usedirs` and +> `--no-usedirs`. The default `--usedirs` takes any source page foo +> (regardless of its format, be it markdown or html) and converts it into a +> destination page foo/index.html (URL foo/). 
By comparison, `--no-usedirs` +> maps the source file onto a destination file directly: src/foo.html becomes +> dest/foo.html, src/bar.mdwn becomes dest/bar.html, etc. +> +> It sounds like you want `--no-usedirs`, or the corresponding `usedirs => 0,` +> option in your setup file. See [[usage]] for more information. -- [[Jon]] + +Thanks, usedirs seems to be just the thing I need. + +-Mikko + +Actually usedirs didn't do exactly what I want. The old site contains both +somedir/index.html and somedir/somename.html files. With html plugin and +indexpages=1 the somedir/index.html pages are accessed correctly but +somedir/somefile.html files not. + +With usedirs => 0, somedir/somename.html pages are accessed correctly but +somedir/index.html pages are not. Actually the handwritten somedir/index.html +files were removed on a rebuild: + + $ ikiwiki -setup blog.setup -rebuild -v + ... + removing test2/index.html, no longer built by test2 + +Is there a way for both index.html and somename.html raw html files to show up through ikiwki? + +-Mikko + +> I think you want usedirs => 0 and indexpages => 0? +> +> What IkiWiki does is to map the source filename to an abstract page name +> (indexpages alters how this is done), then map the abstract page name +> to an output filename (usedirs alters how this is done). +> +> The three columns here are input, abstract page name, output: +> +> usedirs => 0, indexpages => 0: +> a/index.html -> a/index -> a/index.html +> a/b.html -> a/b -> a/b.html +> usedirs => 1, indexpages => 0: +> a/index.html -> a/index -> a/index/index.html +> a/b.html -> a/b -> a/b/index.html +> usedirs => 0, indexpages => 1: +> a/index.html -> a -> a.html +> a/b.html -> a/b -> a/b.html +> usedirs => 1, indexpages => 1: +> a/index.html -> a -> a/index.html +> a/b.html -> a/b -> a/b/index.html +> +> The abstract page name is what you use in wikilinks and pagespecs. +> +> What I would suggest you do instead, though, is break your URLs once +> (but put in Apache redirections), to get everything to be consistent; +> I strongly recommend usedirs => 1 and indexpages => 0, then always +> advertising URLs that look like <http://www.example.com/a/b/>. This is +> what ikiwiki.info itself does, for instance. --[[smcv]] + +Thanks for the explanation. usedirs => 0 and indexpages => 0 does the trick, +but I'll try to setup mod_rewrite from foo/bar.html to foo/bar in the final +conversion. + +-Mikko + +> That's roughly what I do, but you can do it with `Redirect` and `RedirectMatch` from `mod_alias`, rather than fire up rewrite. Mind you I don't write a generic rule, I have a finite set of pages to redirect which I know. -- [[Jon]] + +I'm getting closer. Now with usedirs => 1 and raw html pages, ikiwiki transforms foo/index.html to foo/index/index.html. +Can ikiwiki be instructed map foo/index.html to page foo instead that foo/index? + +-Mikko + +> If you don't already have a foo.html in your source, why not just rename foo/index.html to foo.html? With usedirs, it will then map to foo/index.html. Before, you had 'foo/' and 'foo/index.html' as working URLS, and they will work after too. +> +> If you did have a foo.html and a foo/index.html, hmm, that's a tricky one. -- [[Jon]] + +> We may be going round in circles - that's what indexpages => 1 does :-) +> See the table I constructed above, which explains the mapping from input +> files to abstract page names, and then the mapping from abstract page +> names to output files. 
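The same mapping, written out as a small Perl sketch rather than a table (a paraphrase of the table above, not ikiwiki's actual code):

    # source filename -> abstract page name (what wikilinks and pagespecs use)
    sub abstract_page {
        my ($src, $indexpages) = @_;
        (my $page = $src) =~ s/\.(?:mdwn|html)$//;   # strip the source extension
        $page =~ s{/index$}{} if $indexpages;        # a/index collapses to a
        return $page;
    }

    # abstract page name -> output filename
    sub output_file {
        my ($page, $usedirs) = @_;
        return $usedirs ? "$page/index.html" : "$page.html";
    }

    # e.g. abstract_page("a/index.html", 0) is "a/index",
    # and output_file("a/index", 1) is "a/index/index.html"
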
(I personally think that moving your source pages +> around like Jon suggested is a better solution, though. --[[smcv]] diff --git a/doc/forum/understanding_filter_hooks.mdwn b/doc/forum/understanding_filter_hooks.mdwn index 061d6d295..e6ddc91cc 100644 --- a/doc/forum/understanding_filter_hooks.mdwn +++ b/doc/forum/understanding_filter_hooks.mdwn @@ -7,3 +7,11 @@ but right now I have to have a look at the content, which I don't like so much. Is there a better hook to use for this? I need to transform the input before preprocessing. [[DavidBremner]] + +>You can check the type of the page without having to look at the content of the page: + + my $page_file=$pagesources{$page}; + my $page_type=pagetype($page_file); + +>Then you can check whether `$page_type` is "tex". +>--[[KathrynAndersen]] diff --git a/doc/forum/upgrade_steps.mdwn b/doc/forum/upgrade_steps.mdwn new file mode 100644 index 000000000..1c85e6402 --- /dev/null +++ b/doc/forum/upgrade_steps.mdwn @@ -0,0 +1,147 @@ +[[!meta date="2007-08-27 21:52:18 +0000"]] + +I upgrades from 1.40 to 2.6.1. I ran "ikiwiki --setup" using my existing ikiwiki.setup configuration. +I had many errors like: + + /home/bsdwiki/www/wiki/wikilink/index.html independently created, not overwriting with version from wikilink + BEGIN failed--compilation aborted at (eval 5) line 129. + +and: + + failed renaming /home/bsdwiki/www/wiki/smileys.ikiwiki-new to /home/bsdwiki/www/wiki/smileys: Is a directory + BEGIN failed--compilation aborted at (eval 5) line 129. + +Probably about six errors like this. I worked around this by removing the files and directories it complained about. +Finally it finished. + +> As of version 2.0, ikiwiki enables usedirs by default. See +> [[tips/switching_to_usedirs]] for details. --[[Joey]] + +>> I read the config wrong. I was thinking that it showed the defaults even though commented out +>> (like ssh configs do). I fixed that part. --JeremyReed + +My next problem was that ikiwiki start letting me edit without any password authentication. It used to prompt +me for a password but now just goes right into the "editing" mode. +The release notes for 2.0 say password auth is still on by default. + +> It sounds like you have the anonok plugin enabled? + +>> Where is the default documented? My config doesn't have it uncommented. + +The third problem is that when editing my textbox is empty -- no content. + +This is using my custom rcs.pm which has been used thousands of times. + +> Have you rebuilt the cgi wrapper since you upgraded ikiwiki? AFAIK I +> fixed a bug that could result in the edit box always being empty back in +> version 2.3. The only other way it could happen is if ikiwiki does not +> have saved state about the page that it's editing (in .ikiwiki/index). + +>> Rebuilt it several times. Now that I think of it, I think my early problem of having +>> no content in the textbox was before I rebuilt the cgi. And after I rebuilt the whole webpage was empty. + +Now I regenerated my ikiwiki.cgi again (no change to my configuration, +and I just get an empty HTML page when attempting editing or "create". + +> If the page is completly empty then ikiwiki is crashing before it can +> output anything, though this seems unlikely. Check the webserver logs. + +Now I see it created directories for my data. I fixed that by setting +usedirs (I see that is in the release notes for 2.0) and rerunning ikiwiki --setup +but I still have empty pages for editing (no textbox no html at all). + +> Is IkiWiki crashing? 
If so, it would probably leave error text in the apache logs. --[[TaylorKillian]] + +>> Not using apache. Nothing useful in logs other thn the HTTP return codes are "0" and bytes is "-" +>> on the empty ikiwiki.cgi output (should say " 200 " followed by bytes). + +>>> You need to either figure out what your web server does with stderr +>>> from cgi programs, or run ikiwiki.cgi at the command line with an +>>> appropriate environment so it thinks it's being called from a web +>>> server, so you can see how it's failing. --[[Joey]] + +(I am posting this now, but will do some research and post some more.) + +Is there any webpage with upgrade steps? + +> Users are expected to read [[news]], which points out any incompatible +> changes or cases where manual action is needed. + +>> I read it but read the usedirs option wrong :(. +>> Also it appears to be missing the news from between 1.40 to 2.0 unless they dont' exist. +>> If they do exist maybe they have release notes I need? + +>>> All the old ones are in the NEWS file. --[[Joey]] + +--JeremyReed + +My followup: I used a new ikiwiki.setup based on the latest version. But no changes for me. + +Also I forgot to mention that do=recentchanges works good for me. It uses my +rcs_recentchanges in my rcs perl module. + +The do=prefs does nothing though -- just a blank webpage. + +> You need to figure out why ikiwiki is crashing. The webserver logs should +> tell you. + +I also set verbose => 1 and running ikiwiki --setup was verbose, but no changes in running CGI. +I was hoping for some output. + +I am guessing that my rcs perl module stopped working on the upgrade. I didn't notice any release notes +on changes to revision control modules. Has something changed? I will also look. + +> No, the rcs interface has not needed to change in a long time. Also, +> nothing is done with the rcs for do=prefs. + +>> Thanks. I also checked differences between 1.40 Rcs plugins and didn't notice anything significant. + +--JeremyReed + +Another Followup: I created a new ikiwiki configuration and did the --setup to +create an entirely different website. I have same problem there. No prompt for password +and empty webpage when using the cgi. +I never upgraded any perl modules so maybe a new perl module is required but I don't see any errors so I don't know. + +The only errors I see when building and installing ikiwiki are: + + Can't exec "otl2html": No such file or directory at IkiWiki/Plugin/otl.pm line 66. + + gettext 0.14 too old, not updating the pot file + +I don't use GNU gettext on here. + +I may need to revert back to my old ikiwiki install which has been used to thousands of times (with around +1000 rcs commits via ikiwiki). + +--JeremyReed + +I downgraded to version 1.40 (that was what I had before I wrote wrong above). +Now ikiwiki is working for me again (but using 1.40). I shouldn't have tested on production system :) + +--JeremyReed + +I am back. On a different system, I installed ikiwiki 2.6.1. Same problem -- blank CGI webpage. + +So I manually ran with: + + REQUEST_METHOD=GET QUERY_STRING='do=create&page=jcr' kiwiki.cgi + +And clearly saw the error: + + [IkiWiki::main] Fatal: Bad template engine CGI::FormBuilder::Template::div: Can't locate CGI/FormBuilder/Template/div.pm + +So I found my version was too old and 3.05 is the first to provide "Div" support. I upgraded my p5-CGI-FormBuilder to 3.0501. +And ikiwiki CGI started working for me. + +The Ikiwiki docs about this requirement got removed in Revision 4367. 
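A quick way to check whether the installed CGI::FormBuilder is new enough -- 3.05 or later, per the error above -- is a generic Perl one-liner (a sketch, not a command from the original thread):

    perl -MCGI::FormBuilder -le 'print $CGI::FormBuilder::VERSION'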
There should be a page that lists the requirements. +(I guess I could have used the debian/control file.) + +> There is a page, [[install]] documents that 3.05 is needed. + +>> Sorry, I missed that. With hundreds of wikipages it is hard to read all of them. +>> I am updating the download page now to link to it. + +I am now using ikiwiki 2.6.1 on my testing system. + +--JeremyReed diff --git a/doc/forum/use_php-markdown-extra_with_ikiwiki__63__.mdwn b/doc/forum/use_php-markdown-extra_with_ikiwiki__63__.mdwn new file mode 100644 index 000000000..86ed70fd2 --- /dev/null +++ b/doc/forum/use_php-markdown-extra_with_ikiwiki__63__.mdwn @@ -0,0 +1,3 @@ +Is it possible to use php-markdown-extra with ikiwiki? + +Thanks. diff --git a/doc/forum/use_php-markdown-extra_with_ikiwiki__63__/comment_1_66d48218361caa4c07bd714b82ed0021._comment b/doc/forum/use_php-markdown-extra_with_ikiwiki__63__/comment_1_66d48218361caa4c07bd714b82ed0021._comment new file mode 100644 index 000000000..af60ecbdb --- /dev/null +++ b/doc/forum/use_php-markdown-extra_with_ikiwiki__63__/comment_1_66d48218361caa4c07bd714b82ed0021._comment @@ -0,0 +1,8 @@ +[[!comment format=mdwn + username="http://kerravonsen.dreamwidth.org/" + ip="60.241.8.244" + subject="PHP != Perl" + date="2010-07-10T12:44:15Z" + content=""" +Er, why? IkiWiki is written in Perl. Presumably php-markdown-extra is written in PHP, which is a completely different language. +"""]] diff --git a/doc/forum/use_php-markdown-extra_with_ikiwiki__63__/comment_2_f2ee0a4dce571d329f795e52139084c0._comment b/doc/forum/use_php-markdown-extra_with_ikiwiki__63__/comment_2_f2ee0a4dce571d329f795e52139084c0._comment new file mode 100644 index 000000000..ce60f4b3a --- /dev/null +++ b/doc/forum/use_php-markdown-extra_with_ikiwiki__63__/comment_2_f2ee0a4dce571d329f795e52139084c0._comment @@ -0,0 +1,8 @@ +[[!comment format=mdwn + username="https://www.google.com/accounts/o8/id?id=AItOawlzADDUvepOXauF4Aq1VZ4rJaW_Dwrl6xE" + nickname="Dário" + subject="comment 2" + date="2010-07-10T21:48:13Z" + content=""" +Because php-markdown-extra extends the basic markdown language (footnotes, etc.) +"""]] diff --git a/doc/forum/use_php-markdown-extra_with_ikiwiki__63__/comment_3_e388714f457ccb6ef73630179914558c._comment b/doc/forum/use_php-markdown-extra_with_ikiwiki__63__/comment_3_e388714f457ccb6ef73630179914558c._comment new file mode 100644 index 000000000..72ce7bb6f --- /dev/null +++ b/doc/forum/use_php-markdown-extra_with_ikiwiki__63__/comment_3_e388714f457ccb6ef73630179914558c._comment @@ -0,0 +1,9 @@ +[[!comment format=mdwn + username="http://kerravonsen.dreamwidth.org/" + ip="202.173.183.92" + subject="I still don't get it" + date="2010-07-11T07:18:35Z" + content=""" +But if you need the \"extra\" features of Markdown, all you have to do is turn on the \"multimarkdown\" option in your configuration. It makes no sense to try to use PHP with Perl. + +"""]] diff --git a/doc/forum/using_l10n__39__d_basewiki.mdwn b/doc/forum/using_l10n__39__d_basewiki.mdwn new file mode 100644 index 000000000..a361d18c9 --- /dev/null +++ b/doc/forum/using_l10n__39__d_basewiki.mdwn @@ -0,0 +1,7 @@ +Hey there! + +I'm trying to get the translated version of basewiki activated in my wiki. Setting "locale => 'de_DE.UTF-8'" gave me some german messages on the CLI and a few changes in the wiki itself but the basewiki is still english. The files in /usr/share/ikiwiki/po/de/ are there. + +As I understand, [[plugins/po]] is just for translating. + +So, what am I doing wrong? 
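The comments below point to the po plugin; a hedged sketch of what enabling it might look like in the setup file follows. The option names `po_master_language` and `po_slave_languages` come from that plugin, but their exact value format has changed between ikiwiki versions, so check [[plugins/po]] rather than copying this verbatim:

    # sketch only -- verify the value syntax against the po plugin documentation
    add_plugins => [qw{po}],
    po_master_language => 'en|English',
    po_slave_languages => ['de|Deutsch'],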
diff --git a/doc/forum/using_l10n__39__d_basewiki/comment_1_eaab671848ee6129f6fe9399474eeac0._comment b/doc/forum/using_l10n__39__d_basewiki/comment_1_eaab671848ee6129f6fe9399474eeac0._comment new file mode 100644 index 000000000..1f21b485b --- /dev/null +++ b/doc/forum/using_l10n__39__d_basewiki/comment_1_eaab671848ee6129f6fe9399474eeac0._comment @@ -0,0 +1,8 @@ +[[!comment format=mdwn + username="http://joey.kitenet.net/" + nickname="joey" + subject="comment 1" + date="2010-12-05T20:12:17Z" + content=""" +The translated basewiki depends on the po plugin being enabled and configured with the language(s) to use. +"""]] diff --git a/doc/forum/using_l10n__39__d_basewiki/comment_2_d907676a1db1210ca59506673c564359._comment b/doc/forum/using_l10n__39__d_basewiki/comment_2_d907676a1db1210ca59506673c564359._comment new file mode 100644 index 000000000..c8d1e4e04 --- /dev/null +++ b/doc/forum/using_l10n__39__d_basewiki/comment_2_d907676a1db1210ca59506673c564359._comment @@ -0,0 +1,10 @@ +[[!comment format=mdwn + username="http://xlogon.net/bacuh" + ip="93.182.190.4" + subject="comment 2" + date="2010-12-05T22:48:53Z" + content=""" +This works, thanks. + +But is there also a way to get \"Edit\" etc. and the buttons behind it translated? +"""]] diff --git a/doc/forum/using_l10n__39__d_basewiki/comment_3_5e9d5bc5ecaf63f9bfe3315b09a279aa._comment b/doc/forum/using_l10n__39__d_basewiki/comment_3_5e9d5bc5ecaf63f9bfe3315b09a279aa._comment new file mode 100644 index 000000000..f72bb37af --- /dev/null +++ b/doc/forum/using_l10n__39__d_basewiki/comment_3_5e9d5bc5ecaf63f9bfe3315b09a279aa._comment @@ -0,0 +1,10 @@ +[[!comment format=mdwn + username="http://joey.kitenet.net/" + nickname="joey" + subject="comment 3" + date="2010-12-05T22:53:12Z" + content=""" +That requires translating the templates, which has never quite been finished. [[todo/l10n]] discusses that. + +(You can edit the templates yourself of course and manually translate.) +"""]] diff --git a/doc/forum/using_svn+ssh_with_ikiwiki.mdwn b/doc/forum/using_svn+ssh_with_ikiwiki.mdwn new file mode 100644 index 000000000..8d9c27e46 --- /dev/null +++ b/doc/forum/using_svn+ssh_with_ikiwiki.mdwn @@ -0,0 +1,11 @@ +Just as an experiment, I tried running ikiwiki using a remote repository, i.e. via "svn+ssh". After setting up the repo and relocating the working copy, unfortunately, it doesn't work; editing a page gives the error: + +> Error: no element found at line 3, column 0, byte 28 at /opt/local/lib/perl5/vendor_perl/5.10.1/darwin-multi-2level/XML/Parser.pm line 187 + +I think this is because, despite a SetEnv directive in the apache configuration, the CGI wrapper is expunging SVN_SSH from the environment (based on perusing the source of Wrapper.pm and looking at "envsave" there at the top). Is this the case? --[[Glenn|geychaner@mac.com]] + +> That seems likely. You can edit Wrapper.pm and add SVN_SSH to the @envsave list and rebuild your wrappers to test it. --Joey + +A better way(?) would be to add a plugin to set the SVN_SSH variable at the appropriate moment (or even to add this to the SVN plugin). What kind of hook should this be; it needs to run just *after* the CGI script cleans its environment? --[[Glenn|geychaner@mac.com]] + +Actually, this probably doesn't need to be a plugin; setting SVN_SSH in ENV can probably be done through the setup file. (Right?) 
--[[Glenn|geychaner@mac.com]] diff --git a/doc/forum/utf8_warnings_for___34____92__xAB__34__.mdwn b/doc/forum/utf8_warnings_for___34____92__xAB__34__.mdwn new file mode 100644 index 000000000..72f2d38e0 --- /dev/null +++ b/doc/forum/utf8_warnings_for___34____92__xAB__34__.mdwn @@ -0,0 +1,47 @@ +# getting Warnings about UTF8-Chars. + +I'm getting multiple warnings: + + utf8 "\xAB" does not map to Unicode at /usr/share/perl5/IkiWiki.pm line 774, <$in> chunk 1. + + +I'm assuming this is once per File, but even in verbose mode, it doesn't tell me which file is a problem. +It first reads all the files, and afterwards when parsing/compiling them, it outputs the warning, so I can't +deduce the offending files. + +Is there a way to have ikiwiki output the position, where it encounters the character? + +Probably all this has to do with locale-settings, and usage of mixed locales in a distributed setup ... +I'd rather cleanup some of the file(name)s of unexpected characters. --[[jwalzer]] + +-------- + +**Update** : So I took the chance to insert debug into ikiwiki.pm: + + root@novalis:/usr/share/perl5# diff -p /tmp/IkiWiki.orig.pm IkiWiki.pm + *** /tmp/IkiWiki.orig.pm Sun Feb 14 15:16:08 2010 + --- IkiWiki.pm Sun Feb 14 15:16:28 2010 + *************** sub readfile ($;$$) { + *** 768,773 **** + --- 768,774 ---- + } + + local $/=undef; + + debug("opening File: $file:"); + open (my $in, "<", $file) || error("failed to read $file: $!"); + binmode($in) if ($binary); + return \*$in if $wantfd; + + +But what I see now is not quite helpful, as it seems, STDERR and DEBUG are asyncronous, so they mix up in a way, that I can't really see, whats the problem ... Maybe I'm better off for troubleshooting, to insert an printf to strerr to have it in the same stream.. --[[jwalzer]] + + +---- + +**Update:** The "print STDERR $file;"-Trick did it .. I was able to find a mdwn-file, that (was generated by a script of me) had \0xAB in it. + +Nevertheless I still wonder if this should be a problem. This character happend to be in an *\[\[meta title='$CHAR'\]\]-tag* and an *\[$CHAR\]http://foo)-Link* + +Should this throw an warning? Maybe this warning could be catched an reported inclusively the containing filename? maybe even with an override, if one knows that it is correct that way? --[[jwalzer]] + +[[!tag solved]] diff --git a/doc/forum/web_service_API__44___fastcgi_support.mdwn b/doc/forum/web_service_API__44___fastcgi_support.mdwn new file mode 100644 index 000000000..4a78fb932 --- /dev/null +++ b/doc/forum/web_service_API__44___fastcgi_support.mdwn @@ -0,0 +1,13 @@ +This is a half-baked thought of mine so I thought I would post it in forum for discussion. + +There are some things that ikiwiki.cgi is asked to do which do not involve changing the repository: these include form generation, handling logins, the "goto" from [[recentchanges]], edit previews, etc. + +For one thing I am working on slowly ([[todo/interactive todo lists]]), I've hit a situation where I am likely to need to implement doing markup evaluation for a subset of a page. The problem I face is, if a user edits content in the browser, markup, ikiwiki directives etc. need to be expanded. I could possibly do this with a round-trip through edit preview, but that would be for the whole content of a page, and I hit the problem with editing a list item. + +> (slight addendum on this front. 
I'm planning to split the javascript code for interactive todo lists into two parts: one for handling round trips of content to and from ikiwiki.cgi, and the various failure modes that might occur (permission denied, edit conflicts, login required, etc.) ; then the list-specific stuff can build on top of this. The first chunk might be reusable by others for other AJAXY-edit fu.) + +Anyway - I've realised that a big part of the interactive todo lists stuff is trying to handle round trips to ikiwiki.cgi through javascript. A web services API would make handling the various conditions of this easier (e.g. need to login, login failed, etc.). I'm not sure what else might benefit from a web services API and I have no real experience of either using or writing them so I don't know what pros/cons there are for REST vs SOAP etc. + +Second, and in a way related, I've been mooting hacking fastcgi support into ikiwiki. Essentially one ikiwiki.cgi process would persist and serve CGI-ish requests on stdin/stdout. The initial content-scanning and dependency generation would happen once and not need to be repeated for future requests. Although, all state-changing operations would need to be careful to ensure the in-memory models were accurate. Also, I don't know how suited the data structures would be for persistence, since the current model is build em up, throw em away, they might not be space-efficient enough for persistence. + +If I did attempt this, I would want to avoid restructuring things in a way which would impair ikiwiki's core philosophy of being a static compiler. -- [[Jon]] diff --git a/doc/forum/what_is_the_easiest_way_to_implement_order:_disallow_all__44___allow_chosen__95__few_page_editing_policy__63__.mdwn b/doc/forum/what_is_the_easiest_way_to_implement_order:_disallow_all__44___allow_chosen__95__few_page_editing_policy__63__.mdwn new file mode 100644 index 000000000..fbc5c58e2 --- /dev/null +++ b/doc/forum/what_is_the_easiest_way_to_implement_order:_disallow_all__44___allow_chosen__95__few_page_editing_policy__63__.mdwn @@ -0,0 +1,3 @@ +As in title, I'd like to allow editing only some pages on my wiki. Rest by default is not editable by users except admin. Thanks + +> See [[plugins/lockedit]]. --[[schmonz]] diff --git a/doc/forum/where_are_the_tags.mdwn b/doc/forum/where_are_the_tags.mdwn new file mode 100644 index 000000000..ecb49fe43 --- /dev/null +++ b/doc/forum/where_are_the_tags.mdwn @@ -0,0 +1,9 @@ +Where is the tag cloud/tag listing of all the tags used in this wiki? I know we +have tags enabled. --[[jerojasro]] + +> This wiki does not use one global toplevel set of tags (`tagbase` is not +> set). +> +> There are tags used for the [[plugins]], and a tag cloud of those +> there. [[wishlist]] and [[patch]] are tags too, but I don't see the point +> of a tag cloud for such tags. --[[Joey]] diff --git a/doc/forum/where_are_the_tags/comment_1_6a559c3bfe72011c45b006d33176da3d._comment b/doc/forum/where_are_the_tags/comment_1_6a559c3bfe72011c45b006d33176da3d._comment new file mode 100644 index 000000000..6878a7af8 --- /dev/null +++ b/doc/forum/where_are_the_tags/comment_1_6a559c3bfe72011c45b006d33176da3d._comment @@ -0,0 +1,14 @@ +[[!comment format=mdwn + username="http://intranetsdoneright.blogspot.com/" + ip="85.127.82.246" + subject="Tag cloud or list of tags possible?" + date="2011-04-30T06:10:14Z" + content=""" +I want to create a helpful welcome page http://pyjs.org/wiki/ mainly by using links or listings of what is called \"special pages\" in MediaWiki. 
(Currently, we have a list of all wiki articles, this frightens new readers.) + +Can I somehow list all tags of the wiki, or is there a tag cloud? +How can I list the first level of subpages (not the pages, just the \"hierarchy\" level or so)? +What other \"special\" lists or so can ikiwiki generate (e.g. users, tags, changes, ...)? + +Thanks for any help or directions! +"""]] diff --git a/doc/forum/wiki_clones_on_dynamic_IPs.mdwn b/doc/forum/wiki_clones_on_dynamic_IPs.mdwn new file mode 100644 index 000000000..f69f6501e --- /dev/null +++ b/doc/forum/wiki_clones_on_dynamic_IPs.mdwn @@ -0,0 +1,10 @@ +OK, this is not really a ikiwiki problem... but ikiwiki makes wiki clones +really easy to setup, and this is related to having a website cloned at +different places and pulling from the servers which are online. + +My setup is like this: I have a server at home and another at my dorm +which will serve as a wiki clone. Each of them has a dynamic IP with DynDNS +set up. Now the problem lies in linking my domain to these two DynDNS addresses. +Multiple CNAMEs are not supported, and I don't know if there's any utility +which can update the A records on the DNS to point directly point to the +two separate IPs. diff --git a/doc/forum/wiki_name_in_page_titles.mdwn b/doc/forum/wiki_name_in_page_titles.mdwn index 01ff8d817..4e9e51835 100644 --- a/doc/forum/wiki_name_in_page_titles.mdwn +++ b/doc/forum/wiki_name_in_page_titles.mdwn @@ -23,4 +23,10 @@ that provides a `IS_HOMEPAGE` template variable? --[[JasonBlevins]] >> few other small plugins brewing so I'll try to put up some contrib >> pages for them soon. --[[JasonBlevins]] -[path]: http://code.jblevins.org/ikiwiki/plugins.git/plain/path.pm +[path]: http://jblevins.org/git/ikiwiki/plugins.git/plain/path.pm + +> I used the following trick in some page.tmpl: +> +> <title><TMPL_VAR WIKINAME><TMPL_IF NAME="PARENTLINKS">: <TMPL_VAR TITLE></TMPL_IF></title> +> +> --[[JeanPrivat]] diff --git a/doc/forum/wishlist-discussion:_Editformular_showing_existing_tags.mdwn b/doc/forum/wishlist-discussion:_Editformular_showing_existing_tags.mdwn new file mode 100644 index 000000000..49c55e20e --- /dev/null +++ b/doc/forum/wishlist-discussion:_Editformular_showing_existing_tags.mdwn @@ -0,0 +1,15 @@ +# How about: + +having a list of all existing tags in the Edit-Formular as a selectionbox? + +Assume I have tagbase=/tags/ and for every tag I have given to articles an existing page there. + +Would it be possible to list all these tags together with the Formular, as selectionbox. +Maybe even with parsing of the content and preselecting the tags, that are given in the article and vice-versa when selecting the fields then also generating the \[\[\!tag\]\]-sourcecode ? + +this would need a bit JS-work and somehow on compiletime we need to put the list of tags somewhere, where the cgi could read them from. +This way, even a pagespec would suffice to determine the usable list of tags and not only the tagbase-variable. + +> I think this would be very hard to achieve with the current tag plugin, due to the nature of its implementation. +> +> I've had a "tag2" plugin on the go for a while which supports this. It's in a very rough stage but I'll try to find it and upload it somewhere. 
-- [[Jon]] diff --git a/doc/forum/wishlist:_Allow_OpenID_users_to_set_a_display_name.mdwn b/doc/forum/wishlist:_Allow_OpenID_users_to_set_a_display_name.mdwn new file mode 100644 index 000000000..2551ef75e --- /dev/null +++ b/doc/forum/wishlist:_Allow_OpenID_users_to_set_a_display_name.mdwn @@ -0,0 +1,3 @@ +Preferences should offer me a way to define my display name. This name should, with proper markings, be used in git commits, etc. -- RichiH + +> I.e. care should be taken that I don't simply change it to joeyh@kitenet.net and start writing crap. For example, the display name could be displayed before the OpenID string, by default. Which would be worthwhile change anyway, no matter if you will support display names. -- RichiH diff --git a/doc/forum/wishlist:_support_staging_area.mdwn b/doc/forum/wishlist:_support_staging_area.mdwn new file mode 100644 index 000000000..7628ad00a --- /dev/null +++ b/doc/forum/wishlist:_support_staging_area.mdwn @@ -0,0 +1,12 @@ +It would be nice if ikiwiki had built-in support for a staging area. + +Default branch name would be *staging*, default result path would be *staging* as well. + +Thus, if I push into the staging branch, it should automagically create a full copy of the site in staging/ . + +This would make playing with ikiwiki easier while not cluttering the main branch and site. + +A simple .htaccess would take care of keeping the staging are limited to people who should be allowed to access it. + + +Richard diff --git a/doc/freesoftware.mdwn b/doc/freesoftware.mdwn index 7ac1ac6b4..2243d9b1f 100644 --- a/doc/freesoftware.mdwn +++ b/doc/freesoftware.mdwn @@ -4,7 +4,7 @@ ikiwiki, and this documentation wiki, are licensed under the terms of the GNU [[GPL]], version 2 or later. The parts of ikiwiki that become part of your own wiki (the [[basewiki]] -pages (but not the smilies) and the [[templates|wikitemplates]]) are licensed +pages (but not the smilies) and the [[templates]]) are licensed as follows: > Redistribution and use in source and compiled forms, with or without diff --git a/doc/git.mdwn b/doc/git.mdwn index a3b56c682..ebea400ee 100644 --- a/doc/git.mdwn +++ b/doc/git.mdwn @@ -26,7 +26,8 @@ be browsed, subscribed to etc on its You are of course free to set up your own ikiwiki git repository with your own [[patches|patch]]. If you list it here, the `gitremotes` script will automatically add it to git remotes. Your repo will automatically be pulled -into [[Joey]]'s working tree. This is recommended. :-) +into [[Joey]]'s working repository where he can see your branches and +think about merging them. This is recommended. :-) <!-- Machine-parsed format: * wikilink <git:url> --> @@ -36,6 +37,7 @@ into [[Joey]]'s working tree. This is recommended. :-) * l10n `git://l10n.ikiwiki.info/` Open push localization branch used for <http://l10n.ikiwiki.info/> * [[smcv]] `git://git.pseudorandom.co.uk/git/smcv/ikiwiki.git` + ([browse](http://git.pseudorandom.co.uk/smcv/ikiwiki.git)) * [[intrigeri]] `git://gaffer.ptitcanardnoir.org/ikiwiki.git` * [[gmcmanus]] `git://github.com/gmcmanus/ikiwiki.git` * [[jelmer]] `git://git.samba.org/jelmer/ikiwiki.git` @@ -51,23 +53,28 @@ into [[Joey]]'s working tree. This is recommended. 
:-) * [[schmonz]] `git://github.com/schmonz/ikiwiki.git` * [[will]] `http://www.cse.unsw.edu.au/~willu/ikiwiki.git` * [[kaizer]] `git://github.com/engla/ikiwiki.git` +* [[bbb]] `http://git.boulgour.com/bbb/ikiwiki.git` +* [[KathrynAndersen]] `git://github.com/rubykat/ikiplugins.git` +* [[ktf]] `git://github.com/ktf/ikiwiki.git` +* [[tove]] `git://github.com/tove/ikiwiki.git` +* [[GiuseppeBilotta]] `git://git.oblomov.eu/ikiwiki` +* [[roktas]] `git://github.com/roktas/ikiwiki.git` +* [[davrieb|David_Riebenbauer]] `git://git.liegesta.at/git/ikiwiki` + ([browse](http://git.liegesta.at/?p=ikiwiki.git;a=summary)) +* [[GustafThorslund]] `http://gustaf.thorslund.org/src/ikiwiki.git` +* [[peteg]] `git://git.hcoop.net/git/peteg/ikiwiki.git` +* [[privat]] `git://github.com/privat/ikiwiki.git` +* [[blipvert]] `git://github.com/blipvert/ikiwiki.git` +* [[bzed|BerndZeimetz]] `git://git.recluse.de/users/bzed/ikiwiki.git` +* [[wtk]] `git://github.com/wking/ikiwiki.git` +* [[sunny256]] `git://github.com/sunny256/ikiwiki.git` +* [[fmarier]] `git://gitorious.org/~fmarier/ikiwiki/fmarier-sandbox.git` +* [[levitte]] `git://github.com/levitte/ikiwiki.git` +* jo `git://git.debian.org/users/jo-guest/ikiwiki.git` + ([browse](http://git.debian.org/?p=users/jo-guest/ikiwiki.git;a=summary)) +* [[timonator]] `git://github.com/timo/ikiwiki.git` ## branches -Some of the branches included in the main repository include: +Current branches of ikiwiki are listed on [[branches]]. -* `gallery` contains the [[todo/Gallery]] plugin. It's not yet merged - due to license issues. Also some bits need to be tweaked to make it - work with the current *master* branch again. -* `html` is an unfinished attempt at making ikiwiki output HTML 4.01 - instead of xhtml. -* `wikiwyg` adds [[todo/wikiwyg]] support. It is unmerged pending some - changes. -* `debian-stable` is used for updates to the old version included in - Debian's stable release, and `debian-testing` is used for updates to - Debian's testing release. (These and similar branches will be rebased.) -* `ignore` gets various branches merged to it that Joey wishes to ignore - when looking at everyone's unmerged changes. -* `pristine-tar` contains deltas that - [pristine-tar](http://kitenet.net/~joey/code/pristine-tar) - can use to recreate released tarballs of ikiwiki diff --git a/doc/ikiwiki-calendar.mdwn b/doc/ikiwiki-calendar.mdwn index e2cc612f3..03cbdd86c 100644 --- a/doc/ikiwiki-calendar.mdwn +++ b/doc/ikiwiki-calendar.mdwn @@ -4,7 +4,7 @@ ikiwiki-calendar - create calendar archive pages # SYNOPSIS -ikiwiki-calendar [-f] your.setup [pagespec] [year] +ikiwiki-calendar [-f] your.setup [pagespec] [startyear [endyear]] # DESCRIPTION @@ -16,19 +16,21 @@ You must specify the setup file for your wiki. The pages will be created inside its `srcdir`, beneath the `archivebase` directory used by the calendar plugin (default "archives"). -You will probably want to specify a [[ikiwiki/PageSpec]] -to control which pages are included on the calendars. The -default is all pages. To limit it to only posts in a blog, +To control which pages are included on the calendars, +a [[ikiwiki/PageSpec]] can be specified. The default is +all pages, or the pages specified by the `comments_pagespec` +setting in the config file. A pagespec can also be specified +on the command line. To limit it to only posts in a blog, use something like "posts/* and !*/Discussion". It defaults to creating calendar pages for the current -year, as well as the previous year, and the next year. 
-If you specify a year, it will create pages for that year. +year. If you specify a year, it will create pages for that year. +Specify a second year to create pages for a span of years. Existing pages will not be overwritten by this command by default. Use the `-f` switch to force it to overwrite any existing pages. -## CRONTAB +# CRONTAB While this command only needs to be run once a year to update the archive pages for each new year, you are recommended to set up @@ -41,7 +43,7 @@ An example crontab: # TEMPLATES -This command uses two [[template|wikitemplates]] to generate +This command uses two [[templates]] to generate the pages, `calendarmonth.tmpl` and `calendaryear.tmpl`. # AUTHOR diff --git a/doc/ikiwiki-makerepo.mdwn b/doc/ikiwiki-makerepo.mdwn index 9e742c211..acb1211de 100644 --- a/doc/ikiwiki-makerepo.mdwn +++ b/doc/ikiwiki-makerepo.mdwn @@ -24,6 +24,14 @@ ikiwiki wrapper. Note that for monotone, you are assumed to already have run "mtn genkey" to generate a key. + +# EXAMPLE + +`ikiwiki-makerepo git /var/www/wiki /home/user/wiki/` + +The above command creates a new git repo in /home/user/wiki as well as a new git repo in the /var/www/wiki directory. +It then initializes the /home/user/wiki git repo and makes the /var/www/wiki a clone. + # AUTHOR Joey Hess <joey@ikiwiki.info> diff --git a/doc/ikiwiki-makerepo/discussion.mdwn b/doc/ikiwiki-makerepo/discussion.mdwn new file mode 100644 index 000000000..660823fbb --- /dev/null +++ b/doc/ikiwiki-makerepo/discussion.mdwn @@ -0,0 +1 @@ +Not sure how well I described the example. :-/ -- Jeremiah diff --git a/doc/ikiwiki.mdwn b/doc/ikiwiki.mdwn index e0a971d96..4d840696c 100644 --- a/doc/ikiwiki.mdwn +++ b/doc/ikiwiki.mdwn @@ -14,3 +14,4 @@ Some documentation on using ikiwiki: * [[ikiwiki/markdown]] * [[ikiwiki/openid]] * [[ikiwiki/searching]] +* [[templates]] diff --git a/doc/ikiwiki/directive/aggregate/discussion.mdwn b/doc/ikiwiki/directive/aggregate/discussion.mdwn new file mode 100644 index 000000000..ddece9746 --- /dev/null +++ b/doc/ikiwiki/directive/aggregate/discussion.mdwn @@ -0,0 +1,10 @@ +It would be awesome if table could aggregrate remote CSVs too. I want something like: + + !table file="http://cyclehireapp.com/cyclehirelive/cyclehire.csv" + +> Ok, but that has nothing to do with the aggregate plugin. File a +> [[todo]]? +> +> Anyway, it seems difficult, how would it know when the remote content +> had changed? Aggregate has its cron job support and has time stamps +> in rss feeds to rely on. --[[Joey]] diff --git a/doc/ikiwiki/directive/calendar.mdwn b/doc/ikiwiki/directive/calendar.mdwn index b2ac75b11..cb40f884e 100644 --- a/doc/ikiwiki/directive/calendar.mdwn +++ b/doc/ikiwiki/directive/calendar.mdwn @@ -28,7 +28,7 @@ to display or list pages created in the given time frame. The `ikiwiki-calendar` command can be used to automatically generate the archive pages. It also refreshes the wiki, updating the calendars to highlight the current day. This command is typically run at midnight from -cron. An example crontab: +cron. An example crontab: @@ -40,18 +40,21 @@ An example crontab: "month" or "year". The default is a month view calendar. * `pages` - Specifies the [[ikiwiki/PageSpec]] of pages to link to from the month calendar. Defaults to "*". -* `archivebase` - Configures the base of the archives hierarchy. The - default is "archives". Note that this default can also be overridden +* `archivebase` - Configures the base of the archives hierarchy. + The default is "archives". 
Note that this default can also be overridden for the whole wiki by setting `archivebase` in ikiwiki's setup file. + Calendars link to pages under here, with names like "2010/04" and + "2010". These pages can be automatically created using the + `ikiwiki-calendar` program. * `year` - The year for which the calendar is requested. Defaults to the - current year. + current year. Can also use -1 to refer to last year, and so on. * `month` - The numeric month for which the calendar is requested, in the range 1..12. Used only for the month view calendar, and defaults to the - current month. + current month. Can also use -1 to refer to last month, and so on. * `week_start_day` - A number, in the range 0..6, which represents the day of the week that the month calendar starts with. 0 is Sunday, 1 is Monday, and so on. Defaults to 0, which is Sunday. -* `months_per_row` - In the annual calendar, number of months to place in +* `months_per_row` - In the year calendar, number of months to place in each row. Defaults to 3. [[!meta robots="noindex, follow"]] diff --git a/doc/ikiwiki/directive/comment.mdwn b/doc/ikiwiki/directive/comment.mdwn index 21386dfc3..398130e2e 100644 --- a/doc/ikiwiki/directive/comment.mdwn +++ b/doc/ikiwiki/directive/comment.mdwn @@ -29,10 +29,12 @@ metadata of the comment. nearly any format, since it's parsed by [[!cpan TimeDate]] * `username` - Used to record the username (or OpenID) of a logged in commenter. +* `nickname` - Name to display for a logged in commenter. + (Optional; used for OpenIDs.) * `ip` - Can be used to record the IP address of a commenter, if they posted anonymously. * `claimedauthor` - Records the name that the user entered, - if anonmous commenters are allowed to enter their (unverified) + if anonymous commenters are allowed to enter their (unverified) name. [[!meta robots="noindex, follow"]] diff --git a/doc/ikiwiki/directive/commentmoderation.mdwn b/doc/ikiwiki/directive/commentmoderation.mdwn new file mode 100644 index 000000000..8553b5b17 --- /dev/null +++ b/doc/ikiwiki/directive/commentmoderation.mdwn @@ -0,0 +1,9 @@ +The `commentmoderation` directive is supplied by the +[[!iki plugins/comments desc=comments]] plugin, and is used to link +to the comment moderation queue. + +Example: + + \[[!commentmoderation desc="here is the comment moderation queue"]] + +[[!meta robots="noindex, follow"]] diff --git a/doc/ikiwiki/directive/date.mdwn b/doc/ikiwiki/directive/date.mdwn new file mode 100644 index 000000000..b89241e4c --- /dev/null +++ b/doc/ikiwiki/directive/date.mdwn @@ -0,0 +1,16 @@ +The `date` directive is supplied by the [[!iki plugins/date desc=date]] plugin. + +This directive can be used to display a date on a page, using the same +display method that is used to display the modification date in the page +footer, and other dates in the wiki. This can be useful for consistency +of display, or if you want to embed parseable dates into the page source. + +Like the dates used by the [[meta]] directive, the date can be entered in +nearly any format, since it's parsed by [[!cpan TimeDate]]. 
+ +For example, an update to a page with an embedded date stamp could look +like: + + Updated \[[!date "Wed, 25 Nov 2009 01:11:55 -0500"]]: mumble mumble + +[[!meta robots="noindex, follow"]] diff --git a/doc/ikiwiki/directive/edittemplate.mdwn b/doc/ikiwiki/directive/edittemplate.mdwn index d731bdb47..c486e821b 100644 --- a/doc/ikiwiki/directive/edittemplate.mdwn +++ b/doc/ikiwiki/directive/edittemplate.mdwn @@ -21,7 +21,7 @@ something like: Details: The template page can also contain [[!cpan HTML::Template]] directives, -similar to other ikiwiki [[templates]]. Currently only one variable is +like other ikiwiki [[templates]]. Currently only one variable is set: `<TMPL_VAR name>` is replaced with the name of the page being created. diff --git a/doc/ikiwiki/directive/flattr.mdwn b/doc/ikiwiki/directive/flattr.mdwn new file mode 100644 index 000000000..5083005ce --- /dev/null +++ b/doc/ikiwiki/directive/flattr.mdwn @@ -0,0 +1,45 @@ +The `flattr` directive is supplied by the [[!iki plugins/flattr desc=flattr]] plugin. + +This directive allows easily inserting Flattr buttons onto wiki pages. + +Flattr supports both static buttons and javascript buttons. This directive +only creates dynamic javascript buttons. If you want to insert a static +Flattr button, you can simply copy the html code for it from Flattr, instead. +Note that this directive inserts javascript code into the page, that +loads more javascript code from Flattr.com. So only use it if you feel +comfortable with that. + +The directive can be used to display a button for a thing you have already +manually submitted to Flattr. In this mode, the only parameter you need to +include is the exact url to the thing that was submitted to Flattr. +(If the button is for the current page, you can leave that out.) For +example, this is the Flattr button for ikiwiki. Feel free to add it to all +your pages. ;) + + \[[!flattr url="http://ikiwiki.info/" button=compact]] + +The directive can also be used to create a button that automatically +submits a page to Flattr when a user clicks on it. In this mode you +need to include parameters to specify your uid, and a title, category, tags, +and description for the page. For example, this is a Flattr button for +a blog post: + + \[[!flattr uid=25634 title="my new blog post" category=text + tags="blog,example" description="This is a post on my blog."]] + +Here are all possible parameters you can pass to the Flattr directive. + +* `button` - Set to "compact" for a small button. +* `url` - The url to the thing to be Flattr'd. If omitted, defaults + to the url of the current page. +* `uid` - Your numeric Flattr userid. Not needed if the flattr plugin + has been configured with a global `flattr_userid`. +* `title` - A short title for the thing, to show on its Flattr page. +* `description` - A description of the thing, to show on its Flattr + page. +* `category` - One of: text, images, video, audio, software, rest. +* `tags` - A list of tags separated by a comma. +* `language` - A language code. +* `hidden` - Set to 1 to hide the button from listings on Flattr.com. + +[[!meta robots="noindex, follow"]] diff --git a/doc/ikiwiki/directive/format.mdwn b/doc/ikiwiki/directive/format.mdwn index 23830e9cd..7d11d225f 100644 --- a/doc/ikiwiki/directive/format.mdwn +++ b/doc/ikiwiki/directive/format.mdwn @@ -22,7 +22,7 @@ Note that if the highlight plugin is enabled, this directive can also be used to display syntax highlighted code. Many languages and formats are supported. 
For example: - \[[format perl """ + \[[!format perl """ print "hello, world\n"; """]] diff --git a/doc/ikiwiki/directive/if.mdwn b/doc/ikiwiki/directive/if.mdwn index 2cbf70cdf..492adf499 100644 --- a/doc/ikiwiki/directive/if.mdwn +++ b/doc/ikiwiki/directive/if.mdwn @@ -43,6 +43,8 @@ with the following additional tests: * included() - Tests whether the page is being included onto another page. + Tests whether the page is being included onto another page, for example + via [[inline]] or [[map]]. Note that pages inserted into other pages + via [[template]] are not matched here. [[!meta robots="noindex, follow"]] diff --git a/doc/ikiwiki/directive/img.mdwn b/doc/ikiwiki/directive/img.mdwn index 94cc754bd..cda62b58f 100644 --- a/doc/ikiwiki/directive/img.mdwn +++ b/doc/ikiwiki/directive/img.mdwn @@ -18,7 +18,8 @@ making the image smaller than the specified size. You can also specify only the width or the height, and the other value will be calculated based on it: "200x", "x200" -You can also pass `alt`, `title`, `class`, `align` and `id` parameters. +You can also pass `alt`, `title`, `class`, `align`, `id`, `hspace`, and +`vspace` parameters. These are passed through unchanged to the html img tag. If you include a `caption` parameter, the caption will be displayed centered beneath the image. diff --git a/doc/ikiwiki/directive/inline.mdwn b/doc/ikiwiki/directive/inline.mdwn index c6a23ce3c..22c18d9a1 100644 --- a/doc/ikiwiki/directive/inline.mdwn +++ b/doc/ikiwiki/directive/inline.mdwn @@ -75,6 +75,9 @@ Here are some less often needed parameters: disable generating any feeds. * `emptyfeeds` - Set to "no" to disable generation of empty feeds. Has no effect if `rootpage` or `postform` is set. +* `id` - Set to specify the value of the HTML `id` attribute for the + feed links or the post form. Useful if you have multiple forms in the + same page. * `template` - Specifies the template to fill out to display each inlined page. By default the `inlinepage` template is used, while the `archivepage` template is used for archives. Set this parameter to @@ -116,6 +119,3 @@ Here are some less often needed parameters: in conjunction with this one. [[!meta robots="noindex, follow"]] - -A related directive is the [[ikiwiki/directive/edittemplate]] directive, which allows -default text for a new page to be specified. diff --git a/doc/ikiwiki/directive/inline/discussion.mdwn b/doc/ikiwiki/directive/inline/discussion.mdwn index be0665d04..5489d5f16 100644 --- a/doc/ikiwiki/directive/inline/discussion.mdwn +++ b/doc/ikiwiki/directive/inline/discussion.mdwn @@ -1,3 +1,10 @@ +## Combine inline and toggle + +Is it possible to combine the behaviour of toggle and inline? ie, have it present of list of 'headlines' which are created from seperate subpages which can be clicked to expand to the body of the inlined page. Thanks. + +-- Thiana + +--- ## How do you provide the per post discussion links in your own blog? > That's configured by the "actions" parameter to the inline directive. See @@ -124,3 +131,33 @@ My index page has: Else can you please suggest a smarter way of getting certain data out from pages for a inline index? --[[hendry]] + +## A different idea: smuggling hook routines in through %params. + +The part that fetches the inlined content is quite compact. It's just the if ($needcontent) {} chunk. Would a patch that accepts a perl sub smuggled through something like $params{inliner_} be accepted? If that param exists, call it instead of the current content of that chunk. Pass $page, %params, and $template. 
Receive $content, possibly seeing $template modified. The custom directives can add inliner_ to %params and call IkiWiki::preprocess_inline. I suppose IkiWiki::Plugin::inline could be modified to strip any *_ out of the directive's arguments to prevent any custom behavior from leaking into the inline directive. + +I'm about to try this for a CV/resume type of thing. I want only one element with a specific id out of the generated content (with a little post-processing). I don't need performance for my case. + +Update: Pretty much works. I need a way to skip sources, but inline shrinks the list of all pages *before* trying to form them. Next little bit... + +--[[JasonRiedy]] + +--- + +## Interaction of `show` and `feedshow` + +Reading the documentation I would think that `feedshow` does not +influence `show`. + + \[[!inline pages="./blog/*" archive=yes quick=yes feedshow=10 sort=title reverse=yes]] + +Only ten pages are listed in this example although `archive` is set to +yes. Removing `feedshow=10` all matching pages are shown. + +Is that behaviour intended? + +> Is something going wrong because `quick="yes"` [[»turns off generation of any feeds«|inline]]? --[[PaulePanter]] + +--[[PaulePanter]] + +>> Bug was that if feedshow was specified without show it limited to it incorrectly. Fixed. --[[Joey]] diff --git a/doc/ikiwiki/directive/linkmap.mdwn b/doc/ikiwiki/directive/linkmap.mdwn index 38cf0fd11..baa6fff61 100644 --- a/doc/ikiwiki/directive/linkmap.mdwn +++ b/doc/ikiwiki/directive/linkmap.mdwn @@ -7,9 +7,7 @@ graph showing the links between a set of pages in the wiki. Example usage: Only links between mapped pages will be shown; links pointing to or from unmapped pages will be omitted. If the pages to include are not specified, -the links between all pages (and other files) in the wiki are mapped. For -best results, only a small set of pages should be mapped, since otherwise -the map can become very large, unwieldy, and complicated. +the links between all pages (and other files) in the wiki are mapped. Here are descriptions of all the supported parameters to the `linkmap` directive: @@ -18,5 +16,14 @@ directive: * `height`, `width` - Limit the size of the map to a given height and width, in inches. Both must be specified for the limiting to take effect, otherwise the map's size is not limited. +* `connected` - Controls whether to include pages on the map that link to + no other pages (connected=no, the default), or to only show pages that + link to others (connected=yes). + +For best results, only a small set of pages should be mapped, since +otherwise the map can become very large, unwieldy, and complicated. +If too many pages are included, the map may get so large that graphviz +cannot render it. Using the `connected` parameter is a good way to prune +out pages that clutter the map. [[!meta robots="noindex, follow"]] diff --git a/doc/ikiwiki/directive/map.mdwn b/doc/ikiwiki/directive/map.mdwn index 09c95a0c9..4b6499547 100644 --- a/doc/ikiwiki/directive/map.mdwn +++ b/doc/ikiwiki/directive/map.mdwn @@ -13,6 +13,8 @@ the [[meta]] directive). 
For example: \[[!map pages="* and !blog/* and !*/Discussion" show=title]] + \[[!map pages="* and !blog/* and !*/Discussion" show=description]] + Hint: To limit the map to displaying pages less than a certain level deep, use a [[ikiwiki/PageSpec]] like this: `pages="* and !*/*/*"` diff --git a/doc/ikiwiki/directive/map/discussion.mdwn b/doc/ikiwiki/directive/map/discussion.mdwn index 062b4267a..b7ac17b1a 100644 --- a/doc/ikiwiki/directive/map/discussion.mdwn +++ b/doc/ikiwiki/directive/map/discussion.mdwn @@ -1,3 +1,14 @@ +### Sorting + +Is there a way to have the generated maps sorted by *title* instead of *filename* when show=title is used? +Thanks + +-- Thiana + +> [[bugs/map_sorts_by_pagename_and_not_title_when_show__61__title_is_used]] --[[Joey]] + +---- + Question: Is there a way to generate a listing that shows *both* title and description meta information? Currently, a \[\[!map ...]] shows only one of the two, but I'd like to generate a navigation that looks like a description list. For example: * This is the title meta information. @@ -72,3 +83,15 @@ Is there any way to do that? I don't mind mucking around with `\[[!meta]]` on e >>> I think that the ideas and code in >>> [[todo/tracking_bugs_with_dependencies]] might also handle this case. >>> --[[Joey]] + +---- + +I feel like this should be obvious, but I can't figure out how to sort numerically. + +I have `map pages="./* and !*/Discussion and !*/sidebar"` and a bunch of pages with names like 1, 2, 3, 11, 12, 1/1.1, 12/12.3 etc. I want to sort them numerically. I see lots of conversation implying there's a simple way to do it, but not how. + +> No, you can't: map can't currently use a non-default sort order. If it +> could, then you could use [[plugins/sortnaturally]]. There's a +> [[feature_request|todo/sort_parameter_for_map_plugin_and_directive]]; +> [[a_bug_references_it|bugs/map_sorts_by_pagename_and_not_title_when_show=title_is_used]]. +> --[[smcv]] diff --git a/doc/ikiwiki/directive/meta.mdwn b/doc/ikiwiki/directive/meta.mdwn index 000f461c9..6b381f138 100644 --- a/doc/ikiwiki/directive/meta.mdwn +++ b/doc/ikiwiki/directive/meta.mdwn @@ -7,7 +7,8 @@ Enter the metadata as follows: \[[!meta field="value" param="value" param="value"]] The first form sets a given field to a given value, while the second form -also specifies some additional sub-parameters. +also specifies some additional sub-parameters. You can have only one field +per `meta` directive, use more directives if you want to specify more fields. The field values are treated as HTML entity-escaped text, so you can include a quote in the text by writing `"` and so on. @@ -23,6 +24,13 @@ Supported fields: be set to a true value in the template; this can be used to format things differently in this case. + An optional `sortas` parameter will be used preferentially when + [[ikiwiki/pagespec/sorting]] by `meta(title)`: + + \[[!meta title="The Beatles" sortas="Beatles, The"]] + + \[[!meta title="David Bowie" sortas="Bowie, David"]] + * license Specifies a license for the page, for example, "GPL". Can contain @@ -37,14 +45,19 @@ Supported fields: Specifies the author of a page. + An optional `sortas` parameter will be used preferentially when + [[ikiwiki/pagespec/sorting]] by `meta(author)`: + + \[[!meta author="Joey Hess" sortas="Hess, Joey"]] + * authorurl Specifies an url for the author of a page. * description - Specifies a "description" of the page. You could use this to provide - a summary, for example, to be picked up by the [[map]] directive. 
+ Specifies a short description for the page. This will be put in + the html header, and can also be displayed by eg, the [[map]] directive. * permalink @@ -64,6 +77,21 @@ Supported fields: \[[!meta stylesheet=somestyle rel="alternate stylesheet" title="somestyle"]] + + However, this will be scrubbed away if the + [[!iki plugins/htmlscrubber desc=htmlscrubber]] plugin is enabled, + since it can be used to insert unsafe content. + +* script + + Adds a script to a page. The script is treated as a wiki link to + a `.js` file in the wiki, so it cannot be used to add links to external + scripts. The optional `defer` and `async` keywords can be used to set + the corresponding HTML4 and HTML5 script options. Example: + + \[[!meta script=somescript defer async]] + + The tag is subject to scrubbing as with the stylesheet and link fields. * openid @@ -79,7 +107,7 @@ Supported fields: Example: - \\[[!meta openid="http://joeyh.myopenid.com/" + \[[!meta openid="http://joeyh.myopenid.com/" server="http://www.myopenid.com/server" xrds-location="http://www.myopenid.com/xrds?username=joeyh.myopenid.com""]] @@ -153,6 +181,15 @@ Supported fields: value. The date/time can be given in any format that [[!cpan TimeDate]] can understand, just like the `date` field. +* foaf + + Adds a Friend of a Friend ([FOAF](http://wiki.foaf-project.org/w/Autodiscovery)) + reference to a page. + + Example: + + \[[!meta foaf=foaf.rdf]] + If the field is not one of the above predefined fields, the metadata will be written to the generated html page as a <meta> header. However, this won't be allowed if the [[!iki plugins/htmlscrubber desc=htmlscrubber]] plugin is enabled, diff --git a/doc/ikiwiki/directive/more.mdwn b/doc/ikiwiki/directive/more.mdwn index 506551910..bda1427f3 100644 --- a/doc/ikiwiki/directive/more.mdwn +++ b/doc/ikiwiki/directive/more.mdwn @@ -11,6 +11,11 @@ leads to the full version of the page. Use it like this: If the `linktext` parameter is omitted it defaults to just "more". +An optional `pages` parameter can be used to specify a +[[ikiwiki/PageSpec]], and then the "more" link will only be displayed +when the page is inlined into a page matching that PageSpec, and otherwise +the full content shown. + Note that you can accomplish something similar using a [[toggle]] instead. [[!meta robots="noindex, follow"]] diff --git a/doc/ikiwiki/directive/pagestats.mdwn b/doc/ikiwiki/directive/pagestats.mdwn index f14c80b07..8d904f5a3 100644 --- a/doc/ikiwiki/directive/pagestats.mdwn +++ b/doc/ikiwiki/directive/pagestats.mdwn @@ -4,10 +4,16 @@ This directive can generate stats about how pages link to each other. It can produce either a tag cloud, or a table counting the number of links to each page. -Here's how to use it to create a [[tag]] cloud: +Here's how to use it to create a [[tag]] cloud, with tags sized based +on frequency of use: \[[!pagestats pages="tags/*"]] +Here's how to create a list of tags, sized by use as they would be in a +cloud. + + \[[!pagestats style="list" pages="tags/*"]] + And here's how to create a table of all the pages on the wiki: \[[!pagestats style="table"]] @@ -20,6 +26,15 @@ entries, while ignoring other pages that use those tags, you could use: Or to display a cloud of tags related to Linux, you could use: - \[[!pagestats pages="tags/* and not tags/linux" among="tagged(linux)"]] + \[[!pagestats pages="tags/* and !tags/linux" among="tagged(linux)"]] + +The optional `show` parameter limits display to the specified number of +pages. 
For instance, to show a table of the top ten pages with the most +links: + + \[[!pagestats style="table" show="10"]] + +The optional `class` parameter can be used to control the class +of the generated tag cloud `div` or page stats `table`. [[!meta robots="noindex, follow"]] diff --git a/doc/ikiwiki/directive/pagestats/discussion.mdwn b/doc/ikiwiki/directive/pagestats/discussion.mdwn index 3c9dc7104..99029e88e 100644 --- a/doc/ikiwiki/directive/pagestats/discussion.mdwn +++ b/doc/ikiwiki/directive/pagestats/discussion.mdwn @@ -7,4 +7,12 @@ I would rather not find and create a page for every tag I have created or will c Thanks ----- +> Hello unknown person. + +> I think it would require a different approach to what "tags" are, and/or what "pagestats" are. The pagestats plugin gives statistical information about *pages*, so it requires the pages in question to exist before it can get information about them. The tags plugin creates links to tag *pages*, with the expectation that a human being will create said pages and put whatever content they want on them (such as describing what the tag is about, and a map linking back to the tagged pages). + +> The approach that [PmWiki](http://www.pmwiki.org) takes is that it enables the optional auto-creation of (empty) pages which match a particular "group" (set of sub-pages); thus one could set all the "tags/*" pages to be auto-created, creating a new tags/foo page the first time the \[[!tag foo]] directive is used. See [[todo/auto-create_tag_pages_according_to_a_template]] for more discussion on this idea. +> -- [[KathrynAndersen]] + +> Update: Ikiwiki can auto-create tags now, though it only defaults to +> doing so when tagbase is set. --[[Joey]] diff --git a/doc/ikiwiki/directive/pagetemplate.mdwn b/doc/ikiwiki/directive/pagetemplate.mdwn index 8ad901c1a..401b38099 100644 --- a/doc/ikiwiki/directive/pagetemplate.mdwn +++ b/doc/ikiwiki/directive/pagetemplate.mdwn @@ -1,17 +1,13 @@ The `pagetemplate` directive is supplied by the [[!iki plugins/pagetemplate desc=pagetemplate]] plugin. -This directive allows a page to be displayed using a different template than -the default `page.tmpl` template. +This directive allows a page to be displayed using a different +[[template|templates]] than the default `page.tmpl` template. The page text is inserted into the template, so the template controls the overall look and feel of the wiki page. This is in contrast to the [[ikiwiki/directive/template]] directive, which allows inserting templates _into_ the body of a page. -This directive can only reference templates that are already installed -by the system administrator, typically into the -`/usr/share/ikiwiki/templates` directory. Example: - \[[!pagetemplate template="my_fancy.tmpl"]] [[!meta robots="noindex, follow"]] diff --git a/doc/ikiwiki/directive/sidebar.mdwn b/doc/ikiwiki/directive/sidebar.mdwn new file mode 100644 index 000000000..599695d22 --- /dev/null +++ b/doc/ikiwiki/directive/sidebar.mdwn @@ -0,0 +1,20 @@ +The `sidebar` directive is supplied by the [[!iki plugins/sidebar desc=sidebar]] plugin. + +This directive can specify a custom sidebar to display on the page, +overriding any sidebar that is displayed globally. + +If no custom sidebar content is specified, it forces the sidebar page to +be used as the sidebar, even if the `global_sidebars` setting has been +used to disable use of the sidebar page by default. + +## examples + + \[[!sidebar content=""" + This is my custom sidebar for this page. 
+ + \[[!calendar pages="posts/*"]] + """]] + + \[[!sidebar]] + +[[!meta robots="noindex, follow"]] diff --git a/doc/ikiwiki/directive/table.mdwn b/doc/ikiwiki/directive/table.mdwn index e27a94b7c..a6692f92c 100644 --- a/doc/ikiwiki/directive/table.mdwn +++ b/doc/ikiwiki/directive/table.mdwn @@ -6,8 +6,8 @@ or DSV (delimiter-separated values) format. ## examples \[[!table data=""" - Customer|Amount - Fulanito|134,34 + Customer |Amount + Fulanito |134,34 Menganito|234,56 Menganito|234,56 """]] @@ -42,4 +42,9 @@ cells. For example: as the table header. Set it to "no" to make a table without a header, or "column" to make the first column be the header. +For tab-delimited tables (often obtained by copying and pasting from HTML +or a spreadsheet), `delimiter` must be set to a literal tab character. These +are difficult to type in most web browsers - copying and pasting one from +the table data is likely to be the easiest way. + [[!meta robots="noindex, follow"]] diff --git a/doc/ikiwiki/directive/table/discussion.mdwn b/doc/ikiwiki/directive/table/discussion.mdwn new file mode 100644 index 000000000..87d2e0cd1 --- /dev/null +++ b/doc/ikiwiki/directive/table/discussion.mdwn @@ -0,0 +1 @@ +The problem I have in my tables, is that some fields contain example HTML that needs to be escaped. diff --git a/doc/ikiwiki/directive/tag.mdwn b/doc/ikiwiki/directive/tag.mdwn index 64736f8cd..c8d9b9816 100644 --- a/doc/ikiwiki/directive/tag.mdwn +++ b/doc/ikiwiki/directive/tag.mdwn @@ -19,7 +19,8 @@ instead: Note that if the wiki is configured to use a tagbase, then the tags will be located under a base directory, such as "tags/". This is a useful way to avoid having to write the full path to tags, if you want to keep them -grouped together out of the way. +grouped together out of the way. Also, since ikiwiki then knows where to put +tags, it will automatically create tag pages when new tags are used. Bear in mind that specifying a tagbase means you will need to incorporate it into the `link()` [[ikiwiki/PageSpec]] you use: e.g., if your tagbase is @@ -28,7 +29,7 @@ into the `link()` [[ikiwiki/PageSpec]] you use: e.g., if your tagbase is If you want to override the tagbase for a particular tag, you can use something like this: - \[[!tag ./foo]] + \[[!tag /foo]] \[[!taglink /foo]] [[!meta robots="noindex, follow"]] diff --git a/doc/ikiwiki/directive/template.mdwn b/doc/ikiwiki/directive/template.mdwn index d538b69b1..9e3ae54df 100644 --- a/doc/ikiwiki/directive/template.mdwn +++ b/doc/ikiwiki/directive/template.mdwn @@ -1,18 +1,91 @@ The `template` directive is supplied by the [[!iki plugins/template desc=template]] plugin. -[[Templates]] are files that can be filled out and inserted into pages in the -wiki, by using the template directive. The directive has an `id` parameter +The template directive allows wiki pages to be used as templates. +These templates can be filled out and inserted into other pages in the +wiki using the directive. The [[templates]] page lists templates +that can be used with this directive. + +The directive has an `id` parameter that identifies the template to use. The remaining parameters are used to fill out the template. -Example: +## Example \[[!template id=note text="""Here is the text to insert into my note."""]] This fills out the `note` template, filling in the `text` field with the specified value, and inserts the result into the page. -For a list of available templates, and details about how to create more, -see the [[templates]] page. 
+## Using a template + +Generally, a value can include any markup that would be allowed in the wiki +page outside the template. Triple-quoting the value even allows quotes to +be included in it. Combined with multi-line quoted values, this allows for +large chunks of marked up text to be embedded into a template: + + \[[!template id=foo name="Sally" color="green" age=8 notes=""" + * \[[Charley]]'s sister. + * "I want to be an astronaut when I grow up." + * Really 8 and a half. + """]] + +## Creating a template + +The template is a regular wiki page, located in the `templates/` +subdirectory inside the source directory of the wiki. + +Alternatively, templates can be stored in a directory outside the wiki, +as files with the extension ".tmpl". +By default, these are searched for in `/usr/share/ikiwiki/templates`, +the `templatedir` setting can be used to make another directory be searched +first. When referring to templates outside the wiki source directory, the "id" +parameter is not interpreted as a pagespec, and you must include the full filename +of the template page, including the ".tmpl" extension. E.g.: + + \[[!template id=blogpost.tmpl]] + +The template uses the syntax used by the [[!cpan HTML::Template]] perl +module, which allows for some fairly complex things to be done. Consult its +documentation for the full syntax, but all you really need to know are a +few things: + +* Each parameter you pass to the template directive will generate a + template variable. There are also some pre-defined variables like PAGE + and BASENAME. +* To insert the value of a variable, use `<TMPL_VAR variable>`. Wiki markup + in the value will first be converted to html. +* To insert the raw value of a variable, with wiki markup not yet converted + to html, use `<TMPL_VAR raw_variable>`. +* To make a block of text conditional on a variable being set use + `<TMPL_IF variable>text</TMPL_IF>`. +* To use one block of text if a variable is set and a second if it's not, + use `<TMPL_IF variable>text<TMPL_ELSE>other text</TMPL_IF>` + +Here's a sample template: + + <span class="infobox"> + Name: \[[<TMPL_VAR raw_name>]]<br /> + Age: <TMPL_VAR age><br /> + <TMPL_IF color> + Favorite color: <TMPL_VAR color><br /> + <TMPL_ELSE> + No favorite color.<br /> + </TMPL_IF> + <TMPL_IF notes> + <hr /> + <TMPL_VAR notes> + </TMPL_IF> + </span> + +The filled out template will be formatted the same as the rest of the page +that contains it, so you can include WikiLinks and all other forms of wiki +markup in the template. Note though that such WikiLinks will not show up as +backlinks to the page that uses the template. + +Note the use of "raw_name" inside the [[ikiwiki/WikiLink]] generator in the +example above. This ensures that if the name contains something that might +be mistaken for wiki markup, it's not converted to html before being +processed as a [[ikiwiki/WikiLink]]. + [[!meta robots="noindex, follow"]] diff --git a/doc/ikiwiki/directive/toc.mdwn b/doc/ikiwiki/directive/toc.mdwn index bf504dafc..bb1afa1ac 100644 --- a/doc/ikiwiki/directive/toc.mdwn +++ b/doc/ikiwiki/directive/toc.mdwn @@ -14,6 +14,12 @@ the `levels` parameter: The toc directive will take the level of the first header as the topmost level, even if there are higher levels seen later in the file. +To create a table of contents that only shows headers starting with a given +level, use the `startlevel` parameter. For example, to show only h2 and +smaller headers: + + \[[!toc startlevel=2]] + The table of contents will be created as an ordered list. 
If you want an unordered list instead, you can change the list-style in your local style sheet. diff --git a/doc/ikiwiki/openid.mdwn b/doc/ikiwiki/openid.mdwn index a79655284..2fa972ede 100644 --- a/doc/ikiwiki/openid.mdwn +++ b/doc/ikiwiki/openid.mdwn @@ -9,16 +9,10 @@ that allows you to have one login that you can use on a growing number of websites. -To sign up for an OpenID, visit one of the following identity providers: +If you have an account with some of the larger web service providers, +you might already have an OpenID. +[Directory of OpenID providers](http://openiddirectory.com/openid-providers-c-1.html) -* [MyOpenID](https://www.myopenid.com/) -* [GetOpenID](https://getopenid.com/) -* [Videntity](http://videntity.org/) -* [LiveJournal](http://www.livejournal.com/openid/) -* [TrustBearer](https://openid.trustbearer.com/) -* or any of the [many others out there](http://openiddirectory.com/openid-providers-c-1.html) - -Your OpenID is the URL that you are given when you sign up. [[!if test="enabled(openid)" then=""" To sign in to this wiki using OpenID, just enter it in the OpenID field in the signin form. You do not need to give this wiki a password or go through any diff --git a/doc/ikiwiki/pagespec.mdwn b/doc/ikiwiki/pagespec.mdwn index 5f0f44e2e..fe1af4c15 100644 --- a/doc/ikiwiki/pagespec.mdwn +++ b/doc/ikiwiki/pagespec.mdwn @@ -24,31 +24,36 @@ match all pages except for Discussion pages and the SandBox: Some more elaborate limits can be added to what matches using these functions: +* "`glob(someglob)`" - matches pages and other files that match the given glob. + Just writing the glob by itself is actually a shorthand for this function. +* "`page(glob)`" - like `glob()`, but only matches pages, not other files * "`link(page)`" - matches only pages that link to a given page (or glob) * "`tagged(tag)`" - matches pages that are tagged or link to the given tag (or tags matched by a glob) * "`backlink(page)`" - matches only pages that a given page links to -* "`creation_month(month)`" - matches only pages created on the given month +* "`creation_month(month)`" - matches only files created on the given month + number * "`creation_day(mday)`" - or day of the month * "`creation_year(year)`" - or year -* "`created_after(page)`" - matches only pages created after the given page +* "`created_after(page)`" - matches only files created after the given page was created -* "`created_before(page)`" - matches only pages created before the given page +* "`created_before(page)`" - matches only files created before the given page was created -* "`glob(someglob)`" - matches pages that match the given glob. Just writing - the glob by itself is actually a shorthand for this function. * "`internal(glob)`" - like `glob()`, but matches even internal-use pages that globs do not usually match. * "`title(glob)`", "`author(glob)`", "`authorurl(glob)`", - "`license(glob)`", "`copyright(glob)`" - match pages that have the given - metadata, matching the specified glob. + "`license(glob)`", "`copyright(glob)`", "`guid(glob)`" + - match pages that have the given metadata, matching the specified glob. * "`user(username)`" - tests whether a modification is being made by a user with the specified username. If openid is enabled, an openid can also - be put here. + be put here. Glob patterns can be used in the username. For example, + to match all openid users, use `user(*://*)` * "`admin()`" - tests whether a modification is being made by one of the wiki admins. 
* "`ip(address)`" - tests whether a modification is being made from the specified IP address. +* "`comment(glob)`" - matches comments to a page matching the glob. +* "`comment_pending(glob)`" - matches unmoderated, pending comments. * "`postcomment(glob)`" - matches only when comments are being posted to a page matching the specified glob @@ -74,3 +79,7 @@ filenames of the pages in the wiki, so a pagespec "foo" used on page "a/b" will not match a page named "a/foo" or "a/b/foo". To match relative to the directory of the page containing the pagespec, you can use "./". For example, "./foo" on page "a/b" matches page "a/foo". + +To indicate the name of the page the PageSpec is used in, you can +use a single dot. For example, `link(.)` matches all the pages +linking to the page containing the PageSpec. diff --git a/doc/ikiwiki/pagespec/attachment.mdwn b/doc/ikiwiki/pagespec/attachment.mdwn index 419f00ee4..fa2bc5867 100644 --- a/doc/ikiwiki/pagespec/attachment.mdwn +++ b/doc/ikiwiki/pagespec/attachment.mdwn @@ -7,11 +7,12 @@ If attachments are enabled, the wiki admin can control what types of attachments will be accepted, via the `allowed_attachments` configuration setting. -For example, to limit arbitrary files to 50 kilobytes, but allow -larger mp3 files to be uploaded by joey into a specific directory, and -check all attachments for viruses, something like this could be used: +For example, to limit most users to uploading small images, and nothing else, +while allowing larger mp3 files to be uploaded by joey into a specific +directory, and check all attachments for viruses, something like this could be +used: - virusfree() and ((user(joey) and podcast/*.mp3 and mimetype(audio/mpeg) and maxsize(15mb)) or (!ispage() and maxsize(50kb))) + virusfree() and ((user(joey) and podcast/*.mp3 and mimetype(audio/mpeg) and maxsize(15mb)) or (mimetype(image/*) and maxsize(50kb))) The regular [[ikiwiki/PageSpec]] syntax is expanded with the following additional tests: diff --git a/doc/ikiwiki/pagespec/discussion.mdwn b/doc/ikiwiki/pagespec/discussion.mdwn index 4eed3722c..4c553925a 100644 --- a/doc/ikiwiki/pagespec/discussion.mdwn +++ b/doc/ikiwiki/pagespec/discussion.mdwn @@ -92,3 +92,14 @@ does not seem suitable for this, as > \[[!map pages="./*"]] also lists the current page and all its siblings. + +--- + +I am a little lost. I want to match the start page `/index.mdwn`. So I use + + \[[!inline pages="/index"]] + +which does not work though. I also tried it in this Wiki. Just take a look at the end of the [[SandBox|sandbox]]. --[[PaulePanter]] + +> Unlike wikilinks, pagespecs match relative to the top of the wiki by +> default. So lose the "/" and it will work. --[[Joey]] diff --git a/doc/ikiwiki/pagespec/po.mdwn b/doc/ikiwiki/pagespec/po.mdwn index e0264dd50..f9956404c 100644 --- a/doc/ikiwiki/pagespec/po.mdwn +++ b/doc/ikiwiki/pagespec/po.mdwn @@ -11,6 +11,13 @@ wiki: specified as a ISO639-1 (two-letter) language code. * "`currentlang()`" - tests whether a page is written in the same language as the current page. +* "`needstranslation()`" - tests whether a page needs translation + work. Only slave pages match this PageSpec. A minimum target + translation percentage can optionally be passed as an integer + parameter: "`needstranslation(50)`" matches only pages less than 50% + translated. Note that every non-po page is considered to be written in `po_master_language`, as specified in `ikiwiki.setup`. 
+ +[[!meta robots="noindex, follow"]] diff --git a/doc/ikiwiki/pagespec/sorting.mdwn b/doc/ikiwiki/pagespec/sorting.mdwn index 41aa58151..ccd7f7eaa 100644 --- a/doc/ikiwiki/pagespec/sorting.mdwn +++ b/doc/ikiwiki/pagespec/sorting.mdwn @@ -4,8 +4,23 @@ specifying the order that matching pages are shown in. The following sort orders can be specified. * `age` - List pages from the most recently created to the oldest. + * `mtime` - List pages with the most recently modified first. -* `title` - Order by title. -* `title_natural` - Only available if [[!cpan Sort::Naturally]] is - installed. Orders by title, but numbers in the title are treated + +* `title` - Order by title (page name). +[[!if test="enabled(sortnaturally)" then=""" +* `title_natural` - Orders by title, but numbers in the title are treated as such, ("1 2 9 10 20" instead of "1 10 2 20 9") +"""]] +[[!if test="enabled(meta)" then=""" +* `meta(title)` - Order according to the `\[[!meta title="foo" sortas="bar"]]` + or `\[[!meta title="foo"]]` [[ikiwiki/directive]], or the page name if no + full title was set. `meta(author)`, `meta(date)`, `meta(updated)`, etc. + also work. +"""]] + +In addition, you can combine several sort orders and/or reverse the order of +sorting, with a string like `age -title` (which would sort by age, then by +title in reverse order if two pages have the same age). + +[[!meta robots="noindex, follow"]] diff --git a/doc/ikiwiki/subpage.mdwn b/doc/ikiwiki/subpage.mdwn index e047b860c..862f45ec1 100644 --- a/doc/ikiwiki/subpage.mdwn +++ b/doc/ikiwiki/subpage.mdwn @@ -5,8 +5,8 @@ this page, [[SubPage]] has some related pages placed under it, like wiki rather than just having a great big directory full of pages. To add a SubPage, just make a subdirectory and put pages in it. For -example, this page is SubPage.mdwn in this wiki's source, and there is also -a SubPage subdirectory, which contains SubPage/LinkingRules.mdwn. Subpages +example, this page is subpage.mdwn in this wiki's source, and there is also +a subpage subdirectory, which contains subpage/linkingrules.mdwn. Subpages can be nested as deeply as you'd like. Linking to and from a SubPage is explained in [[LinkingRules]]. diff --git a/doc/ikiwiki/wikilink.mdwn b/doc/ikiwiki/wikilink.mdwn index f561d5850..cf3b89c76 100644 --- a/doc/ikiwiki/wikilink.mdwn +++ b/doc/ikiwiki/wikilink.mdwn @@ -9,9 +9,6 @@ wikilink, just prefix it with a `\`, like `\\[[WikiLink]]`. There are some special [[SubPage/LinkingRules]] that come into play when linking between [[SubPages|SubPage]]. -Also, if the file linked to by a WikiLink looks like an image, it will -be displayed inline on the page. - WikiLinks are matched with page names in a case-insensitive manner, so you don't need to worry about getting the case the same, and can capitalise links at the start of a sentence, and so on. @@ -23,14 +20,10 @@ page, but the link will appear like this: [[foo_bar|SandBox]]. To link to an anchor inside a page, you can use something like `\[[WikiLink#foo]]` . -## Directives and WikiLinks - -ikiwiki has two syntaxes for -[[directives|directive]]. The older syntax -used spaces to distinguish between directives and -wikilinks; as a result, with that syntax in use, you cannot use spaces -in WikiLinks, and must replace spaces with underscores. The newer -syntax, enabled with the `prefix_directives` option in an ikiwiki -setup file, prefixes directives with `!`, and thus does not prevent -links with spaces. Future versions of ikiwiki will turn this option -on by default. 
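+To give one more illustrative example (the page name and anchor here are
+made up), custom link text and an anchor can be combined in a single
+WikiLink:
+
+    \[[installation_notes|SandBox#notes]]
+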
+If the file linked to by a WikiLink looks like an image, it will +be displayed inline on the page. + +--- + +You can also put an url in a WikiLink, to link to an external page. +Email addresses can also be used to generate a mailto link. diff --git a/doc/ikiwiki/wikilink/discussion.mdwn b/doc/ikiwiki/wikilink/discussion.mdwn index b146c9447..89affc502 100644 --- a/doc/ikiwiki/wikilink/discussion.mdwn +++ b/doc/ikiwiki/wikilink/discussion.mdwn @@ -43,6 +43,11 @@ BTW, ikiwiki doesn't displays the #foo anchor in the example >> Fixed that --[[Joey]] +The 'name' attribute of the 'a' element is a depracated way to create a named anchor. The right way to do that is using the 'id' attribute of any element. This is because an anchor may refer to a complete element rather than some point in the page. + +Standard purity aside, if you define an anchor (using either 'a name' or 'id') to a single point in the document but refer to a complete section, the browser may just show that specific point at the bottom of the page rather than trying to show all the section. +--[[tzafrir]] + --- Considering a hierarchy like `foo/bar/bar`, I had the need to link from the @@ -79,3 +84,8 @@ Is it possible to refer to a page, say \[[foobar]], such that the link text is t > Not yet. :-) Any suggestion for a syntax for it? Maybe something like \[[|foobar]] ? --[[Joey]] I like your suggestion because it's short and conscise. However, it would be nice to be able to refer to more or less arbitrary meta tags in links, not just "title". To do that, the link needs two parameters: the page name and the tag name, i.e. \[[pagename!metatag]]. Any sufficiently weird separater can be used instead of '!', of course. I like \[[pagename->metatag]], too, because it reminds me of accessing a data member of a structure (which is what referencing a meta tag is, really). --Peter + +> I dislike \[[pagename->metatag]] because other wikis use that as their normal link/label syntax. +> I'm not sure that it is a good idea to refer to arbitrary meta tags in links in the first place - what other meta tags would you really be interested in? Description? Author? It makes sense to me to refer to the title, because that is a "label" for a page. +> As for syntax, I do like the \[[|foobar]] idea, or perhaps something like what <a href="http://www.pmwiki.org">PmWiki</a> does - they have their links the other way around, so they go \[[page|label]] and for link-text-as-title, they have \[[page|+]]. So for IkiWiki, that would be \[[+|page]] I guess. +> --[[KathrynAndersen]] diff --git a/doc/ikiwikiusers.mdwn b/doc/ikiwikiusers.mdwn index a98abf578..659c26cb4 100644 --- a/doc/ikiwikiusers.mdwn +++ b/doc/ikiwikiusers.mdwn @@ -1,16 +1,33 @@ +General information +=================== + +Feel free to add your own ikiwiki site! In case you have created a custom theme consider adding it to [the theme list](http://ikiwiki.info/themes/) + +See also: [Debian ikiwiki popcon graph](http://qa.debian.org/popcon.php?package=ikiwiki) +and [google search for ikiwiki powered sites](http://www.google.com/search?q=%22powered%20by%20ikiwiki%22). + +While nothing makes me happier than knowing that ikiwiki has happy users, +dropping some change in the [[TipJar]] is a nice way to show extra +appreciation. + +Ikiwiki Hosting +=============== + +* [Branchable](http://branchable.com/) + Projects & Organizations ======================== * [This wiki](http://ikiwiki.info) (of course!) 
+<!-- * [NetBSD wiki](http://wiki.netbsd.org) --> * The [GNU Hurd](http://www.gnu.org/software/hurd/) * [DragonFly BSD](http://www.dragonflybsd.org/) -* [Monotone](http://monotone.ca/wiki/FrontPage/) +* [Monotone](http://wiki.monotone.ca/) * The [Free Software Foundation](http://fsf.org) uses it for their internal wiki, with subversion. * The [cairo graphics library](http://cairographics.org/) website. * The [Portland State Aerospace Society](http://psas.pdx.edu) website. Converted from a combination of TWiki and MoinMoin to ikiwiki, including full history ([[rcs/Git]] backend). * [Planet Debian upstream](http://updo.debian.net/) * [Debian Mentors wiki](http://jameswestby.net/mentors/) -* The [Sparse wiki](http://kernel.org/pub/linux/kernel/people/josh/sparse). * [The BSD Associate Admin Book Project](http://bsdwiki.reedmedia.net/) * The [maildirman wiki](http://svcs.cs.pdx.edu/maildirman) * The [linuxbierwanderung wiki/homepage](http://www.linuxbierwanderung.org) @@ -25,7 +42,11 @@ Projects & Organizations * The [libkdtree project](http://libkdtree.alioth.debian.org) * The [pcc](http://pcc.ludd.ltu.se/) (Portable C Compiler) project. (Simple rcs backend) * [The TOVA Company](http://www.tovatest.com) public site. We also use it for internal documentation and issue tracking, all with a [[rcs/Git]] backend. -* Technical support websites for [Homebase](http://support.homebase.dk) and [Kaospilotene](http://support.kaospilot.no) (each with [source](http://source.homebase.dk/) [provided](http://source.kaospilot.no/)) +* Reusable technical support websites, developed for [Redpill](http://redpill.dk/) realms: + * [master demo site](http://support.redpill.dk/) ([source](http://source.redpill.dk/)) + * [Homebase](http://support.homebase.dk/) ([source](http://source.homebase.dk/)) + * [Bitbase](http://support.bitbase.dk/) ([source](http://source.bitbase.dk/)) + * [Børneuniversitetet](http://support.borneuni.dk/) ([source](http://source.borneuni.dk/)) * [CampusGrün Hamburg](http://www.campusgruen.org/) * The [awesome window manager homepage](http://awesome.naquadah.org/) * [Enemies of Carlotta](http://www.e-o-c.org/) @@ -40,13 +61,29 @@ Projects & Organizations * [Chaos Computer Club Düsseldorf](https://www.chaosdorf.de) * [monkeysphere](http://web.monkeysphere.info/) * [The Walden Effect](http://www.waldeneffect.org/) -* The support pages for [Trinity Centre for High Performance Computing](http://www.tchpc.tcd.ie/support/) * [St Hugh of Lincoln Catholic Primary School in Surrey](http://www.sthugh-of-lincoln.surrey.sch.uk/) -* [Pigro Network](http://www.pigro.net) is running a hg based ikiwiki. (And provides ikiwiki hosting for $10/m.) * [Cosin Homepage](http://cosin.ch) uses an Ikiwiki with a subversion repository. * [Bosco Free Orienteering Software](http://bosco.durcheinandertal.ch) * [MIT Student Information Processing Board](http://sipb.mit.edu/) * [Tinc VPN](http://tinc-vpn.org/) +* [The XCB library](http://xcb.freedesktop.org/) +* [The Philolexian Society of Columbia University](http://www.columbia.edu/cu/philo/) +* [Fachschaft Informatik HU Berlin](http://fachschaft.informatik.hu-berlin.de/) +* [Wetknee Books](http://www.wetknee.com/) +* [IPOL Image Processing On Line](http://www.ipol.im) +* [Debian Costa Rica](http://cr.debian.net/) +* [Fvwm Wiki](http://fvwmwiki.xteddy.org) +* [Serialist](http://serialist.net/)'s static pages (documentation, blog). 
We actually have ikiwiki generate its static content as HTML fragments using a modified page.tmpl template, and then the FastCGI powering our site grabs those fragments and embeds them in the standard dynamic site template. +* [Apua IT](http://apua.se/) +* [PDFpirate Community](http://community.pdfpirate.org/) +* [Banu](https://banu.com/) uses Ikiwiki for its website, to convert static Markdown pages into PHP scripts which are served along with non-Ikiwiki PHP generated contents. The static contents benefit from use of Ikiwiki's plugins. Ikiwiki is purely used as a CMS and no wiki or web-based editing is allowed. Ikiwiki is run offline, and the resulting scripts are uploaded using rsync to the website. +* [Software in the Public Interest](http://spi-inc.org/) +* [NXT Improved Firmware](http://nxt-firmware.ni.fr.eu.org/) +* [The FreedomBox Foundation](http://www.freedomboxfoundation.org/) +* [TenderWarehouse Community](http://community.tenderwarehouse.org/) +* [AntPortal](http://antportal.com/wiki/) - also see our templates and themes on [github](https://github.com/AntPortal/ikiwiked) +* [The Amnesic Incognito Live System](https://tails.boum.org/index.en.html) +* [The Progress Linux OS wiki](http://wiki.progress-linux.org/) Personal sites and blogs ======================== @@ -65,7 +102,6 @@ Personal sites and blogs * [Christian Aichinger's homepage](http://greek0.net/) * Ben A'Lee's [homepage](http://subvert.org.uk/~bma/) and [wiki](http://wiki.subvert.org.uk/). * [Adam Shand's homepage](http://adam.shand.net/iki/) -* [Recai Oktaş's homepage](http://kirkambar.net/) (uses [[rcs/Git]] backend, Turkish language only). * [Hess family wiki](http://kitenet.net/~family/) * [Zack](http://upsilon.cc/~zack)'s homepage, including [his weblog](http://upsilon.cc/~zack/blog/) * [Taquiones: Victor Moral's personal website in Spanish](http://taquiones.net) @@ -77,11 +113,8 @@ Personal sites and blogs * [Tales from the Gryphon](http://www.golden-gryphon.com/blog/manoj/), Manoj Srivastava's free software blog. * [Proper Treatment 正當作法](http://conway.rutgers.edu/~ccshan/wiki/) * [lost scraps](http://web.mornfall.net), pages/blog of Petr Ročkai aka mornfall -* [Ronan Le Hy's blog](http://bayesien.org), in French. -* <http://iainmclaren.com>. * [formorers blog and website](http://www.formorer.de/webwiki/) * [Mark Jaroski's blog](http://movemearound.org/) -* I keep my personal project notes and specs in a private ikiwiki - it's the perfect tool for this task. - [the daniel](http://neoglam.com) * [Schabis blaue Seite](http://schabi.de) - I abuse ikiwiki as blog/cms combo, and will migrate all existing content into ikiwiki eventually. * [Ben Coffey's blog and personal site](http://inelegant.org/). * [blog of LukClaes](http://zomers.be/~luk/blog/). @@ -93,10 +126,8 @@ Personal sites and blogs * [[KarlMW]]'s [homepage](http://mowson.org/karl/), generated with an ikiwiki [asciidoc plugin](http://mowson.org/karl/colophon/). * [Carl Worth's Boring Web Pages](http://www.cworth.org) -* [Charles Mauch](http://xtermin.us)'s website uses ikiwiki. * I keep my personal and project notes in a private ikiwiki - it's the perfect tool for this task. 
- [h01ger](http://layer-acht.org/) * [[NicolasLimare]] ([nil](http://poivron.org/~nil/)+[lab](http://www.ann.jussieu.fr/~limare/)+[id](http://nicolas.limare.net/)+[french translation of the basewiki](http://poivron.org/~nil/ikiwiki-fr/)) -* [Patrick Winnertz (winnie)](https://www.der-winnie.de) * Andrew Sackville-West has setup a [family wiki](http://wiki.swclan.homelinux.org) * [Simon Ward's site](http://bleah.co.uk/) and [blog](http://bleah.co.uk/blog/). * [Paul Wise's homepage and blog](http://bonedaddy.net/pabs3/) @@ -114,7 +145,7 @@ Personal sites and blogs * [[Adam_Trickett|ajt]]'s home intranet/sanbox system ([Internet site & blog](http://www.iredale.net/) -- not ikiwiki yet) * [[Simon_McVittie|smcv]]'s [website](http://www.pseudorandom.co.uk/) and [blog](http://smcv.pseudorandom.co.uk/) -* Svend's [website](http://www.ciffer.net/~svend/) and [blog](http://www.ciffer.net/~svend/blog/) +* Svend's [website](http://ciffer.net/~svend/) and [blog](http://ciffer.net/~svend/blog/) * [muammar's site](http://muammar.me) * [Per Bothner's blog](http://per.bothner.com/blog/) * [Bernd Zeimetz (bzed)](http://bzed.de/) @@ -126,13 +157,32 @@ Personal sites and blogs * [tumashu's page](http://tumashu.github.com) This is my personal site in github created with ikiwiki and only a page,you can get the [source](http://github.com/tumashu/tumashu/tree/master) * [Skirv's Wiki](http://wiki.killfile.org) - formerly Skirv's Homepage * [Jimmy Tang - personal blog and wiki](http://www.sgenomics.org/~jtang) -* [Weakish Jiang's Homepage](http://weakish.pigro.net) - -Please feel free to add your own ikiwiki site! - -See also: [Debian ikiwiki popcon graph](http://qa.debian.org/popcon.php?package=ikiwiki) -and [google search for ikiwiki powered sites](http://www.google.com/search?q=%22powered%20by%20ikiwiki%22). - -While nothing makes me happier than knowing that ikiwiki has happy users, -dropping some change in the [[TipJar]] is a nice way to show extra -appreciation. +* [Nico Schottelius' homepage](http://www.nico.schottelius.org) +* [Andreas Zwinkaus homepage](http://beza1e1.tuxen.de) +* [Salient Dream](http://salient.dre.am) +* [Walden Effect](http://waldeneffect.org) +* [Avian Aqua Miser](http://www.avianaquamiser.com/) +* [Cosmic Cookout](http://www.cosmiccookout.com/) +* [Backyard Deer](http://www.backyarddeer.com/) +* [Alex Ghitza homepage and blog](http://aghitza.org/) +* [Andreas's homepage](http://0x7.ch/) - Ikiwiki, Subversion and CSS template +* [Chris Dombroski's boring bliki](https://www.icanttype.org/) +* [Josh Triplett's homepage](http://joshtriplett.org/) - Git backend with the CGI disabled, to publish a static site with the convenience of ikiwiki. +* [Ertug Karamatli](http://pages.karamatli.com) +* [Jonatan Walck](http://jonatan.walck.i2p/) a weblog + wiki over [I2P](http://i2p2.de/). Also [mirrored](http://jonatan.walck.se/) to the Internet a few times per day. +* [Daniel Wayne Armstrong](http://circuidipity.com/) +* [Mukund](https://mukund.org/) +* [Nicolas Schodet](http://ni.fr.eu.org/) +* [weakish](http://weakish.github.com) +* [Thomas Kane](http://planetkane.org/) +* [Marco Silva](http://marcot.eti.br/) a weblog + wiki using the [darcs](http://darcs.net) backend +* [NeX-6](http://nex-6.taht.net/) ikiwiki blog and wiki running over ipv6 +* [Jason Riedy](http://lovesgoodfood.com/jason/), which may occasionally look funny if I'm playing with my branch... 
+* [pmate](http://pmate.nfshost.com)'s homepage and [blog](http://pmate.nfshost.com/blog/) +* [tychoish.com](http://tychoish.com/) - a blog/wiki mashup. blog posts are "rhizomes." +* [Martin Burmester](http://www.martin-burmester.de/) +* [Øyvind A. Holm (sunny256)](http://www.sunbase.org) — Read my Ikiwiki praise [here](http://www.sunbase.org/blog/why_ikiwiki/). +* [Mirco Bauer (meebey)](http://www.meebey.net/) +* [Richard "RichiH" Hartmann](http://richardhartmann.de/blog) - I thought I had added myself a year ago. Oups :) +* [Jonas Smedegaard](http://dr.jones.dk/) multilingual "classic" website w/ blog +* [Siri Reiter](http://sirireiter.dk/) portfolio website with a blog (in danish) diff --git a/doc/ikiwikiusers/discussion.mdwn b/doc/ikiwikiusers/discussion.mdwn index 39a9bb921..2c211b097 100644 --- a/doc/ikiwikiusers/discussion.mdwn +++ b/doc/ikiwikiusers/discussion.mdwn @@ -33,3 +33,7 @@ Hopefully I will be one of the ikiwiki users one day :) cheers --[[Chao]] ---- Are there automated hosting sites for ikiwiki yet? If you know one, can you add one in a new section on [[ikiwikiusers]] please? If you don't know any and you're willing to pay to set one up (shouldn't be much more expensive than a single ikiwiki IMO), [contact me](http://www.ttllp.co.uk/contact.html) and let's talk. -- MJR + +---- + +People who have interests in getting a webhost for ikiwiki may have a look at [this site](http://www.pigro.net). -- weakish diff --git a/doc/index.mdwn b/doc/index.mdwn index 93526c42c..9398a2e8e 100644 --- a/doc/index.mdwn +++ b/doc/index.mdwn @@ -10,7 +10,8 @@ There are many other [[features]], including support for [[Setup]] has a tutorial for setting up ikiwiki, or you can read the [[man_page|usage]]. There are some [[examples]] of things you can do -with ikiwiki, and some [[tips]]. +with ikiwiki, and some [[tips]]. Basic documentation for ikiwiki plugins +and syntax is provided [[here|ikiwiki]]. All wikis are supposed to have a [[SandBox]], so this one does too. @@ -20,8 +21,9 @@ ikiwiki [[!version ]]. ## developer resources The [[RoadMap]] describes where the project is going. +The [[forum]] is open for discussions. [[Bugs]], [[TODO]] items, [[wishlist]] items, and [[patches|patch]] can be submitted and tracked using this wiki. -ikiwiki is developed by [[Joey]] and many contributors, +Ikiwiki is developed by [[Joey]] and many contributors, and is [[FreeSoftware]]. diff --git a/doc/index/discussion.mdwn b/doc/index/discussion.mdwn index d52cb0a2d..749042910 100644 --- a/doc/index/discussion.mdwn +++ b/doc/index/discussion.mdwn @@ -1,464 +1 @@ -Seems like there should be a page for you to post your thoughts about -ikiwiki, both pro and con, anything that didn't work, ideas, or whatever. -Do so here.. - -Note that for more formal bug reports or todo items, you can also edit the -[[bugs]] and [[todo]] pages. - -[[!toc ]] - -# Installation/Setup questions - -Ikiwiki creates a .ikiwiki directory in my wikiwc working directory. Should I -"svn add .ikiwiki" or add it to svn:ignore? - -> `.ikiwiki` is used by ikiwiki to store internal state. You can add it to -> svn:ignore. --[[Joey]] -> > Thanks a lot. - -Is there an easy way to log via e-mail to some webmaster address, instead -of via syslog? - -> Not sure why you'd want to do that, but couldn't you use a tool like -> logwatch to mail selected lines from the syslog? --[[Joey]] - -> > The reason is that I'm not logged in on the web server regularly to -> > check the log files. I'll see whether I can install a logwatch instance. 
- -I'm trying to install from scratch on a CentOS 4.6 system. I installed perl 5.8.8 from source and then added all the required modules via CPAN. When I build ikiwiki from the tarball, I get this message: - - rendering todo/calendar_--_archive_browsing_via_a_calendar_frontend.mdwn - *** glibc detected *** double free or corruption (!prev): 0x0922e478 *** - make: *** [extra_build] Aborted - -I'm kind of at a loss how to track this down or work around it. Any suggestions? --Monty - -> All I can tell you is that it looks like a problem with your C library or -> perl. Little perl programs like ikiwiki should only be able to trigger -> such bugs, not contain them. :-) Sorry I can't be of more help. -> --[[Joey]] - -> I had a similar problem after upgrading to the latest version of -> Text::Markdown from CPAN. You might try either looking for a Markdown -> package for CentOS or using the latest version of John Gruber's -> Markdown.pl: -> <http://daringfireball.net/projects/downloads/Markdown_1.0.2b8.tbz> -> --[[JasonBlevins]], April 1, 2008 18:22 EDT - ->> Unfortunately I couldn't find a CentOS package for markdown, and I ->> couldn't quite figure out how to use John Gruber's version instead. ->> I tried copying it to site_perl, etc., but the build doesn't pick ->> it up. For now I can just play with it on my Ubuntu laptop for which ->> the debian package installed flawlessly. I'll probably wait for an ->> updated version of Markdown to see if this is fixed in the future. ->> --Monty - ->I suggest that you pull an older version of Text::Markdown from CPAN. I am using <http://backpan.perl.org/authors/id/B/BO/BOBTFISH/Text-Markdown-1.0.5.tar.gz> and that works just fine. ->There is a step change in version and size between this version (dated 11Jan2008) and the next version (1.0.12 dated 18Feb2008). I shall have a little look to see why, in due course. ->Ubuntu Hardy Heron has a debian package now, but that does not work either. -> --Dirk 22Apr2008 - -> This might be related to [Text::Markdown bug #37297](http://rt.cpan.org/Public/Bug/Display.html?id=37297).--ChapmanFlack 9Jul2008 - ----- - -# Installation of selected docs (html) - -The latest release has around 560 files (over 2MB) in html. - -Any suggestions or ideas on limiting what html is installed? - -For example, I don't see value in every ikiwiki install out there to also install personal "users" ikiwiki pages. - -For now I copy ikiwiki.setup. And then use pax with -L switch to copy the targets of the symlinks of the basewiki. - -I was thinking of making a list of desired documents from the html directory to install. - ---JeremyReed - -> You don't need any of them, unless you want to read ikiwiki's docs locally. -> -> I don't understand why you're installing the basewiki files manually; -> ikiwiki has a Makefile that will do this for you. --[[Joey]] - ->> The Makefile's install doesn't do what I want so I use different installer for it. ->> It assumes wrong location for man pages for me. (And it should consider using INSTALLVENDORMAN1DIR and ->> MAN1EXT but I don't know about section 8 since I don't know of perl value for that.) ->> I don't want w3m cgi installed; it is optional for my package. ->> I will just patch for that instead of using my own installer. ->> Note: I am working on the pkgsrc package build specification for this. This is for creating ->> packages for NetBSD, DragonFly and other systems that use pkgsrc package system. ->> --JeremyReed - -# Installation as non-root user - -I'd like to install ikiwiki as a non-root user. 
I can plow through getting all the -perl dependencies installed because that's well documented in the perl world, -but I don't know how to tell ikiwiki to install somewhere other than / --BrianWilson - -> Checkout the tips section for [[tips/DreamHost]]. It should do the trick. --MattReynolds - ----- - -# Upgrade steps - -I upgrades from 1.40 to 2.6.1. I ran "ikiwiki --setup" using my existing ikiwiki.setup configuration. -I had many errors like: - - /home/bsdwiki/www/wiki/wikilink/index.html independently created, not overwriting with version from wikilink - BEGIN failed--compilation aborted at (eval 5) line 129. - -and: - - failed renaming /home/bsdwiki/www/wiki/smileys.ikiwiki-new to /home/bsdwiki/www/wiki/smileys: Is a directory - BEGIN failed--compilation aborted at (eval 5) line 129. - -Probably about six errors like this. I worked around this by removing the files and directories it complained about. -Finally it finished. - -> As of version 2.0, ikiwiki enables usedirs by default. See -> [[tips/switching_to_usedirs]] for details. --[[Joey]] - ->> I read the config wrong. I was thinking that it showed the defaults even though commented out ->> (like ssh configs do). I fixed that part. --JeremyReed - -My next problem was that ikiwiki start letting me edit without any password authentication. It used to prompt -me for a password but now just goes right into the "editing" mode. -The release notes for 2.0 say password auth is still on by default. - -> It sounds like you have the anonok plugin enabled? - ->> Where is the default documented? My config doesn't have it uncommented. - -The third problem is that when editing my textbox is empty -- no content. - -This is using my custom rcs.pm which has been used thousands of times. - -> Have you rebuilt the cgi wrapper since you upgraded ikiwiki? AFAIK I -> fixed a bug that could result in the edit box always being empty back in -> version 2.3. The only other way it could happen is if ikiwiki does not -> have saved state about the page that it's editing (in .ikiwiki/index). - ->> Rebuilt it several times. Now that I think of it, I think my early problem of having ->> no content in the textbox was before I rebuilt the cgi. And after I rebuilt the whole webpage was empty. - -Now I regenerated my ikiwiki.cgi again (no change to my configuration, -and I just get an empty HTML page when attempting editing or "create". - -> If the page is completly empty then ikiwiki is crashing before it can -> output anything, though this seems unlikely. Check the webserver logs. - -Now I see it created directories for my data. I fixed that by setting -usedirs (I see that is in the release notes for 2.0) and rerunning ikiwiki --setup -but I still have empty pages for editing (no textbox no html at all). - -> Is IkiWiki crashing? If so, it would probably leave error text in the apache logs. --[[TaylorKillian]] - ->> Not using apache. Nothing useful in logs other thn the HTTP return codes are "0" and bytes is "-" ->> on the empty ikiwiki.cgi output (should say " 200 " followed by bytes). - ->>> You need to either figure out what your web server does with stderr ->>> from cgi programs, or run ikiwiki.cgi at the command line with an ->>> appropriate environment so it thinks it's being called from a web ->>> server, so you can see how it's failing. --[[Joey]] - -(I am posting this now, but will do some research and post some more.) - -Is there any webpage with upgrade steps? 
- -> Users are expected to read [[news]], which points out any incompatible -> changes or cases where manual action is needed. - ->> I read it but read the usedirs option wrong :(. ->> Also it appears to be missing the news from between 1.40 to 2.0 unless they dont' exist. ->> If they do exist maybe they have release notes I need? - ->>> All the old ones are in the NEWS file. --[[Joey]] - ---JeremyReed - -My followup: I used a new ikiwiki.setup based on the latest version. But no changes for me. - -Also I forgot to mention that do=recentchanges works good for me. It uses my -rcs_recentchanges in my rcs perl module. - -The do=prefs does nothing though -- just a blank webpage. - -> You need to figure out why ikiwiki is crashing. The webserver logs should -> tell you. - -I also set verbose => 1 and running ikiwiki --setup was verbose, but no changes in running CGI. -I was hoping for some output. - -I am guessing that my rcs perl module stopped working on the upgrade. I didn't notice any release notes -on changes to revision control modules. Has something changed? I will also look. - -> No, the rcs interface has not needed to change in a long time. Also, -> nothing is done with the rcs for do=prefs. - ->> Thanks. I also checked differences between 1.40 Rcs plugins and didn't notice anything significant. - ---JeremyReed - -Another Followup: I created a new ikiwiki configuration and did the --setup to -create an entirely different website. I have same problem there. No prompt for password -and empty webpage when using the cgi. -I never upgraded any perl modules so maybe a new perl module is required but I don't see any errors so I don't know. - -The only errors I see when building and installing ikiwiki are: - - Can't exec "otl2html": No such file or directory at IkiWiki/Plugin/otl.pm line 66. - - gettext 0.14 too old, not updating the pot file - -I don't use GNU gettext on here. - -I may need to revert back to my old ikiwiki install which has been used to thousands of times (with around -1000 rcs commits via ikiwiki). - ---JeremyReed - -I downgraded to version 1.40 (that was what I had before I wrote wrong above). -Now ikiwiki is working for me again (but using 1.40). I shouldn't have tested on production system :) - ---JeremyReed - -I am back. On a different system, I installed ikiwiki 2.6.1. Same problem -- blank CGI webpage. - -So I manually ran with: - - REQUEST_METHOD=GET QUERY_STRING='do=create&page=jcr' kiwiki.cgi - -And clearly saw the error: - - [IkiWiki::main] Fatal: Bad template engine CGI::FormBuilder::Template::div: Can't locate CGI/FormBuilder/Template/div.pm - -So I found my version was too old and 3.05 is the first to provide "Div" support. I upgraded my p5-CGI-FormBuilder to 3.0501. -And ikiwiki CGI started working for me. - -The Ikiwiki docs about this requirement got removed in Revision 4367. There should be a page that lists the requirements. -(I guess I could have used the debian/control file.) - -> There is a page, [[install]] documents that 3.05 is needed. - ->> Sorry, I missed that. With hundreds of wikipages it is hard to read all of them. ->> I am updating the download page now to link to it. - -I am now using ikiwiki 2.6.1 on my testing system. - ---JeremyReed - ----- -# Excellent - how do I translate a TWiki site? - -I just discovered ikiwiki quite by chance, I was looking for a console/terminal -menu system and found pdmenu. So pdmenu brought me to here and I've found ikiwiki! -It looks as if it's just what I've been wanting for a long time. 
I wanted something -to create mostly text web pages which, as far as possible, have source which is human -readable or at least in a standard format. ikiwiki does this twice over by using -markdown for the source and producing static HTML from it. - -I'm currently using TWiki and have a fair number of pages in that format, does -anyone have any bright ideas for translating? I can knock up awk scripts fairly -easily, perl is possible (but I'm not strong in perl). - -> Let us know if you come up with something to transition from the other -> format. Another option would be writing a ikiwiki plugin to support the -> TWiki format. --[[Joey]] - -> Jamey Sharp and I have a set of scripts in progress to convert other wikis to ikiwiki, including history, so that we can migrate a few of our wikis. We already have support for migrating MoinMoin wikis to ikiwiki, including conversion of the entire history to Git. We used this to convert the [XCB wiki](http://xcb.freedesktop.org/wiki/) to ikiwiki; until we finalize the conversion and put the new wiki in place of the old one, you can browse the converted result at <http://xcb.freedesktop.org/ikiwiki>. We already plan to add support for TWiki (including history, since you can just run parsecvs on the TWiki RCS files to get Git), so that we can convert the [Portland State Aerospace Society wiki](http://psas.pdx.edu) (currently in Moin, but with much of its history in TWiki, and with many of its pages still in TWiki format using Jamey's TWiki format for MoinMoin). -> -> Our scripts convert by way of HTML, using portions of the source wiki's code to render as HTML (with some additional code to do things like translate MoinMoin's `\[[TableOfContents]]` to ikiwiki's `\[[!toc ]]`), and then using a modified [[!cpan HTML::WikiConverter]] to turn this into markdown and ikiwiki. This produces quite satisfactory results, apart from things that don't have any markdown equivalent and thus remain HTML, such as tables and definition lists. Conversion of the history occurs by first using another script we wrote to translate MoinMoin history to Git, then using our git-map script to map a transformation over the Git history. -> -> We will post the scripts as soon as we have them complete enough to convert our wikis. -> -> -- [[JoshTriplett]] - ->> Thanks for an excellent Xmas present, I will appreciate the additional ->> users this will help switch to ikiwiki! --[[Joey]] - - ->> Sounds great indeed. Learning from [here](http://www.bddebian.com/~wiki/AboutTheTWikiToIkiwikiConversion/) that HTML::WikiConverter needed for your conversion was not up-to-date on Debian I have now done an unofficial package, including your proposed Markdown patches, apt-get'able at <pre>deb http://debian.jones.dk/ sid wikitools</pre> ->> -- [[JonasSmedegaard]] - - ->>I see the "We will post the scripts ...." was committed about a year ago. A current site search for "Moin" does not turn them up. Any chance of an appearance in the near (end of year) future? ->> ->> -- [[MichaelRasmussen]] - ->>> It appears the scripts were never posted? I recently imported my Mediawiki site into Iki. If it helps, my notes are here: <http://iki.u32.net/Mediawiki_Conversion> --[[sabr]] - ->>>>> The scripts have been posted now, see [[joshtriplett]]'s user page, ->>>>> and I've pulled together all ways I can find to [[convert]] other ->>>>> systems into ikiwiki. --[[Joey]] - ----- - -# LaTeX support? - -Moved to [[todo/latex]] --[[Joey]] - ----- - -# Using with CVS? - -Moved to a [[todo_item|todo/CVS_backend]]. 
--[[JoshTriplett]] - ----- - -# Show differences before saving page? - -Moved to the existing [[todo_item|todo/preview_changes]]. --[[JoshTriplett]] - ----- - -# Max submit size? - -Any setting for limiting how many kilobytes can be submitted via the "edit" form? --- [[JeremyReed]] - ->>> See [[todo/fileupload]] for an idea on limiting page size. --[[Joey]] - ----- - -# Editing the style sheet. - -It would be nice to be able to edit the stylesheet by means of the cgi. Or is this possible? I wasn't able to achieve it. -Ok, that's my last 2 cents for a while. --[Mazirian](http://mazirian.com) - -> I don't support editing it, but if/when ikiwiki gets [[todo/fileupload]] support, -> it'll be possible to upload a style sheet. (If .css is in the allowed -> extensions list.. no idea how safe that would be, a style sheet is -> probably a great place to put XSS attacks and evil javascript that would -> be filtered out of any regular page in ikiwiki). --[[Joey]] - ->> I hadn't thought of that at all. It's a common feature and one I've ->> relied on safely, because the wikis I am maintaining at the moment ->> are all private and restricted to trusted users. Given that the whole ->> point of ikiwiki is to be able to access and edit via the shell as ->> well as the web, I suppose the features doesn't add a lot. By the ->> way, the w3m mode is brilliant. I haven't tried it yet, but the idea ->> is great. - ----- - -# Should not create an existing page - -This might be a bug, but will discuss it here first. -Clicking on an old "?" or going to a create link but new Markdown content exists, should not go into "create" mode, but should do a regular "edit". - -> I belive that currently it does a redirect to the new static web page. -> At least that's the intent of the code. --[[Joey]] - ->> Try at your site: `?page=discussion&from=index&do=create` ->> It brings up an empty textarea to start a new webpage -- even though it already exists here. --reed - ->>> Ah, right. Notice that the resulting form allows saving the page as ->>> discussion, or users/discussion, but not index/discussion, since this ->>> page already exists. If all the pages existed, it would do the redirect ->>> thing. --[[Joey]] - ----- - -# Spaces in WikiLinks? - -Hello Joey, - -I've just switched from ikiwiki 2.0 to ikiwiki 2.2 and I'm really surprised -that I can't use the spaces in WikiLinks. Could you please tell me why the spaces -aren't allowed in WikiLinks now? - -My best regards, - ---[[PaweB|ptecza]] - -> See [[bugs/Spaces_in_link_text_for_ikiwiki_links]] - ----- - -# Build in OpenSolaris? - -Moved to [[bugs/build_in_opensolaris]] --[[Joey]] - ----- - -# Various ways to use Subversion with ikiwiki - -I'm playing around with various ways that I can use subversion with ikiwiki. - -* Is it possible to have ikiwiki point to a subversion repository which is on a different server? The basic checkin/checkout functionality seems to work but there doesn't seem to be any way to make the post-commit hook work for a non-local server? - -> This is difficult to do since ikiwiki's post-commit wrapper expects to -> run on a machine that contains both the svn repository and the .ikiwiki -> state directory. However, with recent versions of ikiwiki, you can get -> away without running the post-commit wrapper on commit, and all you lose -> is the ability to send commit notification emails. - -> (And now that [[recentchanges]] includes rss, you can just subscribe to -> that, no need to worry about commit notification emails anymore.) 
- -* Is it possible / sensible to have ikiwiki share a subversion repository with other data (either completely unrelated files or another ikiwiki instance)? This works in part but again the post-commit hook seems problematic. - ---[[AdamShand]] - -> Sure, see ikiwiki's subversion repository for example of non-wiki files -> in the same repo. If you have two wikis in one repository, you will need -> to write a post-commit script that calls the post-commit wrappers for each -> wiki. - ----- - -# Regex for Valid Characters in Filenames - -I'm sure that this is documented somewhere but I've ransacked the wiki and I can't find it. :-( What are the allowed characters in an ikiwiki page name? I'm writing a simple script to make updating my blog easier and need to filter invalid characters (so far I've found that # and , aren't allowed ;-)). Thanks for any pointers. -- [[AdamShand]] - -> The default `wiki_file_regexp` matches filenames containing only -> `[-[:alnum:]_.:/+]` -> -> The titlepage() function will convert freeform text to a valid -> page name. See [[todo/should_use_a_standard_encoding_for_utf_chars_in_filenames]] -> for an example. --[[Joey]] - ->> Perfect, thanks! ->> ->> In the end I decided that I didn't need any special characters in filenames and replaced everything but alphanumeric characters with underscores. In addition to replacing bad characters I also collapse multiple underscores into a single one, and strip off trailing and leading underscores to make tidy filenames. If it's useful to anybody else here's a sed example: ->> ->> # echo "++ Bad: ~@#$%^&*()_=}{[];,? Iki: +_-:./ Num: 65.5 ++" | sed -e 's/[^A-Za-z0-9_]/_/g' -e 's/__*/_/g' -e 's/^_//g' -e 's/_$//g' ->> Bad_Iki_Num_65_5 ->> ->>--[[AdamShand]] - -# Upgrade steps from RecentChanges CGI to static page? - -Where are the upgrade steps for RecentChanges change from CGI to static feed? -I run multiple ikiwiki-powered sites on multiple servers, but today I just upgraded one to 2.32.3. -Please have a look at -<http://bsdwiki.reedmedia.net/wiki/recentchanges.html> -Any suggestions? - -> There are no upgrade steps required. It does look like you need to enable -> the meta plugin to get a good recentchanges page though.. --[[Joey]] - -# News site where articles are submitted and then reviewed before posting? - -I am considering moving a news site to Ikiwiki. I am hoping that Ikiwiki has a feature where anonymous posters can submit a form that moderators can review and then accept for it to be posted on a news webpage (like front page of the website). - -> Well, you can have one blog that contains unreviewed articles, and -> moderators can then add a tag that makes the article show up in the main -> news feed. There's nothing stopping someone submitting an article -> pre-tagged though. If you absolutely need to lock that down, you could -> have one blog with unreviewed articles in one subdirectory, and reviewers -> then move the file over to another subdirectory when they're ready to -> publish it. (This second subdirectory would be locked to prevent others -> from writing to it.) --[[Joey]] - -Also it would be good if the news page would keep maybe just the latest 10 entries with links to an archive that make it easy to browse to old entries by date. (Could have over a thousand news articles.) - -> The inline plugin allows setting up things like this. - -Plus users be able to post feedback to news items. If anonymous, they must be approved first. I'd prefer to not use normal "wiki" editor for feedback. 
- -Any thoughts or examples on this? Any links to examples of news sites or blogs with outside feedback using ikiwiki? - -Thanks --[[JeremyReed]] - +All discussion that used to be here has moved to the [[forum]]. diff --git a/doc/install.mdwn b/doc/install.mdwn index cc3a4c29f..f38ae2aab 100644 --- a/doc/install.mdwn +++ b/doc/install.mdwn @@ -41,5 +41,6 @@ If you're using a shared hosting provider, of the sort where you don't have root, you can still install ikiwiki. There are tutorials covering this for a few providers: + * [[tips/NearlyFreeSpeech]] * [[tips/DreamHost]] diff --git a/doc/install/discussion.mdwn b/doc/install/discussion.mdwn index 02cdb29c9..c06893ec1 100644 --- a/doc/install/discussion.mdwn +++ b/doc/install/discussion.mdwn @@ -269,3 +269,65 @@ Any suggestions? Whew! perl Makefile.PL INSTALL_BASE=$HOME PREFIX= make make install + +--- + +03 September 2010, Report on successful manual install in Debian 5 (Lenny) AMD64: + +note: Maybe much more easy using backports, but using this tools you get a plain user cpan :) + +This where my steps: + +As root (#): + + aptitude install build-essential curl perl + + +As plain user ($), I use to install user perl modules using local::lib + + mkdir -p "$HOME/downloads" + cd "$HOME/downloads/" + wget http://search.cpan.org/CPAN/authors/id/G/GE/GETTY/local-lib-1.006007.tar.gz + wget http://ftp.de.debian.org/debian/pool/main/i/ikiwiki/ikiwiki_3.20100831.tar.gz + tar -zxf local-lib-1.006007.tar.gz + cd local-lib-1.006007/ + perl Makefile.PL --bootstrap=~/.perl5 + make test && make install + echo 'eval $(perl -I$HOME/.perl5/lib/perl5 -Mlocal::lib=$HOME/.perl5)' >>~/.bashrc + . ~/.bashrc + curl -L http://cpanmin.us | perl - App::cpanminus + cpanm CGI::FormBuilder + cpanm CGI::Session + cpanm HTML::Parser + cpanm HTML::Template + cpanm HTML::Scrubber + cpanm Text::Markdown + cpanm URI + cd .. + tar -zxf ikiwiki_3.20100831.tar.gz + cd ikiwiki/ + perl Makefile.PL INSTALL_BASE= PREFIX=/home/$USER/.perl5 + make test # All tests successful. + make install INSTALL_BASE=/home/$USER/.perl5 + . ~/.bashrc + +Using cpan or cpanm with local::lib, you can install any other dependency, as plain user (in your home). XS modules may need -dev packages. + +After all, here it's: + + ikiwiki -version + ikiwiki version 3.20100831 + +It seems like this installation looses the /etc files (we're as plain user), but this can be used as a workaround: + + ikiwiki -setup ~/downloads/ikiwiki/auto.setup + +I've not investigated more the /etc files ussage, but does not seems like a good idea to be as plain user... + + /etc/ikiwiki/wikilist does not exist + ** Failed to add you to the system wikilist file. + ** (Probably ikiwiki-update-wikilist is not SUID root.) + ** Your wiki will not be automatically updated when ikiwiki is upgraded. + + +Iñigo diff --git a/doc/news/code_swarm/discussion.mdwn b/doc/news/code_swarm/discussion.mdwn new file mode 100644 index 000000000..3ecc81b86 --- /dev/null +++ b/doc/news/code_swarm/discussion.mdwn @@ -0,0 +1,3 @@ +Looks like ImageMagick isn't install on the new server! :-) -- AdamShand + +> Thanks for pointing out problem, fixed now. --[[Joey]] diff --git a/doc/news/discussion.mdwn b/doc/news/discussion.mdwn index 351e39c62..d6a548f8b 100644 --- a/doc/news/discussion.mdwn +++ b/doc/news/discussion.mdwn @@ -1,3 +1,9 @@ +## 3.20091017 news item removed? +Hi! 
Why have you [removed](http://git.ikiwiki.info/?p=ikiwiki;a=blobdiff;f=doc/news/version_3.20091017.mdwn;h=0000000000000000000000000000000000000000;hp=aba830a82f881bd97d11fe644eb2c78b99c2258d;hb=9fdd9af2db2bd21e543fa0f5f4bfa85b56b8dd5c;hpb=b74dceb884a60f6f7be395378a009ee414726d0b) the item for +3.20091017? Perhaps, it's an error, isn't it? The corresponding code AFAIU is still there. --Ivan Z. + +> I always remove old news items when making a new release. The info is still there in the changelog if needed. --[[Joey]] + ## Ikiwiki 3.12 Joey, what about news for Ikiwiki 3.12? The changelog says is has been released diff --git a/doc/news/ikiwiki-hosting.mdwn b/doc/news/ikiwiki-hosting.mdwn new file mode 100644 index 000000000..092530a14 --- /dev/null +++ b/doc/news/ikiwiki-hosting.mdwn @@ -0,0 +1,16 @@ +ikiwiki-hosting is an interface on top of Ikiwiki to allow easy management +of lots of ikiwiki sites. I developed it for +[Branchable](http://www.branchable.com/), an Ikiwiki hosting provider. +It has a powerful, scriptable command-line interface, and also +includes special-purpose ikiwiki plugins for things like a user control +panel. + +To get a feel for it, here are some examples: + + ikisite create foo.ikiwiki.net --admin http://joey.kitenet.net/ + ikisite branch foo.ikiwiki.net bar.ikiwiki.net + ikisite backup bar.ikiwiki.net --stdout | ssh otherhost 'ikisite restore bar.ikiwiki.net --stdin' + +ikiwiki-hosting is free software, released under the AGPL. Its website: +<http://ikiwiki-hosting.branchable.com/> +--[[Joey]] diff --git a/doc/news/openid.mdwn b/doc/news/openid.mdwn index 4f1ee7bf7..87f640321 100644 --- a/doc/news/openid.mdwn +++ b/doc/news/openid.mdwn @@ -10,4 +10,4 @@ log back in, try out the OpenID signup process if you don't already have an OpenID, and see how OpenID works for you. And let me know your feelings about making such a switch. --[[Joey]] -[[!poll 64 "Accept only OpenID for logins" 21 "Accept only password logins" 36 "Accept both"]] +[[!poll 67 "Accept only OpenID for logins" 21 "Accept only password logins" 41 "Accept both"]] diff --git a/doc/news/openid/discussion.mdwn b/doc/news/openid/discussion.mdwn index e611fa77b..bc9856ad9 100644 --- a/doc/news/openid/discussion.mdwn +++ b/doc/news/openid/discussion.mdwn @@ -80,3 +80,17 @@ which fails here? Or is something broken in Ikiwiki's implementation? > [[bugs/OpenID_delegation_fails_on_my_server]] --[[Joey]] Yes. I'd only recently set up my server as a delegate under wordpress, so still thought that perhaps the issue was on my end. But I'd since used my delegate successfully elsewhere, so I filed it as a bug against ikiwiki. + +---- +###Pretty Painless +I just tried logging it with OpenID and it Just Worked. Pretty painless. If you want to turn off password authentication on ikiwiki.info, I say go for it. --[[blipvert]] + +> I doubt I will. The new login interface basically makes password login +> and openid cooexist nicely. --[[Joey]] + +###LiveJournal openid +One caveat to the above is that, of course, OpenID is a distributed trust system which means you do have to think about the trust aspect. A case in point is livejournal.com whose OpenID implementation is badly broken in one important respect: If a LiveJournal user deletes his or her journal, and a different user registers a journal with the same name (this is actually quite a common occurrence on LiveJournal), they in effect inherit the previous journal owner's identity. 
LiveJournal does not even have a mechanism in place for a remote site even to detect that a journal has changed hands. It is an extremely dodgy situation which they seem to have *no* intention of fixing, and the bottom line is that the "identity" represented by a *username*.livejournal.com token should not be trusted as to its long-term uniqueness. Just FYI. --[[blipvert]] + +---- + +Submitting bugs in the OpenID components will be difficult if OpenID must be working first... diff --git a/doc/news/server_move_2009.mdwn b/doc/news/server_move_2009.mdwn new file mode 100644 index 000000000..8be5debe1 --- /dev/null +++ b/doc/news/server_move_2009.mdwn @@ -0,0 +1,6 @@ +[[!meta title="server move"]] + +The ikiwiki.info domain has been moved to a new server. If you can see +this, your DNS has already caught up and you are using the new server. +By the way, the new server should be somewhat faster. +--[[Joey]] diff --git a/doc/news/version_3.141/discussion.mdwn b/doc/news/version_3.141/discussion.mdwn deleted file mode 100644 index 1f5f39282..000000000 --- a/doc/news/version_3.141/discussion.mdwn +++ /dev/null @@ -1,16 +0,0 @@ -Version 3.141!? Is it not a mistake? Maybe you meant 3.14.1 or 3.15? ---[[Paweł|users/ptecza]] - -> I suspect the next version will be 3.1415 ;) -- [[Jon]] - ->> And next 3.14159, 3.141592, etc. :) I think that version schema ->> should be patented by Joey ;) --[[Paweł|users/ptecza]] - ->>> That's not exactly new; quoting from <http://www-cs-faculty.stanford.edu/~knuth/abcde.html>: ->>> ->>>> The latest and best TeX is currently version 3.1415926 (and plain.tex is version 3.141592653); METAFONT is currently version 2.718281 (and plain.mf is version 2.71). My last will and testament for TeX and METAFONT is that their version numbers ultimately become $\pi$ and $e$, respectively. At that point they will be completely error-free by definition. ->>> ->>> --[[tschwinge]] - ->>>> Thanks for the info, Thomas! I didn't know about it. Sorry Joey, ->>>> but Don Knuth was faster. What a pity... ;) --[[Paweł|users/ptecza]] diff --git a/doc/news/version_3.14159.mdwn b/doc/news/version_3.14159.mdwn deleted file mode 100644 index 21f91fdb4..000000000 --- a/doc/news/version_3.14159.mdwn +++ /dev/null @@ -1,5 +0,0 @@ -ikiwiki 3.14159 released with [[!toggle text="these changes"]] -[[!toggleable text=""" - * svn: Fix rcs\_rename to properly scope call to dirname. - * img: Pass the align parameter through to the generated img tag. - * Move OpenID pretty-printing from openid plugin to core (smcv)"""]]
\ No newline at end of file diff --git a/doc/news/version_3.141592.mdwn b/doc/news/version_3.141592.mdwn deleted file mode 100644 index 5911e07f9..000000000 --- a/doc/news/version_3.141592.mdwn +++ /dev/null @@ -1,19 +0,0 @@ -ikiwiki 3.141592 released with [[!toggle text="these changes"]] -[[!toggleable text=""" - * Add new hooks: canremove, canrename, rename. (intrigeri) - * rename: Refactor subpage rename handling code into rename hook. (intrigeri) - * po: New plugin, suporting translation of wiki pages using po files. - (intrigeri) - * Add build machinery to build po files to translate the underlay wikis, - * Add further build machinery to generate translated underlays from - the po file, for use by wikis whose primary language is not English. - * Add Danish basewiki translation by Jonas Smedegaard. - * img: Fix adding of dependency from page to the image. - * pagestats: add `among` parameter, which only counts links from specified - pages (smcv) - * pagestats: when making a tag cloud, don't emit links where the tag is - unused (smcv) - * map: Avoid emitting an unclosed ul element if the map is empty. (harishcm) - * inline: Add pagenames parameter that can be used to list a set of - pages to inline, in a specific order, without using a PageSpec. (smcv) - * Add getsource plugin (Will, smcv)"""]]
\ No newline at end of file diff --git a/doc/news/version_3.1415926.mdwn b/doc/news/version_3.1415926.mdwn deleted file mode 100644 index d31812c8e..000000000 --- a/doc/news/version_3.1415926.mdwn +++ /dev/null @@ -1,53 +0,0 @@ -News for ikiwiki 3.1415926: - - In order to fix a performance bug, all wikis need to be rebuilt on - upgrade to this version. If you listed your wiki in - /etc/ikiwiki/wikilist this will be done automatically when the - Debian package is upgraded. Or use ikiwiki-mass-rebuild to force - a rebuild. - -ikiwiki 3.1415926 released with [[!toggle text="these changes"]] -[[!toggleable text=""" - * [ Joey Hess ] - * po: Detect if nowrapi18n can't be passed to po4a, and warn about - the old version, but continue. Closes: #[541205](http://bugs.debian.org/541205) - * inline: Avoid use of my $\_ as it fails with older perls. - Closes: #[541215](http://bugs.debian.org/541215) - * Add discussionpage configuration setting. - * Several optimisations, including speedups to orphans and brokenlinks - calculation. - * meta, img: Fix bugs in dependency code. (smcv) - * Allow building ikiwiki on systems w/o po4a -- - building of the translated underlays will be skipped in this case. - * Add basic styling of po plugin's languages list. - * inline: Display an error if feedpages is specified and fails to match - due to a problem such as created\_before being told to check against - a page that does not exist. - * Remove deprecated ikiwiki/blog and ikiwiki/preprocessordirective - pages from the basewiki. - * Updated French program translation from Philippe Batailler. - Closes: #[542036](http://bugs.debian.org/542036) - * po: Fixed to run rcs\_add ralative to srcdir. - * Italian program translation from Luca Bruno. - * Fix example blog's tags/life to not have a broken PageSpec. - Closes: #[543510](http://bugs.debian.org/543510) - * Optimize the dependencies list. This also fixes a bug - that could cause repeated refreshes of the wiki to grow - increasingly larger dependency lists, and get increasingly - slower. (smcv) - * Rebuild wikis on upgrade to this version to fix bloat caused - by the dependency bug. - * Further optimisation of dependency handling by adding a special - case for simple page dependencies. (smcv) - * htmltidy: Return an error message if tidy fails. Closes: #[543722](http://bugs.debian.org/543722) - * po: Fix name of translated toplevel index page. (intrigeri) - * po: Fix display of links from a translated page to itself (ntrigeri) - * Add Czech basewiki translation from Miroslav Kure. - * po: fix interdiction to create pages of type po (intrigeri) - * po: po: favor the type of linking page's masterpage on page creation - (intrigeri) - * img: Don't generate new verison of image if it is scaled to be - larger in either dimension. - * [ Josh Triplett ] - * teximg: Replace the insufficient blacklist with the built-in security - mechanisms of TeX. ([[!cve CVE-2009-2944]])"""]] diff --git a/doc/news/version_3.14159265.mdwn b/doc/news/version_3.14159265.mdwn deleted file mode 100644 index ed46b09ea..000000000 --- a/doc/news/version_3.14159265.mdwn +++ /dev/null @@ -1,18 +0,0 @@ -ikiwiki 3.14159265 released with [[!toggle text="these changes"]] -[[!toggleable text=""" - * Add complete French basewiki and underlays translation from the Debian - French l10n team, including Philippe Batailler, Alexandre Dupas, and - Steve Petruzzello. - * Expand banned\_users; it can now include PageSpecs, which - allows banning by IP address. 
- * underlay: Also allow configuring additional directories to search - for template files in. - * Fix parsing web commits from ipv6 addresses. - * Add genwrapper hook, that can be used to add code into the C wrapper. - * cvs: Yeah, ikiwiki even supports CVS now. Plugin contributed by - Amitai Schlair. - * Updated Czech translation from Miroslav Kure. Closes: #[546223](http://bugs.debian.org/546223) - * rsync: New plugin that allows pushing the destdir to a remote host - via rsync or similar. Thanks, Amitai Schlair. - * auto.setup, auto-blog.setup: Fix sanitization of entered wikiname. - Closes: #[547378](http://bugs.debian.org/547378)"""]]
\ No newline at end of file diff --git a/doc/news/version_3.20091009.mdwn b/doc/news/version_3.20091009.mdwn deleted file mode 100644 index 9ab1299b9..000000000 --- a/doc/news/version_3.20091009.mdwn +++ /dev/null @@ -1,14 +0,0 @@ -ikiwiki 3.20091009 released with [[!toggle text="these changes"]] -[[!toggleable text=""" - * parentlinks: Add has\_parentlinks template parameter to allow styling - the toplevel index differently etc. - * img: Correct bug in image size calculation code. - * img: Fix dependency code for full size images. - * toggle, relativedate: Support templates that add attributes - to the body tag. - * Support RPC::XML 0.69's incompatible object instantiation method. - * mirrorlist: Display nothing if list is empty. - * Fix a bug that could lead to duplicate links being recorded - for tags. - * Optimize away most expensive file prune calls, when refreshing, - by only checking new files."""]]
\ No newline at end of file diff --git a/doc/news/version_3.20110124.mdwn b/doc/news/version_3.20110124.mdwn new file mode 100644 index 000000000..0d9d787fe --- /dev/null +++ b/doc/news/version_3.20110124.mdwn @@ -0,0 +1,7 @@ +ikiwiki 3.20110124 released with [[!toggle text="these changes"]] +[[!toggleable text=""" + * comments: Fix commenting, broken by security fix. + * blogspam: Don't check modifications from admins for spam, and also + allow the blogspam\_pagespec to do other matches against who the user is. + * inline: Fix regression in feed titles. Closes: #[610878](http://bugs.debian.org/610878) + (Thanks, Paul Wise)"""]]
\ No newline at end of file diff --git a/doc/news/version_3.20110225.mdwn b/doc/news/version_3.20110225.mdwn new file mode 100644 index 000000000..8e38bd7b8 --- /dev/null +++ b/doc/news/version_3.20110225.mdwn @@ -0,0 +1,23 @@ +ikiwiki 3.20110225 released with [[!toggle text="these changes"]] +[[!toggleable text=""" + * editpage: Avoid inheriting internal page types. + * htmltidy: Avoid breaking the sidebar when websetup is running. + * transient: New utility plugin that allows transient pages to + be stored in .ikiwiki/transient/ (smcv) + * aggregate: Aggregated content is stored in the transient underlay. + (Existing aggregated content is not moved, since it will eventually + expire and be removed) (smcv) + * autoindex, tag: Added autoindex\_commit and tag\_autocreate\_commit that + can be unset to make index files and tags respectively not be committed, + and instead be stored in the transient underlay. + Closes: #[544322](http://bugs.debian.org/544322) (smcv) + * autoindex: Adapted to use add\_autofile. Slight behavior changes + in edge cases that are probably really bug fixes. (smcv) + * recentchanges: Use transient underlay (smcv) + * map: Avoid unnecessary ul's in maps with nested directories. + (Giuseppe Bilotta) + * Fix broken baseurl in cgi mode when usedirs is disabled. Bug introduced + in 3.20101231. + * inline: Fix link to nested inlined pages's feeds. (Giuseppe Bilotta) + * inline: Add 'id' parameter that can be used when styling individual + feedlinks and postforms. (Giuseppe Bilotta)"""]]
\ No newline at end of file diff --git a/doc/news/version_3.20110321.mdwn b/doc/news/version_3.20110321.mdwn new file mode 100644 index 000000000..3282356bc --- /dev/null +++ b/doc/news/version_3.20110321.mdwn @@ -0,0 +1,11 @@ +ikiwiki 3.20110321 released with [[!toggle text="these changes"]] +[[!toggleable text=""" + * comment: Don't show comments of subpages on parent pages. + (Fixes bug introduced in version 3.20100505.) + * darcs: Fix multiple issues preventing rcs\_diff from working. + * aggregate: Read cookies from ~/.ikiwiki/cookies by default. + Also, the cookiejar configuration setting can be used by + other plugins to provide a custom `cookie\_jar` object for LWP::UserAgent. + (Thanks, schmonz) + * Avoid escaping / characters in filenames when building the cgiurl, + as this confuses eg, cvsweb."""]]
\ No newline at end of file diff --git a/doc/news/version_3.20110328.mdwn b/doc/news/version_3.20110328.mdwn new file mode 100644 index 000000000..db19e35c0 --- /dev/null +++ b/doc/news/version_3.20110328.mdwn @@ -0,0 +1,10 @@ +ikiwiki 3.20110328 released with [[!toggle text="these changes"]] +[[!toggleable text=""" + * Yaml formatted setup files are now produced by default. + (Perl formatted setup files can still be used.) + * Add timezone setting in setup file. This allows the time zone to be configured + via the web. + * comment: Better fix to avoid showing comments of subpages, while + not breaking manual inlining of comments. + * meta: Security fix; don't allow alternative stylesheets to be added + on pages where the htmlscrubber is enabled."""]]
\ No newline at end of file diff --git a/doc/news/version_3.20110430.mdwn b/doc/news/version_3.20110430.mdwn new file mode 100644 index 000000000..ac2c815b4 --- /dev/null +++ b/doc/news/version_3.20110430.mdwn @@ -0,0 +1,19 @@ +ikiwiki 3.20110430 released with [[!toggle text="these changes"]] +[[!toggleable text=""" + * meta: Allow adding javascript to pages. Only when htmlscrubber is + disabled, naturally. (Thanks, Giuseppe Bilotta) Closes: #[623154](http://bugs.debian.org/623154) + * comments: Add avatar picture of comment author, using Libravatar::URL + when available. The avatar is looked up based on the user's openid, + or email address. (Thanks, Francois Marier) + * Recommend libgravatar-url-perl, which contains Libravatar::URL. + * monotone: Implement rcs\_getmtime, and work around a problem with monotone + 0.48 that affects rcs\_getctime. (Thanks, Richard Levitte) + * meta: Fix bug in loading of HTML::Entities that can break inline + archive=yes (mostly masked by other plugins that load the module). + * Be quiet about updating wrappers, except in verbose mode. (jmtd) + * meta: Add FOAF support. Closes: #[623156](http://bugs.debian.org/623156) (Jonas Smedegaard) + * Promote Crypt::SSLeay to Recommends; needed for https openid auth. + * tag: Avoid autocreating multiple tag pages that vary only in + capitalization. The first capitalization seen of a tag will be used + for the tag page. + * Fix yaml build dep. Closes: #[624712](http://bugs.debian.org/624712)"""]]
\ No newline at end of file diff --git a/doc/patch.mdwn b/doc/patch.mdwn index b570d995c..7d0f9847c 100644 --- a/doc/patch.mdwn +++ b/doc/patch.mdwn @@ -5,7 +5,8 @@ If you post a patch to the [[todo]] or [[bugs]] list, or elsewhere, once it's ready to be applied, add a 'patch' tag so it will show up here. If your patch is non-trivial and might need several iterations to get -right, please consider publishing a [[git]] branch. +right, or you'd just like to make it easy for [[Joey]] to apply it, +please consider publishing a [[git]] [[branch|branches]]. [[!inline pages="(todo/* or bugs/*) and link(patch) and !link(bugs/done) and !link(todo/done) and !*/Discussion" rootpage="todo" archive="yes"]] diff --git a/doc/peteg.mdwn b/doc/peteg.mdwn new file mode 100644 index 000000000..4e2face0e --- /dev/null +++ b/doc/peteg.mdwn @@ -0,0 +1,7 @@ +I'm adding some plugins to Ikiwiki to support a bioacoustic wiki. See here: + +<http://bioacoustics.cse.unsw.edu.au/wiki/> + +Personal home page: + +<http://peteg.org/> diff --git a/doc/plugins.mdwn b/doc/plugins.mdwn index 697b4a219..0bea33592 100644 --- a/doc/plugins.mdwn +++ b/doc/plugins.mdwn @@ -14,4 +14,4 @@ will fit most uses of ikiwiki. ## Plugin directory [[!map pages="plugins/* and !plugins/type/* and !plugins/write and -!plugins/write/* and !plugins/contrib and !plugins/install and !*/Discussion"]] +!plugins/write/* and !plugins/contrib and !plugins/contrib/*/* and !plugins/install and !*/Discussion"]] diff --git a/doc/plugins/404.mdwn b/doc/plugins/404.mdwn index ad332ee04..bf033202a 100644 --- a/doc/plugins/404.mdwn +++ b/doc/plugins/404.mdwn @@ -1,5 +1,5 @@ [[!template id=plugin name=404 author="[[Simon_McVittie|smcv]]"]] -[[!tag type/useful]] +[[!tag type/web]] This plugin lets you use the IkiWiki CGI script as an Apache 404 handler, to give the behaviour of various other wiki engines where visiting a @@ -13,8 +13,4 @@ file: (The path here needs to be whatever the path is to the ikiwiki.cgi from the root of your web server.) -Or put something like this in the wiki's Lighttpd (>=1.4.17) configuration file: - - server.error-handler-404 = "/ikiwiki.cgi" - diff --git a/doc/plugins/404/discussion.mdwn b/doc/plugins/404/discussion.mdwn new file mode 100644 index 000000000..5a8e8ed85 --- /dev/null +++ b/doc/plugins/404/discussion.mdwn @@ -0,0 +1,3 @@ +With Apache, if you have a page foo/bar/baz but no foo/bar, and if you've +disabled `Indexes` option, you'll end up with a `403` response for foo/bar. +The 404 plugin doesn't try to handle that. But it should. -- [[Jogo]] diff --git a/doc/plugins/aggregate.mdwn b/doc/plugins/aggregate.mdwn index e2efcd83f..75123d923 100644 --- a/doc/plugins/aggregate.mdwn +++ b/doc/plugins/aggregate.mdwn @@ -1,13 +1,17 @@ [[!template id=plugin name=aggregate author="[[Joey]]"]] -[[!tag type/useful]] +[[!tag type/special-purpose]] This plugin allows content from other feeds to be aggregated into the wiki. To specify feeds to aggregate, use the [[ikiwiki/directive/aggregate]] [[ikiwiki/directive]]. -The [[meta]] and [[tag]] plugins are also recommended. Either the -[[htmltidy]] or [[htmlbalance]] plugin is suggested, since feeds can easily -contain html problems, some of which these plugins can fix. +## requirements + +The [[meta]] and [[tag]] plugins are also recommended to be used with this +one. Either the [[htmltidy]] or [[htmlbalance]] plugin is suggested, since +feeds can easily contain html problems, some of which these plugins can fix. 
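+
+As a rough sketch only (the exact plugin list depends on your wiki), this
+plugin and the plugins recommended above could be enabled together in a
+Perl-format setup file with something like:
+
+    add_plugins => [qw{aggregate meta tag htmltidy}],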
+ +## triggering aggregation You will need to run ikiwiki periodically from a cron job, passing it the --aggregate parameter, to make it check for new posts. Here's an example @@ -15,6 +19,11 @@ crontab entry: */15 * * * * ikiwiki --setup my.wiki --aggregate --refresh +The plugin updates a file `.ikiwiki/aggregatetime` with the unix time stamp +when the next aggregation run could occur. (The file may be empty, if no +aggregation is required.) This can be integrated into more complex cron +jobs or systems to trigger aggregation only when needed. + Alternatively, you can allow `ikiwiki.cgi` to trigger the aggregation. You should only need this if for some reason you cannot use cron, and instead want to use a service such as [WebCron](http://webcron.org). To enable @@ -39,3 +48,10 @@ plugin as well as aggregate itself, since feed entries will be stored as HTML, and as first-class wiki pages -- each one generates a separate HTML page in the output, and they can even be edited. This option is provided only for backwards compatability. + +## cookies + +The `cookiejar` option can be used to configure how [[!cpan LWP::UserAgent]] +handles cookies. The default is to read them from a file +`~/.ikiwiki/cookies`, which can be populated using standard perl cookie +tools like [[!cpan HTTP::Cookies]]. diff --git a/doc/plugins/aggregate/discussion.mdwn b/doc/plugins/aggregate/discussion.mdwn index 1a9844577..028775ec8 100644 --- a/doc/plugins/aggregate/discussion.mdwn +++ b/doc/plugins/aggregate/discussion.mdwn @@ -89,3 +89,49 @@ New bug: new posts aren't getting displayed (or cached for aggregation). After f >>> mind having a copy to investigate. --[[Joey]] >>>> Didn't think of that, will keep a copy if there's a next time. -- [[schmonz]] + +----- + +In a corporate environment where feeds are generally behind +authentication, I need to prime the aggregator's `LWP::UserAgent` +with some cookies. What I've done is write a custom plugin to populate +`$config{cookies}` with an `HTTP::Cookies` object, plus this diff: + + --- /var/tmp/pkg/lib/perl5/vendor_perl/5.10.0/IkiWiki/Plugin/aggregate.pm 2010-06-24 13:03:33.000000000 -0400 + +++ aggregate.pm 2010-06-24 13:04:09.000000000 -0400 + @@ -488,7 +488,11 @@ + } + $feed->{feedurl}=pop @urls; + } + - my $res=URI::Fetch->fetch($feed->{feedurl}); + + my $res=URI::Fetch->fetch($feed->{feedurl}, + + UserAgent => LWP::UserAgent->new( + + cookie_jar => $config{cookies}, + + ), + + ); + if (! $res) { + $feed->{message}=URI::Fetch->errstr; + $feed->{error}=1; + +It works, but I have to remember to apply the diff whenever I update +ikiwiki. Can you provide a more elegant means of allowing cookies and/or +the user agent to be programmatically manipulated? --[[schmonz]] + +> Ping -- is the above patch perhaps acceptable (or near-acceptable)? -- [[schmonz]] + +>> Pong.. I'd be happier with a more 100% solution that let cookies be used +>> w/o needing to write a custom plugin to do it. --[[Joey]] + +>>> According to LWP::UserAgent, for the common case, a complete +>>> and valid configuration for `$config{cookies}` would be `{ file => +>>> "$ENV{HOME}/.cookies.txt" }`. In the more common case of not needing +>>> to prime one's cookies, `cookie_jar` can be `undef` (that's the +>>> default). In my less common case, the cookies are generated by +>>> visiting a couple magic URLs, which would be trivial to turn into +>>> config options, except that these particular URLs rely on SPNEGO +>>> and so LWP::Authen::Negotiate has to be loaded. 
So I think adding +>>> `$config{cookies}` (and using it in the aggregate plugin) should +>>> be safe, might help people in typical cases, and won't prevent +>>> further enhancements for less typical cases. --[[schmonz]] + +>>>> Ok, done. Called it cookiejar. --[[Joey]] diff --git a/doc/plugins/amazon_s3.mdwn b/doc/plugins/amazon_s3.mdwn index 331dc4acf..7fe60cb8d 100644 --- a/doc/plugins/amazon_s3.mdwn +++ b/doc/plugins/amazon_s3.mdwn @@ -22,9 +22,10 @@ This plugin uses the following settings in the setup file: set it to "foo", then the url will be "http://foo.s3.amazonaws.com/wiki/". * `amazon_s3_prefix` - A prefix to prepend to each page name. - The default is "wiki/". Note that due to S3 limitations (lack of support - for uploading a root key), it is not possible to set the prefix to an - empty string. + The default is "wiki/". Note: In order to host your site at the root, + it needs to be set to "", and you'll have to + [read this](http://aws.typepad.com/aws/2011/02/host-your-static-website-on-amazon-s3.html) + for details about configuring your S3 bucket as a website. * `amazon_s3_location` - Optionally, this can be set to control which datacenter to use. For example, set it to "EU" to for Europe. * `amazon_s3_dupindex` - Normally, when `usedirs` is enabled, @@ -33,7 +34,8 @@ This plugin uses the following settings in the setup file: "index.html" in their names to work, you can enable this option. Then each index.html file will be stored in S3 *twice*, under both names. This will use more disk and bandwidth, and is not recommended unless you really - need it for some reason. + need it for some reason. These days, it's probably better to configure + your S3 bucket as a website. Note that you should still set `destdir` in the setup file. The files that are uploaded to Amazon S3 will still be written to the destdir, too. diff --git a/doc/plugins/autoindex.mdwn b/doc/plugins/autoindex.mdwn index 03e2d12f3..e1cfe1157 100644 --- a/doc/plugins/autoindex.mdwn +++ b/doc/plugins/autoindex.mdwn @@ -1,7 +1,10 @@ [[!template id=plugin name=autoindex core=0 author="[[Joey]]"]] -[[!tag type/useful]] +[[!tag type/special-purpose]] This plugin searches for [[SubPages|ikiwiki/subpage]] with a missing parent page, and generates the parent pages. The generated page content is -controlled by the `autoindex.tmpl` [[template|wikitemplates]], which by +controlled by the `autoindex.tmpl` [[template|templates]], which by default, uses a [[map]] to list the SubPages. + +The `autoindex_commit` setting is enabled by default, and causes +pages generated by autoindex to be checked into version control. diff --git a/doc/plugins/autoindex/discussion.mdwn b/doc/plugins/autoindex/discussion.mdwn index 2d6b6f1f0..76d09cd3c 100644 --- a/doc/plugins/autoindex/discussion.mdwn +++ b/doc/plugins/autoindex/discussion.mdwn @@ -1,6 +1,13 @@ Would it be possible to add an option to only generate the index files for the html output and not place the markdown files in the wiki source? +> Or better still, add a mechanism for ikiwiki to hold transient source +> pages in memory and render them as if they existed, without actually +> writing them out, as [[JoeRayhawk]] suggests below? I think +> add_autofile would be the way to do this. +> I've added this to [[todo]] as [[todo/autoindex should use add__95__autofile]] +> and [[todo/transient_pages]]. --[[smcv]] + The reason being that I have a lot of directories which need to be autoindexed, but I would prefer if the index files didn't clutter up my git repository. 
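+
+> (A sketch for reference, assuming an ikiwiki recent enough to have the
+> `autoindex_commit` setting described in [[plugins/autoindex]]: turning it
+> off in a Perl-format setup file is just
+>
+>     autoindex_commit => 0,
+>
+> and the generated index pages are then kept out of the repository and
+> stored in the transient underlay instead.)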
@@ -15,6 +22,8 @@ If you just don't want to clutter your git repo, below it's a patch does the fol * If you set autoindex_commit to 1 (this is the default), auto-generated index files will be put in the repo provided you enabled rcs backend. +[[!toggle id="patch-for-autoindex_commit" text="patch for autoindex_commit"]] +[[!toggleable id="patch-for-autoindex_commit" text=""" <pre> --- autoindex.pm.orig 2009-10-01 17:13:51.000000000 +0800 +++ autoindex.pm 2009-10-01 17:21:09.000000000 +0800 @@ -58,6 +67,18 @@ If you just don't want to clutter your git repo, below it's a patch does the fol gettext("automatic index generation"), undef, undef); </pre> - +"""]] Warning: I guess this patch may work, but I *haven't tested it yet*. -- [[weakish]] + +------ + +`autoindex_commit => 0` would be nice, but uncommited files are definitely not. +<pre> +remote: From /srv/git/test3 +remote: 3047077..1df636c master -> origin/master +remote: error: Untracked working tree file 'test.mdwn' would be overwritten by merge. Aborting +remote: 'git pull --prune origin' failed: at /usr/share/perl5/IkiWiki/Plugin/git.pm line 201. +</pre> + +It'd be nice if we were able to notice directories with no associated compilable markup files and compile a simple map directive straight to HTML without any intermediate markup file being involved at all. -- JoeRayhawk diff --git a/doc/plugins/blogspam.mdwn b/doc/plugins/blogspam.mdwn index a13b6e8f4..c158316d4 100644 --- a/doc/plugins/blogspam.mdwn +++ b/doc/plugins/blogspam.mdwn @@ -23,7 +23,7 @@ you can check whether the interaction with blogspam.net works. The `blogspam_pagespec` setting is a [[ikiwiki/PageSpec]] that can be used to configure which pages are checked for spam. The default is to check all edits. If you only want to check [[comments]] (not wiki page edits), -set it to "postcomment(*)". +set it to "postcomment(*)". Posts by admins are never checked for spam. By default, the blogspam.net server is used to do the spam checking. To change this, the `blogspam_server` option can be set to the url for a diff --git a/doc/plugins/calendar.mdwn b/doc/plugins/calendar.mdwn index bc1bc6c71..76e718a3b 100644 --- a/doc/plugins/calendar.mdwn +++ b/doc/plugins/calendar.mdwn @@ -1,5 +1,5 @@ [[!template id=plugin name=calendar author="[[ManojSrivastava]]"]] -[[!tag type/chrome]] +[[!tag type/widget]] This plugin provides a [[ikiwiki/directive/calendar]] [[ikiwiki/directive]]. The directive displays a calendar, similar to the typical calendars shown on @@ -14,6 +14,7 @@ customization. * `month-calendar` - The month calendar as a whole. * `month-calendar-head` - The head of the month calendar (ie,"March"). +* `month-calendar-arrow` - Arrow pointing to previous/next month. * `month-calendar-day-head` - A column head in the month calendar (ie, a day-of-week abbreviation). * `month-calendar-day-noday`, `month-calendar-day-link`, @@ -27,6 +28,7 @@ customization. weekends. * `year-calendar` - The year calendar as a whole. * `year-calendar-head` - The head of the year calendar (ie, "2007"). +* `year-calendar-arrow` - Arrow pointing to previous/next year. * `year-calendar-subhead` - For example, "Months". 
* `year-calendar-month-link`, `year-calendar-month-nolink`, `year-calendar-month-future`, `year-calendar-this-month` - The month diff --git a/doc/plugins/calendar/discussion.mdwn b/doc/plugins/calendar/discussion.mdwn index 9d57b7a1e..6fc21e8ee 100644 --- a/doc/plugins/calendar/discussion.mdwn +++ b/doc/plugins/calendar/discussion.mdwn @@ -1,6 +1,23 @@ It would be nice if the "month" type calendar could collect all of the matching pages on a given date in some inline type way. --[[DavidBremner]] +> I agree, but I have not come up with good html to display them. Seems +> it might need some sort of popup. + Is it possible to get the calendar to link to pages based not on their timestamp (as I understand that it does now, or have I misunderstood this?) and instead on for example their location in a directory hierarchy. That way the calendar could be used as a planning / timeline device which I think would be great. --[[Alexander]] -I would like the ability to specify relative previous months. This way I could have a sidebar with the last three months by specifying no month, then 'month="-1"' and 'month="-2"'. Negative numbers for the month would otherwise be invalid, so this shouldn't produce any conflicts with expected behavior. (Right?) -- [[StevenBlack]] +I would like the ability to specify relative previous months. This way I +could have a sidebar with the last three months by specifying no month, +then 'month="-1"' and 'month="-2"'. Negative numbers for the month would +otherwise be invalid, so this shouldn't produce any conflicts with expected +behavior. (Right?) -- [[StevenBlack]] + +> Great idea! Just implemented that and also relative years. --[[Joey]] + +Anyone know of a way to generate a link to the previous and next calendar pages for archive browsing? In the worst case, that requires regenerating pages on either side of the current one when something is inserted in the history, and I can't quite figure that much out. --[[JasonRiedy]] + +> Well, the calendar directive puts such links on the calendars. They're +> the arrows to either side of the month or year at the top. --[[Joey]] + +>> Thanks. I either missed them or they appeared on an upgrade. I might make them a bit more obvious with +>> "Previous Month" / "Next Month" links above and below the text. Someday.--[[JasonRiedy]] diff --git a/doc/plugins/color.mdwn b/doc/plugins/color.mdwn index dbb8b870c..d639bf563 100644 --- a/doc/plugins/color.mdwn +++ b/doc/plugins/color.mdwn @@ -1,5 +1,5 @@ [[!template id=plugin name=color core=0 author="[[ptecza]]"]] -[[!tag type/chrome]] +[[!tag type/widget]] This plugin provides a [[ikiwiki/directive/color]] [[ikiwiki/directive]]. The directive can be used to color a piece of text on a page. diff --git a/doc/plugins/comments.mdwn b/doc/plugins/comments.mdwn index 7e2232411..48b6c6ae7 100644 --- a/doc/plugins/comments.mdwn +++ b/doc/plugins/comments.mdwn @@ -1,5 +1,5 @@ [[!template id=plugin name=comments author="[[Simon_McVittie|smcv]]"]] -[[!tag type/useful]] +[[!tag type/web]] This plugin adds "blog-style" comments. Unlike the wiki-style freeform Discussion pages, these comments are posted by a simple form, cannot later @@ -14,8 +14,8 @@ authorship should hopefully be unforgeable by CGI users. The intention is that on a non-wiki site (like a blog) you can lock all pages for admin-only access, then allow otherwise unprivileged (or perhaps even anonymous) users to comment on posts. 
See the documentation of the -[[lockedit]] and [[anonok]] pages for details on locking down a wiki so -users can only post comments. +[[opendiscussion]], [[lockedit]] and [[anonok]] pages for details on locking +down a wiki so readers can only post comments. Individual comments are stored as internal-use pages named something like `page/comment_1`, `page/comment_2`, etc. These pages internally use a @@ -45,8 +45,11 @@ There are some global options for the setup file: ## comment moderation If you enable the [[blogspam]] plugin, comments that appear spammy will be -held for moderation. Wiki admins can access the comment moderation queue +held for moderation. (Or with the [[moderatedcomments]] plugin, all +comments will be held.) Wiki admins can access the comment moderation queue via a button on their Preferences page. -The comments are stored in `.ikiwiki/comments_pending/`, and can be -deleted, or moved into the wiki's srcdir to be posted. +Comments pending moderation are not checked into revision control. +To find unmoderated comments, `find /your/ikiwiki/srcdir -name '*._comment_pending'` +To manually moderate a comment, just rename the file, removing the +"_pending" from the end, and check it into revision control. diff --git a/doc/plugins/comments/discussion.mdwn b/doc/plugins/comments/discussion.mdwn index 396d1f6d4..3043b0106 100644 --- a/doc/plugins/comments/discussion.mdwn +++ b/doc/plugins/comments/discussion.mdwn @@ -1,3 +1,18 @@ +## Syndication autodiscovery for comment feeds + +A standard `\[[!inline]]` directive adds links to the autogenerated syndication feeds using link tags in the header: + + <link rel="alternate" type="application/rss+xml" title="$title" href="$page.atom" /> + <link rel="alternate" type="application/atom+xml" title="$title" href="$page.atom" /> + +These links aren't added to my pages that include comments even though comments generate syndication feeds. How can I configure the comments plugin to add these links to the header? (These links are required for user-agent autodiscovery of syndication feeds.) --[[anderbubble]] + +## Moderating comments from the CLI + +How do you do this, without using the UI in the Preferences? + +Please put this info on the page. Many thanks --[[Kai Hendry]] + ## Why internal pages? (unresolved) Comments are saved as internal pages, so they can never be edited through the CGI, diff --git a/doc/plugins/conditional.mdwn b/doc/plugins/conditional.mdwn index 95ffb2764..27a99bb7c 100644 --- a/doc/plugins/conditional.mdwn +++ b/doc/plugins/conditional.mdwn @@ -1,5 +1,5 @@ [[!template id=plugin name=conditional core=1 author="[[Joey]]"]] -[[!tag type/format]] +[[!tag type/special-purpose]] This plugin provides the [[ikiwiki/directive/if]] [[ikiwiki/directive]]. With this directive, you can make text be conditionally displayed on a page. diff --git a/doc/plugins/conditional/discussion.mdwn b/doc/plugins/conditional/discussion.mdwn index 629d05940..6e84fdfc1 100644 --- a/doc/plugins/conditional/discussion.mdwn +++ b/doc/plugins/conditional/discussion.mdwn @@ -1,3 +1,28 @@ +## Conditional broken? + +Using \[\[!if test="tagged(plugin)" then="= Tagged as plugin =" else="*No plugins found*"]] on this wiki *should* present the 'Tagged as plugin' heading, instead it emits 'no plugins found'. Is the conditional plugin currently broken for tags or am I misusing it? Thanks. 
+ +-- Thiana + +> This wiki has no page named "plugin", so nothing links to it; tags are a species of link +> so tagging a large number of pages with a tag that doesn't exist (which change has +> been reverted) doesn't make the pagespec match. It would if the tag's page existed. --[[Joey]] + +>> So if I understand this correctly... Assuming the tags Tag_A and Tag_B, the existence of +>> @wiki-home@/tags/Tag_A.creole, and a number of files with a \[\[!tag Tag_A Tag_B]] the +>> following is correct? +>> +>> * \[\[!if test="tagged(Tag_A)" then="OK" else="Fail"]] => OK +>> * \[\[!if test="tagged(Tag_B)" then="OK" else="Fail"]] => Fail +>> * \[\[!if test="tagged(Tag_A) and tagged(Tag_B)" then="OK" else="Fail"]] => Fail +>> +>> Is that the expected behaviour? If so, that's not what I'm seeing here since they all result +>> in a Fail. If not, what exactly is wrong with those conditionals? Thanks. +>> +>> -- Thiana + +---- + Would there be a way for this plugin to emit fewer blank lines (i.e. *none at all*)? For example, having a look at [this page](http://www.bddebian.com/~wiki/Hurd/)'s sidebar. diff --git a/doc/plugins/contrib.mdwn b/doc/plugins/contrib.mdwn index ac6c1b751..d8199a756 100644 --- a/doc/plugins/contrib.mdwn +++ b/doc/plugins/contrib.mdwn @@ -1,4 +1,4 @@ These plugins are provided by third parties and are not currently included in ikiwiki. See [[install]] for installation help. -[[!map pages="plugins/contrib/* and !*/Discussion"]] +[[!map pages="plugins/contrib/* and !plugins/contrib/*/* and !*/Discussion"]] diff --git a/doc/plugins/contrib/album.mdwn b/doc/plugins/contrib/album.mdwn index 395c99bce..daf16fd3c 100644 --- a/doc/plugins/contrib/album.mdwn +++ b/doc/plugins/contrib/album.mdwn @@ -9,9 +9,11 @@ thoughts about this plugin). This plugin formats a collection of images into a photo album, in the same way as many websites: good examples include the PHP application [Gallery](http://gallery.menalto.com/), Flickr, -and Facebook's Photos "application". I've called it `album` -to distinguish it from [[contrib/gallery|plugins/contrib/gallery]], -although `gallery` might well be a better name for this functionality. +and Facebook's Photos "application". + +I've called it `album` to distinguish it from +[[contrib/gallery|plugins/contrib/gallery]], although `gallery` might well be +a better name for this functionality. The web UI I'm trying to achieve consists of one [HTML page of thumbnails](http://www.pseudorandom.co.uk/2008/2008-03-08-panic-cell-gig/) @@ -26,83 +28,129 @@ individual photos can't be bookmarked in a meaningful way, and the best it can do as a fallback for non-Javascript browsers is to provide a direct link to the image.) -## Writing the viewers +<h2 id="album"><code>album</code> directive</h2> + +Each page containing an `album` directive is treated as a photo album. + +Every image attached to an album or its subpages is considered to be part of +the album. A "viewer" page, with the wiki's default page extension, will be +generated in the [[transient underlay|todo/transient_pages]] to display the +image, if there isn't already a page of the same name as the image: for +instance, if `debconf` is an album and `debconf/tuesday/p100.jpg` exists, +then `debconf/tuesday/p100.mdwn` might be created. + +There's currently a hard-coded list of extensions that are treated as images: +`png`, `gif`, `jpg`, `jpeg` or `mov` files. More image and video types could +be added in future. 
Videos aren't currently handled very well; +ideally, something like totem-video-thumbnailer would be used. + +The `album` directive also produces an [[ikiwiki/directive/inline]] which +automatically includes all the viewers for this album, except those that +will appear in an <a href="#albumsection">albumsection</a> (if every image +is in a section, then the `album` directive won't have any visible effect). - \[[!albumimage image=foo.jpg album=myalbum - title=... - caption=... - copyright=... - size=... - viewertemplate=... - ]] +The `inline` is in `archive` and `quick` mode, but can include some +extra information about the images, including file size and a thumbnail made +using [[ikiwiki/directive/img]]). The default template is `albumitem.tmpl`, +which takes advantage of these things. -Each viewer contains one `\[[!albumimage]]` directive. This -sets the `image` filename, the `album` in which this image appears, -and an optional `caption`, and can override the `size` at which to -display the image and the `viewertemplate` to use to display the -image. +<h2 id="albumsection"><code>albumsection</code> directive</h2> -It can also have `title`, `copyright` and `date` parameters, which -are short-cuts for [[ikiwiki/directive/meta]] directives. +The `albumsection` directive is used to split an album into sections. It can +only appear on a page that also has the <a href="#album">album</a> directive. + +The `filter` parameter is a [[ikiwiki/PageSpec]] against which viewer pages +are matched. The `albumsection` directive displays all the images that match +the filter, and the `album` directive displays any leftover images, like +this: + + # Holiday photos + + \[[!album]] + <!-- replaced with a list of any uncategorized photos, which might be + empty --> -The viewer can also have any other content, but typically the -directive will be the only thing there. + ## People -Eventually, there will be a specialized CGI user interface to -edit all the photos of an album at once, upload a new photo -(which will attach the photo but also write out a viewer page -for it), or mark an already-uploaded photo as a member of an -album (which is done by writing out a viewer page for it). + \[[!albumsection filter="tagged(people)"]] + <!-- replaced with a list of photos tagged 'people', including + any that are also tagged 'landscapes' --> -The `\[[!albumimage]]` directive is replaced by an + ## Landscapes + + \[[!albumsection filter="tagged(landscapes)"]] + <!-- replaced with a list of photos tagged 'landscapes', including + any that are also tagged 'people' --> + +<h2 id="albumimage"><code>albumimage</code> directive</h2> + +Each viewer page produced by the <a href="#album">album</a> directive +contains an `albumimage` directive, which is replaced by an [[ikiwiki/directive/img]], wrapped in some formatting using a -template (by default `albumviewer.tmpl`). The template can (and -should) also include "next photo", "previous photo" and -"up to gallery" links. +template (by default it's `albumviewer.tmpl`). That template can also include +links to the next photo, the previous photo and the album it's in; the default +template has all of these. -The next/previous links are themselves implemented by -[[inlining|ikiwiki/directive/inline]] the next or previous -photo, using a special template (by default `albumnext.tmpl` -or `albumprev.tmpl`), in `archive`/`quick` mode. +The next/previous links are themselves implemented by evaluating a template, +either `albumnext.tmpl` or `albumprev.tmpl` by default. 
-> With hindsight, using an inline here is wrong - I should just -> run hooks and fill in the template within the album plugin. -> inline has some specialized functionality that's overkill -> here, and its delayed HTML substitution breaks the ability -> to have previous/up/next links both above and below the -> photo, for instance. --[[smcv]] +The directive can also have parameters: -## Writing the album +* `title`, `copyright` and `date` are short-cuts for the corresponding + [[ikiwiki/directive/meta]] directives -The album contains one `\[[!album]]` directive. It may also -contain any number of `\[[!albumsection]]` directives, for -example the demo album linked above could look like: +* `caption` sets a caption which is displayed in the album and viewer + pages - \[[!album]] - <!-- replaced with one uncategorized photo --> +The viewer page can also have other contents before or after the actual +image viewer. + +## Bugs + +* The plugin doesn't do anything special to handle albums that are subpages + of each other. If, say, `debconf` and `debconf/monday` are both albums, + then `debconf/monday/p100.jpg` will currently be assigned to one or the + other, arbitrarily. + +* The plugin doesn't do anything special to handle photos with similar names. + If you have `p100.jpg` and `p100.png`, one will get a viewer page called + `p100` and the other will be ignored. + +* If there's no `albumimage` in a viewer page, one should probably be appended + automatically. - ## Gamarra +## TODO - \[[!albumsection filter="link(gamarra)"]] - <!-- all the Gamarra photos --> +* The documentation should mention how to replicate the appearance of + `album` and `albumsection` using an `inline` of viewer pages. - ## Smokescreen +* The documentation should mention all the template variables and + all the parameters. - \[[!albumsection filter="link(smokescreen)"]] - <!-- all the Smokescreen photos --> +* The generated viewer page should include most or all of the possible + parameters to the `albumimage` directive, with empty values, as a + template for editing. - ... +* The generated viewer page should extract as much metadata as possible from + the photo's EXIF tags (creation/modification dates, author, title, caption, + copyright). [[smcv]] has a half-written implementation which runs + `scanimage` hooks, and has an `exiftool` plugin using [[!cpan Image::ExifTool]] + as a reference implementation of that hook. -The `\[[!album]]` directive is replaced by an -[[ikiwiki/directive/inline]] which automatically includes every -page that has an `\[[!albumimage]]` directive linking it to this -album, except those that will appear in an `\[[!albumsection]]`. +* There should be an option to reduce the size of photos and write them into + an underlay (perhaps just the transient underlay), for this workflow: -The `inline` is in `archive`/`quick` mode, but includes some -extra information about the images, including file size and a -thumbnail (again, made using [[ikiwiki/directive/img]]). The -default template is `albumitem.tmpl`, which takes advantage -of these things. 
+ * your laptop's local ikiwiki has two underlays, `photos` and `webphotos` + * `photos` contains full resolution photos with EXIF tags + * for each photo that exists in `photos` but not in `webphotos`, the album + plugin automatically resamples it down to a web-compatible resolution + ([[smcv]] uses up to 640x640), optimizes it with `jpegoptim`, strips out + all EXIF tags, and and writes it into the corresponding location + in `webphotos` + * `webphotos` is what you rsync to the web server + * the web server's ikiwiki only has `webphotos` as an underlay -Each `\[[!albumsection]]` is replaced by a similar inline, which -selects a subset of the photos in the album. +* Eventually, there could be a specialized CGI user interface to batch-edit + all the photos of an album (so for each photo, you get an edit box each for + title, author, copyright etc.) - this would work by making programmatic + edits to all the `albumimage` directives. diff --git a/doc/plugins/contrib/album/discussion.mdwn b/doc/plugins/contrib/album/discussion.mdwn index 5c8e74fa6..0356860d8 100644 --- a/doc/plugins/contrib/album/discussion.mdwn +++ b/doc/plugins/contrib/album/discussion.mdwn @@ -46,6 +46,10 @@ secondly: barring the CGI interface for editing the album, which would be great, > > --[[smcv]] +>> In the current version of the branch, the viewer pages are +>> generated automatically if you didn't generate them yourself, +>> so `ikiwiki-album` is no longer needed. --[[smcv]] + i'm new to ikiwiki, apologies if this is dealt with elsewhere. -brush > This plugin is pretty ambitious, and is unfinished, so I'd recommend @@ -60,7 +64,7 @@ code or tried it yet, but here goes. --[[Joey]] * Needing to create the albumimage "viewer" pages for each photo seems like it will become a pain. Everyone will need to come up with their own automation for it, and then there's the question - of how to automate it when uploading attachments. + of how to automate it when uploading attachments. -J > There's already a script (ikiwiki-album) to populate a git > checkout with skeleton "viewer" pages; I was planning to make a @@ -68,9 +72,25 @@ code or tried it yet, but here goes. --[[Joey]] > you (since the requirements for that CGI interface change depending > on the implementation). I agree that this is ugly, though. -s +>> Would you accept a version where the albumimage "viewer" pages +>> could be 0 bytes long, at least until metadata gets added? +>> +>> The more I think about the "binaries as first-class pages" approach, +>> the more subtle interactions I notice with other plugins. I +>> think I'm up to needing changes to editpage, comments, attachment +>> and recentchanges, plus adjustments to img and Render (to reduce +>> duplication when thumbnailing an image with a strange extension +>> while simultaneously changing the extension, and to hardlink/copy +>> an image with a strange extension to a differing target filename +>> with the normal extension, respectively). -s + +>>> Now that we have `add_autofile` I can just create viewer pages +>>> whenever there's an image to view. The current version of the +>>> branch does that. -s + * With each viewer page having next/prev links, I can see how you were having the scalability issues with ikiwiki's data structures - earlier! + earlier! -J > Yeah, I think they're a basic requirement from a UI point of view > though (although they don't necessarily have to be full wikilinks). @@ -80,12 +100,14 @@ code or tried it yet, but here goes. 
--[[Joey]] >> these can be presence dependencies, which will probably help with >> avoiding rebuilds of a page if the next/prev page is changed. >> (Unless you use img to make the thumbnails for those links, then it ->> would rebuild the thumbnails anyway. Have not looked at the code.) --[[Joey]] +>> would rebuild the thumbnails anyway. Have not looked at the code.) --[[Joey]] + +>>> I do use img. -s * And doesn't each viewer page really depend on every other page in the same albumsection? If a new page is added, the next/prev links may need to be updated, for example. If so, there will be much - unnecessary rebuilding. + unnecessary rebuilding. -J > albumsections are just a way to insert headings into the flow of > photos, so they don't actually affect dependencies. @@ -108,8 +130,13 @@ code or tried it yet, but here goes. --[[Joey]] >> metadata. Er, I mean, I have a cheezy hack in `add_depends` now that does >> it to deal with a similar case. --[[Joey]] +>>> I think I was misunderstanding how early you have to call `add_depends`? +>>> The critical thing I missed was that if you're scanning a page, you're +>>> going to rebuild it in a moment anyway, so it doesn't matter if you +>>> have no idea what it depends on until the rebuild phase. -s + * One thing I do like about having individual pages per image is - that they can each have their own comments, etc. + that they can each have their own comments, etc. -J > Yes; also, they can be wikilinked. I consider those to be > UI requirements. -s @@ -119,11 +146,40 @@ code or tried it yet, but here goes. --[[Joey]] album, but then anyone who can write to any other page on the wiki can add an image to it. 2: I may want an image to appear in more than one album. Think tags. So it seems it would be better to have the album - directive control what pages it includes (a la inline). + directive control what pages it includes (a la inline). -J + +> I'm inclined to fix this by constraining images to be subpages of exactly +> one album: if they're subpages of 2+ nested albums then they're only +> considered to be in the deepest-nested one (i.e. longest URL), and if +> they're not in any album then that's a usage error. This would +> also make prev/next links sane. -s + +>> The current version constrains images to be in at most one album, +>> choosing one arbitrarily (dependent on scan order) if albums are +>> nested. -s + +> If you want to reference images from elsewhere in the wiki and display +> them as if in an album, then you can use an ordinary inline with +> the same template that the album would use, and I'll make sure the +> templates are set up so this works. -s + +>> Still needs documenting, I've put it on the TODO list on the main +>> page. -s + +> (Implementation detail: this means that an image X/Y/Z/W/V, where X and +> Y are albums, Z does not exist and W exists but is not an album, +> would have a content dependency on Y, a presence dependency on Z +> and a content dependency on W.) +> +> Perhaps I should just restrict to having the album images be direct +> subpages of the album, although that would mean breaking some URLs +> on the existing website I'm doing all this work for... -s -> See note above about pagespecs not being very safe early on. -> You did merge my inline-with-pagenames feature, which is safe to use -> at scan time, though. +>> The current version of the branch doesn't have this restriction; +>> perhaps it's a worthwhile simplification, or perhaps it's too +>> restrictive? 
I fairly often use directory hierarchies like +>> `a_festival/saturday/foo.jpg` within an album, which makes +>> it very easy to write `albumsection` filters. -s * Putting a few of the above thoughts together, my ideal album system seems to be one where I can just drop the images into a directory and @@ -132,20 +188,69 @@ code or tried it yet, but here goes. --[[Joey]] etc. (Real pity we can't just put arbitrary metadata into the images themselves.) This is almost pointing toward making the images first-class wiki page sources. Hey, it worked for po! :) But the metadata and editing - problems probably don't really allow that. + problems probably don't really allow that. -J > Putting a JPEG in the web form is not an option from my point of > view :-) but perhaps there could just be a "web-editable" flag supplied > by plugins, and things could be changed to respect it. -> + +>> Replying to myself: would you accept patches to support +>> `hook(type => 'htmlize', editable => 0, ...)` in editpage? This would +>> essentially mean "this is an opaque binary: you can delete it +>> or rename it, and it might have its own special editing UI, but you +>> can never get it in a web form". +>> +>> On the other hand, that essentially means we need to reimplement +>> editpage in order to edit the sidecar files that contain the metadata. +>> Having already done one partial reimplementation of editpage (for +>> comments) I'm in no hurry to do another. +>> +>> I suppose another possibility would be to register hook +>> functions to be called by editpage when it loads and saves the +>> file. In this case, the loading hook would be to discard +>> the binary and use filter() instead, and the saving conversion +>> would be to write the edited content into the metadata sidecar +>> (creating it if necessary). +>> +>> I'd also need to make editpage (and also comments!) not allow the +>> creation of a file of type albumjpg, albumgif etc., which is something +>> I previously missed; and I'd need to make attachment able to +>> upload-and-rename. +>> -s + +>>> I believe the current branch meets your requirements, by having +>>> first-class wiki pages spring into existence using `add_autofile` +>>> to be viewer pages for photos. -s + > In a way, what you really want for metadata is to have it in the album > page, so you can batch-edit the whole lot by editing one file (this > does mean that editing the album necessarily causes each of its viewers > to be rebuilt, but in practice that happens anyway). -s -> ->> Yes, that would make some sense.. It also allows putting one image in ->> two albums, with different caption etc. (Maybe for different audiences.) + +>> Replying to myself: in practice that *doesn't* happen anyway. Having +>> the metadata in the album page is somewhat harmful because it means +>> that changing the title of one image causes every viewer in the album +>> to be rebuilt, whereas if you have a metadata file per image, only +>> the album itself, plus the next and previous viewers, need +>> rebuilding. So, I think a file per image is the way to go. >> +>> Ideally we'd have some way to "batch-edit" the metadata of all +>> images in an album at once, except that would make conflict +>> resolution much more complicated to deal with; maybe just +>> give up and scream about mid-air collisions in that case? +>> (That's apparently good enough for Bugzilla, but not really +>> for ikiwiki). 
-s + +>>> This is now in the main page's TODO list; if/when I implement this, +>>> I intend to make it a specialized CGI interface. -s + +>> Yes, [all metadata in one file] would make some sense.. It also allows putting one image in +>> two albums, with different caption etc. (Maybe for different audiences.) +>> --[[Joey]] + +>>> Eek. No, that's not what I had in mind at all; the metadata ends up +>>> in the "viewer" page, so it's necessarily the same for all albums. -s + >> It would probably be possible to add a new dependency type, and thus >> make ikiwiki smart about noticing whether the metadata has actually >> changed, and only update those viewers where it has. But the dependency @@ -154,7 +259,8 @@ code or tried it yet, but here goes. --[[Joey]] ---- -Trying to use the "special extension" design: +'''I think the "special extension" design is a dead-end, but here's what +happened when I tried to work out how it would work. --[[smcv]]''' Suppose that each viewer is a JPEG-or-GIF-or-something, with extension ".albumimage". We have a gallery "memes" with three images, badger, @@ -164,23 +270,26 @@ mushroom and snake. > etc as the htmlize extensions. May need some fixes to ikiwiki to support > that. --[[Joey]] +>> foo.albumjpg (etc.) for images, and foo._albummeta (with +>> `keepextension => 1`) for sidecar metadata files, seems viable. -s + Files in git repo: * index.mdwn * memes.mdwn -* memes/badger.albumimage (a renamed JPEG) +* memes/badger.albumjpg (a renamed JPEG) * memes/badger/comment_1._comment * memes/badger/comment_2._comment -* memes/mushroom.albumimage (a renamed GIF) -* memes/mushroom.meta (sidecar file with metadata) -* memes/snake.albumimage (a renamed video) +* memes/mushroom.albumgif (a renamed GIF) +* memes/mushroom._albummeta (sidecar file with metadata) +* memes/snake.albummov (a renamed video) Files in web content: * index.html * memes/index.html * memes/96x96-badger.jpg (from img) -* memes/96x96-mushroom.jpg (from img) +* memes/96x96-mushroom.gif (from img) * memes/96x96-snake.jpg (from img, hacked up to use totem-video-thumbnailer :-) ) * memes/badger/index.html (including comments) * memes/badger.jpg @@ -200,10 +309,28 @@ way to get them rendered anyway. > the image, as well as eg, smiley trying to munge it in sanitize. > --[[Joey]] +>> As long as nothing has a filter() hook that assumes it's already +>> text... filters are run in arbitrary order. We seem to be OK so far +>> though. +>> +>> If this is the route I take, I propose to have the result of filter() +>> be the contents of the sidecar metadata file (empty string if none), +>> with the `\[[!albumimage]]` directive (which no longer requires +>> arguments) prepended if not already present. This would mean that +>> meta directives in the metadata file would work as normal, and it +>> would be possible to insert text both before and after the viewer +>> if desired. The result of filter() would also be a sensible starting +>> point for editing, and the result of editing could be diverted into +>> the metadata file. -s + do=edit&page=memes/badger needs to not put the JPG in a text box: somehow divert or override the normal edit CGI by telling it that .albumimage files are not editable in the usual way? +> Something I missed here is that editpage also needs to be told that +> creating new files of type albumjpg, albumgif etc. is not allowed +> either! -s + Every image needs to depend on, and link to, the next and previous images, which is a bit tricky. 
In previous thinking about this I'd been applying the overly strict constraint that the ordered sequence of pages in each @@ -217,6 +344,9 @@ in order. > memoization to avoid each image in an album building the same list. > I sense that I may be missing a subtelty though. --[[Joey]] +>> I think I was misunderstanding how early you have to call `add_depends` +>> as mentioned above. -s + Perhaps restricting to "the images in an album A must match A/*" would be useful; then the unordered superset could just be "A/*". Your "albums via tags" idea would be nice too though, particularly for feature @@ -233,6 +363,9 @@ album, or something? > Ugh, yeah, that is a problem. Perhaps wanting to support that was just > too ambitious. --[[Joey]] +>> I propose to restrict to having images be subpages of albums, as +>> described above. -s + Requiring renaming is awkward for non-technical Windows/Mac users, with both platforms' defaults being to hide extensions; however, this could be circumvented by adding some sort of hook in attachment to turn things into @@ -244,13 +377,28 @@ extensions visible is a "don't do that then" situation :-) > with an extension. (Or allow specifying a full pagespec, > but I hesitate to seriously suggest that.) --[[Joey]] +>> I think that might be a terrifying idea for another day. If we can +>> mutate the extension during the `attach` upload, that'd be enough; +>> I don't think people who are skilled enough to use git/svn/..., +>> but not skilled enough to tell Explorer to show file extensions, +>> represent a major use case. -s + Ideally attachment could also be configured to upload into a specified underlay, so that photos don't have to be in your source-code control (you might want that, but I don't!). +> Replying to myself: perhaps best done as an orthogonal extension +> to attach? -s + +> Yet another non-obvious thing this design would need to do is to find +> some way to have each change to memes/badger._albummeta show up as a +> change to memes/badger in `recentchanges`. -s + Things that would be nice, and are probably possible: * make the "Edit page" link on viewers divert to album-specific CGI instead - of just failing or not appearing + of just failing or not appearing (probably possible via pagetemplate) + * some way to deep-link to memes/badger.jpg with a wikilink, without knowing a - priori that it's secretly a JPEG + priori that it's secretly a JPEG (probably harder than it looks - you'd + have to make a directive for it and it's probably not worth it) diff --git a/doc/plugins/contrib/default_content_for___42__copyright__42___and___42__license__42__.mdwn b/doc/plugins/contrib/default_content_for___42__copyright__42___and___42__license__42__.mdwn index b9ad3cc8e..16c147b68 100644 --- a/doc/plugins/contrib/default_content_for___42__copyright__42___and___42__license__42__.mdwn +++ b/doc/plugins/contrib/default_content_for___42__copyright__42___and___42__license__42__.mdwn @@ -6,9 +6,9 @@ Someone was just asking for it and I had written these two plugins already some months ago, so I'm now publishing them here. -[`copyright.pm`](http://www.schwinge.homeip.net/~thomas/tmp/copyright.pm) +[`copyright.pm`](http://schwinge.homeip.net/~thomas/tmp/copyright.pm) and -[`license.pm`](http://www.schwinge.homeip.net/~thomas/tmp/license.pm) +[`license.pm`](http://schwinge.homeip.net/~thomas/tmp/license.pm) Usage instructions are found inside the two plugin files. @@ -45,3 +45,10 @@ by ikiwiki are likewise fine. 
--[[tschwinge]] > and can extend beyond just copyright and license, but has the disadvantage > that it doesn't support setting defaults for a given "subdirectory" > only. --[[smcv]] + +[[!template id=gitbranch branch=smcv/contrib/defcopyright author="[[tschwinge]]"]] + +> For `./gitremotes` convenience (taking the Linus approach to backups :-) ) +> I've added this to my git repository as a branch. No review, approval or +> ownership is implied, feel free to replace this with a branch in any other +> repository --[[smcv]] diff --git a/doc/plugins/contrib/field.mdwn b/doc/plugins/contrib/field.mdwn new file mode 100644 index 000000000..dce2d891c --- /dev/null +++ b/doc/plugins/contrib/field.mdwn @@ -0,0 +1,196 @@ +[[!template id=plugin name=field author="[[rubykat]]"]] +[[!tag type/meta]] +[[!toc]] +## NAME + +IkiWiki::Plugin::field - front-end for per-page record fields. + +## SYNOPSIS + + # activate the plugin + add_plugins => [qw{goodstuff field ....}], + + # simple registration + field_register => [qw{meta}], + + # simple registration with priority + field_register => { + meta => 'last', + foo => 'DD' + }, + + # allow the config to be queried as a field + field_allow_config => 1, + + # flag certain fields as "tags" + field_tags => { + BookAuthor => '/books/authors', + BookGenre => '/books/genres', + MovieGenre => '/movies/genres', + } + +## DESCRIPTION + +This plugin is meant to be used in conjunction with other plugins +in order to provide a uniform interface to access per-page structured +data, where each page is treated like a record, and the structured data +are fields in that record. This can include the meta-data for that page, +such as the page title. + +Plugins can register a function which will return the value of a "field" for +a given page. This can be used in a few ways: + +* In page templates; all registered fields will be passed to the page template in the "pagetemplate" processing. +* In PageSpecs; the "field" function can be used to match the value of a field in a page. +* In SortSpecs; the "field" function can be used for sorting pages by the value of a field in a page. +* By other plugins, using the field_get_value function, to get the value of a field for a page, and do with it what they will. + +## CONFIGURATION OPTIONS + +The following options can be set in the ikiwiki setup file. + +**field_allow_config** + + field_allow_config => 1, +
Allow the $config hash to be queried like any other field; the +keys of the config hash are the field names. + +**field_register** + + field_register => [qw{meta}], + + field_register => { + meta => 'last', + foo => 'DD' + }, + +A hash of plugin-IDs to register. The keys of the hash are the names of the +plugins, and the values of the hash give the order of lookup of the field +values. The order can be 'first', 'last', 'middle', or an explicit order +sequence between 'AA' and 'ZZ'. If the simpler type of registration is used, +then the order will be 'middle'. + +This assumes that the plugins in question store data in the %pagestatus hash +using the ID of that plugin, and thus the field values are looked for there. + +This is the simplest form of registration, but the advantage is that it +doesn't require the plugin to be modified in order for it to be +registered with the "field" plugin. + +**field_tags** + + field_tags => { + BookAuthor => '/books/authors', + BookGenre => '/books/genres', + MovieGenre => '/movies/genres', + } + +A hash of fields and their associated pages. This provides a faceted +tagging system.
+ +The way this works is that a given field-name will be associated with a given +page, and the values of that field will be linked to sub-pages of that page. + +For example: + + BookGenre: SF + +will link to "/books/genres/SF", with a link-type of "bookgenre". + +## PageSpec + +The `field` plugin provides a few PageSpec functions to match values +of fields for pages. + +* field + * **field(*name* *glob*)** + * field(bar Foo\*) will match if the "bar" field starts with "Foo". +* destfield + * **destfield(*name* *glob*)** + * as for "field" but matches against the destination page (i.e when the source page is being included in another page). +* field_item + * **field_item(*name* *glob*)** + * field_item(bar Foo) will match if one of the values of the "bar" field is "Foo". +* destfield_item + * **destfield_item(*name* *glob*)** + * as for "field_item" but matches against the destination page. +* field_tagged + * **field_tagged(*name* *glob*)** + * like `tagged`, but this uses the tag-bases and link-types defined in the `field_tags` configuration option. +* destfield_tagged + * **destfield_tagged(*name* *glob*)** + * as for "field_tagged" but matches against the destination page. + +## SortSpec + +The "field" SortSpec function can be used to sort a page depending on the value of a field for that page. This is used for directives that take sort parameters, such as **inline** or **report**. + +field(*name*) + +For example: + +sort="field(bar)" will sort by the value og the "bar" field. + +## FUNCTIONS + +### field_register + +field_register(id=>$id); + +Register a plugin as having field data. The above form is the simplest, where +the field value is looked up in the %pagestatus hash under the plugin-id. + +Additional Options: + +**call=>&myfunc** + +A reference to a function to call rather than just looking up the value in the +%pagestatus hash. It takes two arguments: the name of the field, and the name +of the page. It is expected to return (a) an array of the values of that field +if "wantarray" is true, or (b) a concatenation of the values of that field +if "wantarray" is not true, or (c) undef if there is no field by that name. + + sub myfunc ($$) { + my $field = shift; + my $page = shift; + + ... + + return (wantarray ? @values : $value); + } + +**first=>1** + +Set this to be called first in the sequence of calls looking for values. Since +the first found value is the one which is returned, ordering is significant. +This is equivalent to "order=>'first'". + +**last=>1** + +Set this to be called last in the sequence of calls looking for values. Since +the first found value is the one which is returned, ordering is significant. +This is equivalent to "order=>'last'". + +**order=>$order** + +Set the explicit ordering in the sequence of calls looking for values. Since +the first found value is the one which is returned, ordering is significant. + +The values allowed for this are "first", "last", "middle", or a two-character +ordering-sequence between 'AA' and 'ZZ'. + +### field_get_value($field, $page) + + my @values = field_get_value($field, $page); + + my $value = field_get_value($field, $page); + +Returns the values of the field for that page, or undef if none is found. +Note that it will return an array of values if you ask for an array, +and a scalar value if you ask for a scalar. 
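As a concrete illustration of the registration options above, a hypothetical third-party plugin (the id `myplugin` and the `nicename` field are invented for this example) could expose a computed field like this:

    package IkiWiki::Plugin::myplugin;

    use warnings;
    use strict;
    use IkiWiki 3.00;
    use IkiWiki::Plugin::field;

    sub import {
        # make this plugin's field visible to "field", and through it to
        # pagetemplate, PageSpecs, SortSpecs, ftemplate, report, ...
        IkiWiki::Plugin::field::field_register(
            id   => 'myplugin',
            call => \&myplugin_get_value,
            last => 1,   # checked last, so values from other sources win
        );
    }

    sub myplugin_get_value {
        my ($field, $page) = @_;
        # compute a value rather than looking one up in the pagestatus hash
        return IkiWiki::pagetitle(IkiWiki::basename($page))
            if $field eq 'nicename';
        return undef;
    }

    1;

A field registered this way can then be used anywhere the plugin exposes fields, for instance `sort="field(nicename)"` in an inline, or `field(nicename foo*)` in a PageSpec.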
+ +## DOWNLOAD + +* browse at GitHub: <http://github.com/rubykat/ikiplugins/blob/master/IkiWiki/Plugin/field.pm> +* git repo at git://github.com/rubykat/ikiplugins.git diff --git a/doc/plugins/contrib/field/discussion.mdwn b/doc/plugins/contrib/field/discussion.mdwn new file mode 100644 index 000000000..6161f80df --- /dev/null +++ b/doc/plugins/contrib/field/discussion.mdwn @@ -0,0 +1,407 @@ +Having tried out `field`, some comments (from [[smcv]]): + +The general concept looks great. + +The `pagetemplate` hook seems quite namespace-polluting: on a site containing +a list of books, I'd like to have an `author` field, but that would collide +with IkiWiki's use of `<TMPL_VAR AUTHOR>` for the author of the *page* +(i.e. me). Perhaps it'd be better if the pagetemplate hook was only active for +`<TMPL_VAR FIELD_AUTHOR>` or something? (For those who want the current +behaviour, an auxiliary plugin would be easy.) + +> No, please. The idea is to be *able* to override field names if one wishes to, and choose, for yourself, non-colliding field names if one wishes not to. I don't wish to lose the power of being able to, say, define a page title with YAML format if I want to, or to write a site-specific plugin which calculates a page title, or other nifty things. +>It's not like one is going to lose the fields defined by the meta plugin; if "author" is defined by \[[!meta author=...]] then that's what will be found by "field" (provided the "meta" plugin is registered; that's what the "field_register" option is for). +>--[[KathrynAndersen]] + +>> Hmm. I suppose if you put the title (or whatever) in the YAML, then +>> "almost" all the places in IkiWiki that respect titles will do the +>> right thing due to the pagetemplate hook, with the exception being +>> anything that has special side-effects inside `meta` (like `date`), +>> or anything that looks in `$pagestate{foo}{meta}` directly +>> (like `map`). Is your plan that `meta` should register itself by +>> default, and `map` and friends should be adapted to +>> work based on `getfield()` instead of `$pagestate{foo}{meta}`, then? + +>>> Based on `field_get_value()`, yes. That would be my ideal. Do you think I should implement that as an ikiwiki branch? --[[KathrynAndersen]] + +>>>> This doesn't solve cases where certain fields are treated specially; for +>>>> instance, putting a `\[[!meta permalink]]` on a page is not the same as +>>>> putting it in `ymlfront` (in the latter case you won't get your +>>>> `<link>` header), and putting `\[[!meta date]]` is not the same as putting +>>>> `date` in `ymlfront` (in the latter case, `%pagectime` won't be changed). +>>>> +>>>> One way to resolve that would be to have `ymlfront`, or similar, be a +>>>> front-end for `meta` rather than for `field`, and call +>>>> `IkiWiki::Plugin::meta::preprocess` (or a refactored-out function that's +>>>> similar). +>>>> +>>>> There are also some cross-site scripting issues (see below)... --[[smcv]] + +>> (On the site I mentioned, I'm using an unmodified version of `field`, +>> and currently working around the collision by tagging books' pages +>> with `bookauthor` instead of `author` in the YAML.) --s + +>> Revisiting this after more thought, the problem here is similar to the +>> possibility that a wiki user adds a `meta` shortcut +>> to [[shortcuts]], or conversely, that a plugin adds a `cpan` directive +>> that conflicts with the `cpan` shortcut that pages already use. (In the +>> case of shortcuts, this is resolved by having plugin-defined directives +>> always win.) 
For plugin-defined meta keywords this is the plugin +>> author's/wiki admin's problem - just don't enable conflicting plugins! - +>> but it gets scary when you start introducing things like `ymlfront`, which +>> allow arbitrary, wiki-user-defined fields, even ones that subvert +>> other plugins' assumptions. +>> +>> The `pagetemplate` hook is particularly alarming because page templates are +>> evaluated in many contexts, not all of which are subject to the +>> htmlscrubber or escaping; because the output from `field` isn't filtered, +>> prefixed or delimited, when combined with an arbitrary-key-setting plugin +>> like `ymlfront` it can interfere with other plugins' expectations +>> and potentially cause cross-site scripting exploits. For instance, `inline` +>> has a `pagetemplate` hook which defines the `FEEDLINKS` template variable +>> to be a blob of HTML to put in the `<head>` of the page. As a result, this +>> YAML would be bad: +>> +>> --- +>> FEEDLINKS: <script>alert('code injection detected')</script> +>> --- +>> +>> (It might require a different case combination due to implementation +>> details, I'm not sure.) +>> +>> It's difficult for `field` to do anything about this, because it doesn't +>> know whether a field is meant to be plain text, HTML, a URL, or something +>> else. +>> +>> If `field`'s `pagetemplate` hook did something more limiting - like +>> only emitting template variables starting with `field_`, or from some +>> finite set, or something - then this would cease to be a problem, I think? +>> +>> `ftemplate` and `getfield` don't have this problem, as far as I can see, +>> because their output is in contexts where the user could equally well have +>> written raw HTML directly; the user can cause themselves confusion, but +>> can't cause harmful output. --[[smcv]] + +From a coding style point of view, the `$CamelCase` variable names aren't +IkiWiki style, and the `match_foo` functions look as though they could benefit +from being thin wrappers around a common `&IkiWiki::Plugin::field::match` +function (see `meta` for a similar approach). + +I think the documentation would probably be clearer in a less manpage-like +and more ikiwiki-like style? + +> I don't think ikiwiki *has* a "style" for docs, does it? So I followed the Perl Module style. And I'm rather baffled as to why having the docs laid out in clear sections... make them less clear. --[[KathrynAndersen]] + +>> I keep getting distracted by the big shouty headings :-) +>> I suppose what I was really getting at was that when this plugin +>> is merged, its docs will end up split between its plugin +>> page, [[plugins/write]] and [[ikiwiki/PageSpec]]; on some of the +>> contrib plugins I've added I've tried to separate the docs +>> according to how they'll hopefully be laid out after merge. --s + +If one of my branches from [[todo/allow_plugins_to_add_sorting_methods]] is +accepted, a `field()` cmp type would mean that [[plugins/contrib/report]] can +stop reimplementing sorting. Here's the implementation I'm using, with +your "sortspec" concept (a sort-hook would be very similar): if merged, +I think it should just be part of `field` rather than a separate plugin. 
+ + # Copyright © 2010 Simon McVittie, released under GNU GPL >= 2 + package IkiWiki::Plugin::fieldsort; + use warnings; + use strict; + use IkiWiki 3.00; + use IkiWiki::Plugin::field; + + sub import { + hook(type => "getsetup", id => "fieldsort", call => \&getsetup); + } + + sub getsetup () { + return + plugin => { + safe => 1, + rebuild => undef, + }, + } + + package IkiWiki::SortSpec; + + sub cmp_field { + if (!length $_[0]) { + error("sort=field requires a parameter"); + } + + my $left = IkiWiki::Plugin::field::field_get_value($_[0], $a); + my $right = IkiWiki::Plugin::field::field_get_value($_[0], $b); + + $left = "" unless defined $left; + $right = "" unless defined $right; + return $left cmp $right; + } + + 1; + +---- + +Disclaimer: I've only looked at this plugin and ymlfront, not other related +stuff yet. (I quite like ymlfront, so I looked at this as its dependency. :) +I also don't want to annoy you with a lot of design discussion +if your main goal was to write a plugin that did exactly what you wanted. + +My first question is: Why we need another plugin storing metadata +about the page, when we already have the meta plugin? Much of the +complication around the field plugin has to do with it accessing info +belonging to the meta plugin, and generalizing that to be able to access +info stored by other plugins too. (But I don't see any other plugins that +currently store such info). Then too, it raises points of confusion like +smcv's discuission of field author vs meta author above. --[[Joey]] + +> The point is exactly in the generalization, to provide a uniform interface for accessing structured data, no matter what the source of it, whether that be the meta plugin or some other plugin. + +> There were a few reasons for this: + +>1. In converting my site over from PmWiki, I needed something that was equivalent to PmWiki's Page-Text-Variables (which is how PmWiki implements structured data). +>2. I also wanted an equivalent of PmWiki's Page-Variables, which, rather than being simple variables, are the return-value of a function. This gives one a lot of power, because one can do calculations, derive one thing from another. Heck, just being able to have a "basename" variable is useful. +>3. I noticed that in the discussion about structured data, it was mired down in disagreements about what form the structured data should take; I wanted to overcome that hurdle by decoupling the form from the content. +>4. I actually use this to solve (1), because, while I do use ymlfront, initially my pages were in PmWiki format (I wrote (another) unreleased plugin which parses PmWiki format) including PmWiki's Page-Text-Variables for structured data. So I needed something that could deal with multiple formats. + +> So, yes, it does cater to mostly my personal needs, but I think it is more generally useful, also. +> --[[KathrynAndersen]] + +>> Is it fair to say, then, that `field`'s purpose is to take other +>> plugins' arbitrary per-page data, and present it as a single +>> merged/flattened string => string map per page? From the plugins +>> here, things you then use that merged map for include: +>> +>> * sorting - stolen by [[todo/allow_plugins_to_add_sorting_methods]] +>> * substitution into pages with Perl-like syntax - `getfield` +>> * substitution into wiki-defined templates - the `pagetemplate` +>> hook +>> * substitution into user-defined templates - `ftemplate` +>> +>> As I mentioned above, the flattening can cause collisions (and in the +>> `pagetemplate` case, even security problems). 
+>> +>> I wonder whether conflating Page Text Variables with Page Variables +>> causes `field` to be more general than it needs to be? +>> To define a Page Variable (function-like field), you need to write +>> a plugin containing that Perl function; if we assume that `field` +>> or something resembling it gets merged into ikiwiki, then it's +>> reasonable to expect third-party plugins to integrate with whatever +>> scaffolding there is for these (either in an enabled-by-default +>> plugin that most people are expected to leave enabled, like `meta` +>> now, or in the core), and it doesn't seem onerous to expect each +>> plugin that wants to participate in this mechanism to have code to +>> do so. While it's still contrib, `field` could just have a special case +>> for the meta plugin, rather than the converse? +>> +>> If Page Text Variables are limited to being simple strings as you +>> suggest over in [[forum/an_alternative_approach_to_structured_data]], +>> then they're functionally similar to `meta` fields, so one way to +>> get their functionality would be to extend `meta` so that +>> +>> \[[!meta badger="mushroom"]] +>> +>> (for an unrecognised keyword `badger`) would store +>> `$pagestate{$page}{meta}{badger} = "mushroom"`? Getting this to +>> appear in templates might be problematic, because a naive +>> `pagetemplate` hook would have the same problem that `field` combined +>> with `ymlfront` currently does. +>> +>> One disadvantage that would appear if the function-like and +>> meta-like fields weren't in the same namespace would be that it +>> wouldn't be possible to switch a field from being meta-like to being +>> function-like without changing any wiki content that referenced it. +>> +>> Perhaps meta-like fields should just *be* `meta` (with the above +>> enhancement), as a trivial case of function-like fields? That would +>> turn `ymlfront` into an alternative syntax for `meta`, I think? +>> That, in turn, would hopefully solve the special-fields problem, +>> by just delegating it to meta. I've been glad of the ability to define +>> new ad-hoc fields with this plugin without having to write an extra plugin +>> to do so (listing books with a `bookauthor` and sorting them by +>> `"field(bookauthor) title"`), but that'd be just as easy if `meta` +>> accepted ad-hoc fields? +>> +>> --[[smcv]] + +>>> Your point above about cross-site scripting is a valid one, and something I +>>> hadn't thought of (oops). + +>>> I still want to be able to populate pagetemplate templates with field, because I +>>> use it for a number of things, such as setting which CSS files to use for a +>>> given page, and, as I said, for titles. But apart from the titles, I +>>> realize I've been setting them in places other than the page data itself. +>>> (Another unreleased plugin, `concon`, uses Config::Context to be able to +>>> set variables on a per-site, per-directory and a per-page basis). + +>>> The first possible solution is what you suggested above: for field to only +>>> set values in pagetemplate which are prefixed with *field_*. I don't think +>>> this is quite satisfactory, since that would still mean that people could +>>> put un-scrubbed values into a pagetemplate, albeit they would be values +>>> named field_foo, etc. --[[KathrynAndersen]] + +>>>> They can already do similar; `PERMALINK` is pre-sanitized to +>>>> ensure that it's a "safe" URL, but if an extremely confused wiki admin was +>>>> to put `COPYRIGHT` in their RSS/Atom feed's `<link>`, a malicious user +>>>> could put an unsafe (e.g. 
Javascript) URL in there (`COPYRIGHT` *is* +>>>> HTML-scrubbed, but "javascript:alert('pwned!')" is just text as far as a +>>>> HTML sanitizer is concerned, so it passes straight through). The solution +>>>> is to not use variables in situations where that variable would be +>>>> inappropriate. Because `field` is so generic, the definition of what's +>>>> appropriate is difficult. --[[smcv]] + +>>> An alternative solution would be to classify field registration as "secure" +>>> and "insecure". Sources such as ymlfront would be insecure, sources such +>>> as concon (or the $config hash) would be secure, since they can't be edited +>>> as pages. Then, when doing pagetemplate substitution (but not ftemplate +>>> substitution) the insecure sources could be HTML-escaped. +>>> --[[KathrynAndersen]] + +>>>> Whether you trust the supplier of data seems orthogonal to whether its value +>>>> is (meant to be) interpreted as plain text, HTML, a URL or what? +>>>> +>>>> Even in cases where you trust the supplier, you need to escape things +>>>> suitably for the context, not for security but for correctness. The +>>>> definition of the value, and the context it's being used in, changes the +>>>> processing you need to do. An incomplete list: +>>>> +>>>> * HTML used as HTML needs to be html-scrubbed if and only if untrusted +>>>> * URLs used as URLs need to be put through `safeurl()` if and only if +>>>> untrusted +>>>> * HTML used as plain text needs tags removed regardless +>>>> * URLs used as plain text are safe +>>>> * URLs or plain text used in HTML need HTML-escaping (and URLs also need +>>>> `safeurl()` if untrusted) +>>>> * HTML or plain text used in URLs need URL-escaping (and the resulting +>>>> URL might need sanitizing too?) +>>>> +>>>> I can't immediately think of other data types we'd be interested in beyond +>>>> text, HTML and URL, but I'm sure there are plenty. + +>>>>> But isn't this a problem with anything that uses pagetemplates? Or is +>>>>> the point that, with plugins other than `field`, they all know, +>>>>> beforehand, the names of all the fields that they are dealing with, and +>>>>> thus the writer of the plugin knows which treatment each particular field +>>>>> needs? For example, that `meta` knows that `title` needs to be +>>>>> HTML-escaped, and that `baseurl` doesn't. In that case, yes, I see the problem. +>>>>> It's a tricky one. It isn't as if there's only ever going to be a fixed set of fields that need different treatment, either. Because the site admin is free to add whatever fields they like to the page template (if they aren't using the default one, that is. I'm not using the default one myself). +>>>>> Mind you, for trusted sources, since the person writing the page template and the person providing the variable are the same, they themselves would know whether the value will be treated as HTML, plain text, or a URL, and thus could do the needed escaping themselves when writing down the value. 
+ +>>>>> Looking at the content of the default `page.tmpl` let's see what variables fall into which categories: + +>>>>> * **Used as URL:** BASEURL, EDITURL, PARENTLINKS->URL, RECENTCHANGESURL, HISTORYURL, GETSOURCEURL, PREFSURL, OTHERLANGUAGES->URL, ADDCOMMENTURL, BACKLINKS->URL, MORE_BACKLINKS->URL +>>>>> * **Used as part of a URL:** FAVICON, LOCAL_CSS +>>>>> * **Needs to be HTML-escaped:** TITLE +>>>>> * **Used as-is (as HTML):** FEEDLINKS, RELVCS, META, PERCENTTRANSLATED, SEARCHFORM, COMMENTSLINK, DISCUSSIONLINK, OTHERLANGUAGES->PERCENT, SIDEBAR, CONTENT, COMMENTS, TAGS->LINK, COPYRIGHT, LICENSE, MTIME, EXTRAFOOTER + +>>>>> This looks as if only TITLE needs HTML-escaping all the time, and that the URLS all end with "URL" in their name. Unfortunately the FAVICON and LOCAL_CSS which are part of URLS don't have "URL" in their name, though that's fair enough, since they aren't full URLs. + +>>>>> --K.A. + +>>>> One reasonable option would be to declare that `field` takes text-valued +>>>> fields, in which case either consumers need to escape +>>>> it with `<TMPL_VAR FIELD_FOO ESCAPE=HTML>`, and not interpret it as a URL +>>>> without first checking `safeurl`), or the pagetemplate hook needs to +>>>> pre-escape. + +>>>>> Since HTML::Template does have the ability to do ESCAPE=HTML/URL/JS, why not take advantage of that? Some things, like TITLE, probably should have ESCAPE=HTML all the time; that would solve the "to escape or not to escape" problem that `meta` has with titles. After all, when one *sorts* by title, one doesn't really want HTML-escaping in it; only when one uses it in a template. -- K.A. + +>>>> Another reasonable option would be to declare that `field` takes raw HTML, +>>>> in which case consumers need to only use it in contexts that will be +>>>> HTML-scrubbed (but it becomes unsuitable for using as text - problematic +>>>> for text-based things like sorting or URLs, and not ideal for searching). +>>>> +>>>> You could even let each consumer choose how it's going to use the field, +>>>> by having the `foo` field generate `TEXT_FOO` and `HTML_FOO` variables? +>>>> --[[smcv]] + +>>>>> Something similar is already done in `template` and `ftemplate` with the `raw_` prefix, which determines whether the variable should have `htmlize` run over it first before the value is applied to the template. Of course, that isn't scrubbing or escaping, because with those templates, the scrubbing is done afterwards as part of the normal processing. + +>>> Another problem, as you point out, is special-case fields, such as a number of +>>> those defined by `meta`, which have side-effects associated with them, more +>>> than just providing a value to pagetemplate. Perhaps `meta` should deal with +>>> the side-effects, but use `field` as an interface to get the values of those special fields. + +>>> --[[KathrynAndersen]] + +----- + +I think the main point is: what is (or should be) the main point of the +field plugin? If it's essentially a way to present a consistent +interface to access page-related structured information, then it makes +sense to have it very general. Plugins registering with fields would +then present ways for recovering the structure information from the page +(`ymlfront`, `meta`, etc), ways to manipulate it (like `meta` does), +etc. + +In this sense, security should be entirely up to the plugins, although +the fields plugin could provide some auxiliary infrastructure (like +determining where the data comes from and raise or lower the security +level accoringly). 
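To make the `ESCAPE` idea above concrete: if `field` exposed a variable such as `FIELD_FOO` (a hypothetical name), a template author could already pick the treatment per use site with stock HTML::Template syntax, for example:

    <!-- field treated as plain text inside HTML -->
    <span class="foo"><TMPL_VAR FIELD_FOO ESCAPE=HTML></span>

    <!-- field treated as a query-string component -->
    <a href="http://www.example.org/search?q=<TMPL_VAR FIELD_FOO ESCAPE=URL>">search</a>

    <!-- field trusted as raw HTML; only sensible in scrubbed contexts -->
    <TMPL_VAR FIELD_FOO>

The unresolved question in this thread is what the plugin itself should do when it cannot know which of these contexts a wiki-supplied value will end up in.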
+ +Namespacing is important, and it should be considered at the field +plugin interface level. A plugin should be able to register as +responsible for the processing of all data belonging to a given +namespace, but plugins should be able to set data in any namespace. So +for example, `meta` registers as the processor for `meta` fields, and whatever +method is used to set the data (`meta` directive, `ymlfront`, etc) it +gets a say on what to do with data in its namespace. + +What I'm thinking of is something you could call fieldsets. The nice +thing about them is that, aside from the ones defined by plugins (like +`meta`), it would be possible to define custom ones (with a generic, +default processor) in an appropriate file (like smileys and shortcuts) +with a syntax like: + + [[!fieldset book namespace=book + fields="author title isbn" + fieldtype="text text text"]] + +after which, you could use + + [[!book author="A. U. Thor" + title="Fields of Iki"]] + +and the data would be available under the book namespace, and thus +as BOOK_AUTHOR, BOOK_TITLE etc. in templates. + +Security, in this sense, would be up to the plugin responsible for the +namespace processing (the default handler would HTML-escape text fields, +scrub HTML fields, safeurl()ify URL fields, etc.) + +> So, are you saying that getting a field value is sort of a two-stage process? Get the value from anywhere, and then call the "security processor" for that namespace to "secure" the value? I think "namespaces" are really orthogonal to this issue. What the issue seems to be is: + + * what form do we expect the raw field to be in? (text, URL, HTML) + * what form do we expect the "secured" output to be in? (raw HTML, scrubbed HTML, escaped HTML, URL) + +> Only if we know both these things will we know what sort of security processing needs to be done. + +>> Fieldsets are orthogonal to the security issue in the sense that you can use +>> them without worrying about the field security issue, but they happen to be +>> a rather clean way of answering those two questions, by allowing you to +>> attach preprocessing attributes to a field in a way that the user +>> (supposedly) cannot tamper with. + +> There is also a difference between field values that are used inside pagetemplate, and field values which are used as part of a page's content (e.g. with ftemplate). If you have a TITLE, you want it to be HTML-escaped if you're using it inside pagetemplate, but you don't want it to be HTML-escaped if you're using it inside a page's content. On the other hand, if you have, say, FEEDLINKS used inside pagetemplate, you don't wish it to be HTML-escaped at all, or your page content will be completely stuffed. + +>> Not to mention the many different ways date-like fields might need +>> processing. It has already been proposed to solve this problem by exposing +>> the field values under different names depending on the kind or amount of +>> postprocessing they had (e.g. RAW_SOMEFIELD, SOMEFIELD, to which we could add +>> HTML_SOMEFIELD, URL_SOMEFIELD or whatever). Again, fieldsets offer a simple way +>> of letting Ikiwiki know what kind of postprocessing should be offered for +>> that particular field. + +> So, somehow, we have to know the meaning of a field before we can use it properly, which kind of goes against the idea of having something generic. + +>> We could have a default field type (text, for example), and a way to set a +>> different field type (which is what my fieldset proposal was about).
+ +> --[[KathrynAndersen]] + +----- + +I was just looking at HTML5 and wondered if the field plugin should generate the new Microdata tags (as well as the internal structures)? <http://slides.html5rocks.com/#slide19> -- [[Will]] + +> This could just as easily be done as a separate plugin. Feel free to do so. --[[KathrynAndersen]] diff --git a/doc/plugins/contrib/flattr.mdwn b/doc/plugins/contrib/flattr.mdwn new file mode 100644 index 000000000..e9b4bf857 --- /dev/null +++ b/doc/plugins/contrib/flattr.mdwn @@ -0,0 +1,48 @@ +[[!template id=plugin name=flattr author="[[jaywalk]]"]] + +[flattr.com](http://flattr.com/) is a flatrate micropayment service, which revolves around the idea of having flattr buttons everywhere that people visiting your site can use to "flattr" you. + +This plugin makes it easier to put flattr buttons in ikiwiki. It supports both the static kind as well as the counting dynamic javascript version. The dynamic version does not work if [[htmlscrubber|/plugins/htmlscrubber]] is active on the page. + +The dynamic button does not require creation of the page on flattr before being added to a page, the static one does. + +I wrote some notes on [jonatan.walck.se](http://jonatan.walck.se/software/ikiwiki/plugin/flattr/) and put the source here: [flattr.pm](http://jonatan.walck.se/software/ikiwiki/flattr.pm) + +This plugin is licensed under [CC0](http://creativecommons.org/publicdomain/zero/1.0/) (public domain). + +Note that there is now a [[plugins/flattr]] plugin bundled with ikiwiki. It +is less configurable, not supporting static buttons, but simpler to use. + +# Usage # + + # [[!flattr args]] where args are in the form of arg=value. + # Possible args: + # type - static or dynamic. Defaults to static. + + # vars in static mode: + # -------------------- + # Required: + # url - URL to flattr page, + # e.g. http://flattr.com/thing/1994/jaywalks-weblog + # Optional: + # style - Set to compact for compact button. + + # vars in dynamic mode: + # --------------------- + # Required: + # None. + # Optional: + # uid - Set the default in the plugin, override if needed. + # title - The title defaults to $wikiname/some/path (like on the top of + # the wiki). + # desc - A description of the content. Defaults to " ". + # cat - Category, this can be text, images, video, audio, software or + # rest. Defaults to text. + # lang - Language, list of available choises is on + # https://flattr.com/support/integrate/languages. Defaults to en_GB. + # tag - A list of comma separated tags. Empty per default. + # url - URL to thing to flattred, + # e.g. http://jonatan.walck.se/weblog + # style - Set it to compact to get the small button, big for any other + # value including empty. + diff --git a/doc/plugins/contrib/flattr/discussion.mdwn b/doc/plugins/contrib/flattr/discussion.mdwn new file mode 100644 index 000000000..586139e9c --- /dev/null +++ b/doc/plugins/contrib/flattr/discussion.mdwn @@ -0,0 +1,9 @@ +FWIW, it is possible for a plugin like this to add javascript to pages that +are protected by htmlscrubber. Just return a token in your preprocess hook, +and then have a format hook that replaces the token with the javascript. +--[[Joey]] + +> Thanks, That's good to know. I'll try to continue the development of this +> plugin later, for now I just needed it to work. :) It will most likely +> evolve as my page does too. 
+> --[[jaywalk]] diff --git a/doc/plugins/contrib/ftemplate.mdwn b/doc/plugins/contrib/ftemplate.mdwn new file mode 100644 index 000000000..d82867f94 --- /dev/null +++ b/doc/plugins/contrib/ftemplate.mdwn @@ -0,0 +1,25 @@ +[[!template id=plugin name=ftemplate author="[[rubykat]]"]] +[[!tag type/meta type/format]] + +This plugin provides the [[ikiwiki/directive/ftemplate]] directive. + +This is like the [[ikiwiki/directive/template]] directive, with the addition +that one does not have to provide all the values in the call to the template, +because ftemplate can query structured data ("fields") using the [[field]] +plugin. + +## Activate the plugin + + add_plugins => [qw{goodstuff ftemplate ....}], + +## PREREQUISITES + + IkiWiki + IkiWiki::Plugin::field + HTML::Template + Encode + +## DOWNLOAD + +* browse at GitHub: <http://github.com/rubykat/ikiplugins/blob/master/IkiWiki/Plugin/ftemplate.pm> +* git repo at git://github.com/rubykat/ikiplugins.git diff --git a/doc/plugins/contrib/ftemplate/discussion.mdwn b/doc/plugins/contrib/ftemplate/discussion.mdwn new file mode 100644 index 000000000..1e0bca5d8 --- /dev/null +++ b/doc/plugins/contrib/ftemplate/discussion.mdwn @@ -0,0 +1,33 @@ +I initially thought this wasn't actually necessary - the combination +of [[plugins/template]] with [[plugins/contrib/field]]'s `pagetemplate` +hook ought to provide the same functionality. However, `template` +doesn't run `pagetemplate` hooks; a more general version of this +plugin would be to have a variant of `template` that runs `pagetemplate` +hooks (probably easiest to just patch `template` to implement a +second directive, or have a special parameter `run_hooks="yes"`, +or something). + +> I got the impression that `pagetemplate` hooks are intended to be completely independent of `template` variables; page-template is for the actual `page.tmpl` template, while `template` is for other templates which are used inside the page content. So I don't understand why one would need a run_hooks option. --[[KathrynAndersen]] + +>> `Render`, `inline`, `comments` and `recentchanges` run `pagetemplate` +>> hooks, as does anything that uses `IkiWiki::misctemplate`. From that +>> quick survey, it seems as though `template` is the only thing that +>> uses `HTML::Template` but *doesn't* run `pagetemplate` hooks? +>> +>> It just seems strange to me that `field` needs to have its own +>> variant of `template` (this), its own variant of `inline` (`report`), +>> and so on - I'd tend to lean more towards having `field` +>> enhance the existing plugins. I'm not an ikiwiki committer, +>> mind... Joey, your opinion would be appreciated! --[[smcv]] + +>>> I did it that way basically because I needed the functionality ASAP, and I didn't want to step on anyone's toes, so I made them as separate plugins. If Joey wants to integrate the functionality into IkiWiki proper, I would be very happy, but I don't want to put pressure on him. --[[KathrynAndersen]] + +Another missing thing is that `ftemplate` looks in +the "system" templates directories, not just in the wiki, but that +seems orthogonal (and might be a good enhancement to `template` anyway). +--[[smcv]] + +> Yes, I added that because I wanted the option of not having to make all my templates work as wiki pages also. --[[KathrynAndersen]] + +>> Joey has added support for +>> [[todo/user-defined_templates_outside_the_wiki]] now. 
--s diff --git a/doc/plugins/contrib/ftemplate/ikiwiki/directive/ftemplate.mdwn b/doc/plugins/contrib/ftemplate/ikiwiki/directive/ftemplate.mdwn new file mode 100644 index 000000000..3009fc830 --- /dev/null +++ b/doc/plugins/contrib/ftemplate/ikiwiki/directive/ftemplate.mdwn @@ -0,0 +1,106 @@ +The `ftemplate` directive is supplied by the [[!iki plugins/contrib/ftemplate desc=ftemplate]] plugin. + +This is like the [[ikiwiki/directive/template]] directive, with the addition +that one does not have to provide all the values in the call to the template, +because ftemplate can query structured data ("fields") using the +[[plugins/contrib/field]] plugin. + +Templates are files that can be filled out and inserted into pages in +the wiki, by using the ftemplate directive. The directive has an id +parameter that identifies the template to use. + +Additional parameters can be used to fill out the template, in +addition to the "field" values. Passed-in values override the +"field" values. + +There are two places where template files can live. One is in the /templates +directory on the wiki. These templates are wiki pages, and can be edited from +the web like other wiki pages. + +The second place where template files can live is in the global +templates directory (the same place where the page.tmpl template lives). +This is a useful place to put template files if you want to prevent +them being edited from the web, and you don't want to have to make +them work as wiki pages. + +### EXAMPLES + +#### Example 1 + +PageA: + + \[[!meta title="I Am Page A"]] + \[[!meta description="A is for Apple."]] + \[[!meta author="Fred Nurk"]] + \[[!ftemplate id="mytemplate"]] + +Template "mytemplate": + + # <TMPL_VAR NAME="TITLE"> + by <TMPL_VAR NAME="AUTHOR"> + + **Summary:** <TMPL_VAR NAME="DESCRIPTION"> + +This will give: + + <h1>I Am Page A</h1> + <p>by Fred Nurk</p> + <p><strong>Summary:</strong> A is for Apple. + +#### Example 2: Overriding values + +PageB: + + \[[!meta title="I Am Page B"]] + \[[!meta description="B is for Banana."]] + \[[!meta author="Fred Nurk"]] + \[[!ftemplate id="mytemplate" title="Bananananananas"]] + +This will give: + + <h1>Bananananananas</h1> + <p>by Fred Nurk</p> + <p><strong>Summary:</strong> B is for Banana. + +#### Example 3: Loops + +(this example uses the [[plugins/contrib/ymlfront]] plugin) + +Page C: + + --- + BookAuthor: Georgette Heyer + BookTitle: Black Sheep + BookGenre: + - Historical + - Romance + --- + \[[!ftemplate id="footemplate"]] + + I like this book. + +Template "footemplate": + + # <TMPL_VAR BOOKTITLE> + by <TMPL_VAR BOOKAUTHOR> + + <TMPL_IF BOOKGENRE>( + <TMPL_LOOP GENRE_LOOP><TMPL_VAR BOOKGENRE> + <TMPL_UNLESS __last__>, </TMPL_UNLESS> + </TMPL_LOOP> + )</TMPL_IF> + +This will give: + + <h1>Black Sheep</h1> + <p>by Georgette Heyer</p> + + <p>(Historical, Romance)</p> + + <p>I like this book.</p> + +### LIMITATIONS + +One cannot query the values of fields on pages other than the current +page. If you want to do that, check out the [[plugins/contrib/report]] +plugin.
diff --git a/doc/plugins/contrib/getfield.mdwn b/doc/plugins/contrib/getfield.mdwn new file mode 100644 index 000000000..0a92894f1 --- /dev/null +++ b/doc/plugins/contrib/getfield.mdwn @@ -0,0 +1,131 @@ +[[!template id=plugin name=getfield author="[[rubykat]]"]] +[[!tag type/meta type/format]] +[[!toc]] +## NAME + +IkiWiki::Plugin::getfield - query the values of fields + +## SYNOPSIS + + # activate the plugin + add_plugins => [qw{goodstuff getfield ....}], + +## DESCRIPTION + +This plugin provides a way of querying the meta-data (data fields) of a page +inside the page content (rather than inside a template) This provides a way to +use per-page structured data, where each page is treated like a record, and the +structured data are fields in that record. This can include the meta-data for +that page, such as the page title. + +This plugin is meant to be used in conjunction with the [[field]] plugin. + +### USAGE + +One can get the value of a field by using special markup in the page. +This does not use directive markup, in order to make it easier to +use the markup inside other directives. There are four forms: + +* {{$*fieldname*}} + + This queries the value of *fieldname* for the source page. + + For example: + + \[[!meta title="My Long and Complicated Title With Potential For Spelling Mistakes"]] + # {{$title}} + + When the page is processed, this will give you: + + <h1>My Long and Complicated Title With Potential For Spelling Mistakes</h1> + +* {{$*pagename*#*fieldname*}} + + This queries the value of *fieldname* for the page *pagename*. + + For example: + + On PageFoo: + + \[[!meta title="I Am Page Foo"]] + + Stuff about Foo. + + On PageBar: + + For more info, see \[[{{$PageFoo#title}}|PageFoo]]. + + When PageBar is displayed: + + <p>For more info, see <a href="PageFoo">I Am Page Foo</a>.</p> + +* {{+$*fieldname*+}} + + This queries the value of *fieldname* for the destination page; that is, + the value when this page is included inside another page. + + For example: + + On PageA: + + \[[!meta title="I Am Page A"]] + # {{+$title+}} + + Stuff about A. + + On PageB: + + \[[!meta title="I Am Page B"]] + \[[!inline pagespec="PageA"]] + + When PageA is displayed: + + <h1>I Am Page A</h1> + <p>Stuff about A.</p> + + When PageB is displayed: + + <h1>I Am Page B</h1> + <p>Stuff about A.</p> + +* {{+$*pagename*#*fieldname*+}} + + This queries the value of *fieldname* for the page *pagename*; the + only difference between this and {{$*pagename*#*fieldname*}} is + that the full name of *pagename* is calculated relative to the + destination page rather than the source page. + + I can't really think of a reason why this should be needed, but + this format has been added for completeness. + +### No Value Found + +If no value is found for the given field, then the field name is returned. + +For example: + +On PageFoo: + + \[[!meta title="Foo"]] + My title is {{$title}}. + + My description is {{$description}}. + +When PageFoo is displayed: + + <p>My title is Foo.</p> + + <p>My description is description.</p> + +This is because "description" hasn't been defined for that page. 
+ +### More Examples + +Listing all the sub-pages of the current page: + + \[[!map pages="{{$page}}/*"]] + +## DOWNLOAD + +* browse at GitHub: <http://github.com/rubykat/ikiplugins/blob/master/IkiWiki/Plugin/getfield.pm> +* git repo at git://github.com/rubykat/ikiplugins.git diff --git a/doc/plugins/contrib/getfield/discussion.mdwn b/doc/plugins/contrib/getfield/discussion.mdwn new file mode 100644 index 000000000..5f7fffead --- /dev/null +++ b/doc/plugins/contrib/getfield/discussion.mdwn @@ -0,0 +1,32 @@ +## Templating, and other uses + +Like you mentioned in [[ftemplate]] IIRC, it'll only work on the same page. If it can be made to work anywhere, or from a specific place in the wiki - configurable, possibly - you'll have something very similar to mediawiki's templates. I can already think of a few uses for this combined with [[template]] ;) . --[[SR|users/simonraven]] + +> Yes, I mentioned "only current page" in the "LIMITATIONS" section. + +> What do you think would be a good syntax for querying other pages? +> It needs to resolve to a single page, though I guess using "bestlink" to find the closest page would mean that one didn't have to spell out the whole page. + +>> I don't know the internals very well, I think that's how other plugins do it. *goes to check* Usually it's a `foreach` loop, and use a `pagestate{foo}` to check the page's status/state. There's also some stuff like 'pagespec_match_list($params{page}` ... they do slightly different thing depending on need. --[[SR|users/simonraven]] + +>>> No, I meant what markup I should use; the actual implementation probably wouldn't be too difficult. + +>>> The current markup is {{$*fieldname*}}; what you're wanting, perhaps it should be represented like {{$*pagename*:*fieldname*}}, or {{$*pagename*::*fieldname*}} or something else... +>>> -- [[KathrynAndersen]] + +>>>> Oh. Hmm. I like your idea actually, or alternately, in keeping more with other plugins, doing it like {{pagename/fieldname}}. The meaning of the separator is less clear with /, but avoids potential issues with filename clashes that have a colon in them. It also keeps a certain logic - at least to me. Either way, I think both are good choices. [[SR|users/simonraven]] + +>>>>> What about using {{pagename#fieldname}}? The meaning of the hash in URLs sort of fits with what is needed here (reference to a 'named' thing within the page) and it won't conflict with actual hash usages (unless we expect different named parts of pages to define different values for the same field ...) +>>>>> -- [[Oblomov]] +>>>>>> That's a good one too. --[[simonraven]] +>>>>>>> Done! I used {{$*pagename*#*fieldname*}} for the format. -- [[users/KathrynAndersen]] + + +> I'm also working on a "report" plugin, which will basically apply a template like [[ftemplate]] does, but to a list of pages given from a pagespec, rather than the current page. + +> -- [[users/KathrynAndersen]] + +>> Ooh, sounds nice :) . --[[SR|users/simonraven]] + +>>> I've now released the [[plugins/contrib/report]] plugin. I've been using it on my site; the holdup on releasing was because I hadn't yet written the docs for it. I hope you find it useful. 
+>>> -- [[users/KathrynAndersen]] diff --git a/doc/plugins/contrib/headinganchors/discussion.mdwn b/doc/plugins/contrib/headinganchors/discussion.mdwn index 91fe04a6d..151af8d92 100644 --- a/doc/plugins/contrib/headinganchors/discussion.mdwn +++ b/doc/plugins/contrib/headinganchors/discussion.mdwn @@ -1 +1,33 @@ Isn't this functionality a part of what [[plugins/toc]] needs and does? Then probably the [[plugins/toc]] plugin's code could be split into the part that implements the [[plugins/contrib/headinganchors]]'s functionality and the TOC generation itself. That will bring more order into the code and the set of available plugins. --Ivan Z. + +--- + +A patch to make it more like MediaWiki: + +<pre>--- headinganchors.pm ++++ headinganchors.pm +@@ -5,6 +5,7 @@ + use warnings; + use strict; + use IkiWiki 2.00; ++use URI::Escape; + + sub import { + hook(type => "sanitize", id => "headinganchors", call => \&headinganchors); +@@ -14,9 +15,11 @@ + my $str = shift; + $str =~ s/^\s+//; + $str =~ s/\s+$//; +- $str = lc($str); +- $str =~ s/[&\?"\'\.,\(\)!]//mig; +- $str =~ s/[^a-z]/_/mig; ++ $str =~ s/\s/_/g; ++ $str =~ s/"//g; ++ $str =~ s/^[^a-zA-Z]/z-/; # must start with an alphabetical character ++ $str = uri_escape_utf8($str); ++ $str =~ s/%/./g; + return $str; + } + </pre> + +--Changaco diff --git a/doc/plugins/contrib/highlightcode.mdwn b/doc/plugins/contrib/highlightcode.mdwn index 8abb76583..f1df204bb 100644 --- a/doc/plugins/contrib/highlightcode.mdwn +++ b/doc/plugins/contrib/highlightcode.mdwn @@ -1,6 +1,8 @@ [[!template id=plugin name=highlightcode author="[[sabr]]"]] [[!tag type/format]] +(An alternative to this plugin, [[plugins/highlight]], is now provided with IkiWiki. --[[smcv]]) + A small plugin to allow Ikiwiki to display source files complete with syntax highlighting. Files with recognized extensions (i.e. my-file.cpp) are be rendered just like any other Ikiwiki page. You can even edit your source files with Ikiwiki's editor. It uses the Syntax::Highlight::Engine::Kate Perl module to do the highlighting. diff --git a/doc/plugins/contrib/ikiwiki/directive/ymlfront.mdwn b/doc/plugins/contrib/ikiwiki/directive/ymlfront.mdwn new file mode 100644 index 000000000..1a01834f8 --- /dev/null +++ b/doc/plugins/contrib/ikiwiki/directive/ymlfront.mdwn @@ -0,0 +1,17 @@ +The `ymlfront` directive is supplied by the [[!iki plugins/contrib/ymlfront desc=ymlfront]] plugin. + +This directive allows the user to define arbitrary meta-data in YAML format. + + \[[!ymlfront data=""" + foo: fooness + bar: The Royal Pigeon + baz: 2 + """]] + +There is one argument to this directive. + +* **data:** + The YAML-format data. This should be enclosed inside triple-quotes to preserve the data correctly. + +If more than one ymlfront directive is given per page, the result is undefined. +Likewise, it is inadvisable to try to mix the non-directive ymlfront format with the directive form of the data. diff --git a/doc/plugins/contrib/imailhide.mdwn b/doc/plugins/contrib/imailhide.mdwn new file mode 100644 index 000000000..6009aa012 --- /dev/null +++ b/doc/plugins/contrib/imailhide.mdwn @@ -0,0 +1,65 @@ +[[!template id=plugin name=imailhide author="Peter_Vizi"]] +[[!tag type/widget type/html]] + +# Mailhide Plugin for Ikiwiki + +This plugin provides the directive mailhide, that uses the [Mailhide +API][1] to protect email addresses from spammers. + +## Dependencies + +The [Captcha::reCAPTCHA::Mailhide][2] perl module is required for this +plugin. + +## Download + +You can get the source code from [github][3]. 
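The page never shows an actual call of the directive it provides; judging from the parameters documented below, a typical use would presumably look something like this (the address is a placeholder, and the argument names are those described under "Parameters"):

    \[[!mailhide email="someone@example.org" style="long"]]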
+ +## Installation + +Copy `imailhide.pm` to `/usr/share/perl/5.10.0/IkiWiki/Plugin` or +`~/.ikiwiki/IkiWiki/Plugin`, and enable it in your `.setup` file + + add_plugins => [qw{goodstuff imailhide ....}], + mailhide_public_key => "8s99vSA99fF11mao193LWdpa==", + mailhide_private_key => "6b5e4545326b5e4545326b5e45453223", + mailhide_default_style => "short", + +## Configuration + +### `mailhide_public_key` + +This is your personal public key that you can get at [Google][4]. + +### `mailhide_private_key` + +This is your personal private key that you can get at [Google][4]. + +### `mailhide_default_style` + +As per the recommendation of the [Mailhide API documentation][5], you +can define this as `short` or `long`. The `short` parameter will +result in `<a href="...">john</a>` links, while the `long` parameter +will result in `joh<a href="...">...</a>@example.com`. + +## Parameters + +### `email` + +*Required.* This is the email addres that you want to hide. + +### `style` + +*Optional.* You can set the style parameter individually for each + `mailhide` call. See `mailhide_default_style` for details. + +## Known Issues + +1. [opening new window when displaying email address][6] + +[1]: http://www.google.com/recaptcha/mailhide/ +[2]: http://search.cpan.org/perldoc?Captcha::reCAPTCHA::Mailhide +[3]: http://github.com/petervizi/imailhide +[4]: http://www.google.com/recaptcha/mailhide/apikey +[5]: http://code.google.com/apis/recaptcha/docs/mailhideapi.html +[6]: http://github.com/petervizi/imailhide/issues#issue/1 diff --git a/doc/plugins/contrib/justlogin.mdwn b/doc/plugins/contrib/justlogin.mdwn new file mode 100644 index 000000000..90645b9ef --- /dev/null +++ b/doc/plugins/contrib/justlogin.mdwn @@ -0,0 +1,52 @@ +This plugin has been abandoned while still in development. Currently it does bring up the login page and the login page does, with proper credentials, log in the user, but the returning page goes to prefs. I have no idea why. I decided to go in another direction so if someone wants to take over then please do so. Otherwise I have no problem if this page needs to be deleted. [[users/justint/]] + +Place this code into a page: + +<form action="http://portable.local/cgi-bin/ikiwiki.cgi" method="get"> + +<input type="hidden" name="do" value="justlogin" /> + +<input type="submit" value="Login" /></form> + +This is the plugin so far: +#!/usr/bin/perl + # Bring up a login page that returns to the calling page + package IkiWiki::Plugin::justlogin; + + use warnings; + use strict; + use IkiWiki 3.00; + + sub import { + hook(type => "sessioncgi", id => "justlogin", call => \&sessioncgi); + } + + sub sessioncgi ($$) { + my $q=shift; + my $session=shift; + + debug("jl sessioncgi1 running."); + + if ($q->param("do") eq "justlogin") { + debug("jl do=justlogin running."); + if (! 
defined $session->param("name") ) { + debug("jl param!defined running."); + $session->param("postsignin" => $ENV{HTTP_REFERER} ); + $session->param("do" => "justgoback" ); + IkiWiki::cgi_signin($q, $session); + IkiWiki::cgi_savesession($session); + } + exit; + } elsif ($session->param("do") eq "justgoback") { + debug("jl justgoback running."); + my $page=$q->param("postsignin"); + $session->clear("postsignin"); + $session->clear("do"); + IkiWiki::cgi_savesession($session); + IkiWiki::redirect($q, $page); + exit; + } + } + + 1 + diff --git a/doc/plugins/contrib/mediawiki.mdwn b/doc/plugins/contrib/mediawiki.mdwn index 7bf1ba0df..13c2d04b2 100644 --- a/doc/plugins/contrib/mediawiki.mdwn +++ b/doc/plugins/contrib/mediawiki.mdwn @@ -1,7 +1,10 @@ [[!template id=plugin name=mediawiki author="[[sabr]]"]] [[!tag type/format]] -[The Mediawiki plugin](http://u32.net/Mediawiki_Plugin/) allows ikiwiki to -process pages written using MediaWiki markup. +The Mediawiki plugin allows ikiwiki to process pages written using MediaWiki +markup. -Available at <http://alcopop.org/~jon/mediawiki.pm> +Available at <http://github.com/jmtd/mediawiki.pm>. + +This plugin originally lived at <http://u32.net/Mediawiki_Plugin/>, but that +website has disappeared. diff --git a/doc/plugins/contrib/pod.mdwn b/doc/plugins/contrib/pod.mdwn new file mode 100644 index 000000000..97a9c648a --- /dev/null +++ b/doc/plugins/contrib/pod.mdwn @@ -0,0 +1,38 @@ +[[!template id=plugin name=pod author="[[rubykat]]"]] +[[!tag type/format]] +## NAME + +IkiWiki::Plugin::pod - process pages written in POD format. + +## SYNOPSIS + +In the ikiwiki setup file, enable this plugin by adding it to the +list of active plugins. + + add_plugins => [qw{goodstuff pod ....}], + +## DESCRIPTION + +IkiWiki::Plugin::pod is an IkiWiki plugin enabling ikiwiki to +process pages written in POD ([Plain Old Documentation](http://en.wikipedia.org/wiki/Plain_Old_Documentation)) format. +This will treat files with a `.pod` or `.pm` extension as files +which contain POD markup. + +## OPTIONS + +The following options can be set in the ikiwiki setup file. + +* **pod_index:** If true, this will generate an index (table of contents) for the page. +* **pod_toplink:** The label to be used for links back to the top of the page. If this is empty, then no top-links will be generated. + +## PREREQUISITES + + IkiWiki + Pod::Xhtml + IO::String + +## DOWNLOAD + +* browse at GitHub: <http://github.com/rubykat/ikiplugins/blob/master/IkiWiki/Plugin/pod.pm> +* git repo at git://github.com/rubykat/ikiplugins.git + diff --git a/doc/plugins/contrib/pod/discussion.mdwn b/doc/plugins/contrib/pod/discussion.mdwn new file mode 100644 index 000000000..9187b1350 --- /dev/null +++ b/doc/plugins/contrib/pod/discussion.mdwn @@ -0,0 +1,14 @@ +My one concern about this plugin is the `=for` markup in POD. + +> Some format names that formatters currently are known to +> accept include "roff", "man", "latex", "tex", "text", and "html". + +I don't know which of these [[!cpan Pod::Xhtml]] supports. If it currently +supports, or later support latex, that could be problimatic since that +could maybe be used to include files or run code. --[[Joey]] + +> I don't know, either; the documentation for [[!cpan Pod:Xhtml]] is silent on this subject. --[[KathrynAndersen]] + +>> I'm afraid the only approach is to audit the existing code in the perl +>> module(s), and then hope nothing is added to them later that opens a +>> security hole. 
--[[Joey]] diff --git a/doc/plugins/contrib/postal.mdwn b/doc/plugins/contrib/postal.mdwn index b2f875393..c522f8bcb 100644 --- a/doc/plugins/contrib/postal.mdwn +++ b/doc/plugins/contrib/postal.mdwn @@ -1,5 +1,5 @@ [[!template id=plugin name=postal author="[[DavidBremner]]"]] -[[!tag type/useful]] +[[!tag type/special-purpose]] The `postal` plugin allows users to send mail to a special address to comment on a page. It uses the [[mailbox]] diff --git a/doc/plugins/contrib/report.mdwn b/doc/plugins/contrib/report.mdwn new file mode 100644 index 000000000..0bd5392c6 --- /dev/null +++ b/doc/plugins/contrib/report.mdwn @@ -0,0 +1,26 @@ +[[!template id=plugin name=report author="[[rubykat]]"]] +[[!tag type/meta type/format]] +IkiWiki::Plugin::report - Produce templated reports from page field data. + +This plugin provides the [[ikiwiki/directive/report]] directive. This enables +one to report on the structured data ("field" values) of multiple pages; the +output is formatted via a template. This depends on the +[[plugins/contrib/field]] plugin. + + +## Activate the plugin + + # activate the plugin + add_plugins => [qw{goodstuff report ....}], + +## PREREQUISITES + + IkiWiki + IkiWiki::Plugin::field + HTML::Template + Encode + +## DOWNLOAD + +* browse at GitHub: <http://github.com/rubykat/ikiplugins/blob/master/IkiWiki/Plugin/report.pm> +* git repo at git://github.com/rubykat/ikiplugins.git diff --git a/doc/plugins/contrib/report/discussion.mdwn b/doc/plugins/contrib/report/discussion.mdwn new file mode 100644 index 000000000..e23a4ced4 --- /dev/null +++ b/doc/plugins/contrib/report/discussion.mdwn @@ -0,0 +1,75 @@ +Wow, this plugin does a lot... it seems to be `inline` (but without the feeds +or the ability to not have `archive="yes"`), plus part of +[[plugins/contrib/trail]], plus some sorting, plus an ingenious workaround +for template evaluation being relatively stateless. + +A large part of this plugin would just fall off if one of the versions of +"[[todo/allow_plugins_to_add_sorting_methods]]" was merged, which was a +large part of the idea of that feature request :-) To make use of that +you'd have to use `pagespec_match_list` in the trail case too, but that's +easy enough - just add `list => [@the_trail_pages]` to the arguments. + +Another large part would fall off if this plugin required, and internally +invoked, `inline` (like my `comments` plugin does) - `inline` runs +`pagetemplate` hooks, and in particular, it'll run the `field` hook. +Alternatively, this plugin could invoke `pagetemplate` hooks itself, +removing the special case for `field`. + +Perhaps the `headers` thing could migrate into inline somehow? That might +lead to making inline too big, though. + +> I think inline is *already* too big, honestly. --[[KathrynAndersen]] + +>> A fair point; perhaps my complaint should be that *inline* does +>> too many orthogonal things. I suppose the headers feature wouldn't +>> really make sense in an inline that didn't have `archive="yes"`, +>> so it'd make sense to recommend this plugin as a replacement +>> for inlining with archive=yes (for which I now realise "inline" +>> is the wrong verb anyway :-) ) --s + +>>> I think *inline* would be a bit less unwieldy if there was some way of factoring out the feed stuff into a separate plugin, but I don't know if that's possible. --K.A. + +Is the intention that the `trail` part is a performance hack, or a way +to select pages? How does it relate to [[todo/wikitrails]] or +[[plugins/contrib/trail]]? 
--[[smcv]] + +> The `trail` part is *both* a performance hack, and a way to select pages. I have over 5000 pages on my site, I need all the performance hacks I can get. +> For the performance hack, it is a way of reducing the need to iterate through every single page in the wiki in order to find matching pages. +> For the way-to-select-pages, yes, it is intended to be similar to [[todo/wikitrails]] and [[plugins/contrib/trail]] (and will be more similar with the new release which will be happening soon; it will add prev_* and next_* variables). +> The idea is that, rather than having to add special "trail" links on PageA to indicate that a page is part of the trail, +> it takes advantage of the `%links` hash, which already contains, for each page, an array of the links from that page to other pages. No need for special markup, just use what's there; a trail is defined as "all the pages linked to from page X", and since it's an array, it has an order already. +> But to avoid that being too limiting, one can use a `pages=...` pagespec to filter that list to a subset; only the pages one is interested in. +> And one can also sort it, if one so desires. +> --[[KathrynAndersen]] + +>> That's an interesting approach to trails; I'd missed the fact that +>> links are already ordered. +>> +>> This does have the same problems as tags, though: see +>> [[bugs/tagged()_matching_wikilinks]] and +>> [[todo/matching_different_kinds_of_links]]. I suppose the question +>> now is whether new code should be consistent with `tag` (and +>> potentially be fixed at the same time as tag itself), or try to +>> avoid those problems? +>> +>> The combination of `trail` with another pagespec in this plugin +>> does provide a neat way for it to work around having unwanted +>> pages in the report, by limiting by a suitable tag or subdirectory +>> or something. --s + +>>> Either that, or somehow combine tagging with fields, such that one could declare a tag, and it would create both a link and a field with a given value. (I've been working on something like that, but it still has bugs). +>>> That way, the test for whether something is tagged would be something like "link(tag/foo) and field(tag foo)". +>>> --K.A. + +>>>> I can see that this'd work well for 1:1 relationships like next +>>>> and previous, but I don't think that'd work for pages with more than +>>>> one tag - as far as I can see, `field`'s data model is that each +>>>> page has no more than one value for each field? +>>>> [[todo/Matching_different_kinds_of_links]] has some thoughts about +>>>> how it could be implemented, though. --s + +>>>>> You have a point there. I'm not sure what would be better: to add the concept of arrays/sets to `field`, or to think of tags as a special case. Problem is, I find tags as they currently exist to be too limiting. I prefer something that can be used for Faceted Tagging <http://en.wikipedia.org/wiki/Faceted_classification>; that is, things like Author:Fred Nurk, Genre:Historical, Rating:Good, and so on. Of course, that doesn't mean that each tag is limited to only one value, either; just to take the above examples, something might have more than one author, or have multiple genres (such as Historical + Romance). + +>>>>> It might be that adding arrays to the `field` plugin is a good way to go: after all, even though field=value is the most common, with the flexibility of things like YAML, one could define all sorts of things. 
What I'm not so sure about is how to return the values when queried, since some things would be expecting scalars all the time. Ah, perhaps I could use wantarray? +>>>>> Is there a way of checking a HTML::Template template to see if it expecting an array for a particular value? +>>>>> --[[KathrynAndersen]] diff --git a/doc/plugins/contrib/report/ikiwiki/directive/report.mdwn b/doc/plugins/contrib/report/ikiwiki/directive/report.mdwn new file mode 100644 index 000000000..df88b33ad --- /dev/null +++ b/doc/plugins/contrib/report/ikiwiki/directive/report.mdwn @@ -0,0 +1,171 @@ +[[!toc]] +The `report` directive is supplied by the [[!iki plugins/contrib/report desc=report]] plugin. + +This enables one to report on the structured data ("field" values) of +multiple pages; the output is formatted via a template. This depends +on the [[plugins/contrib/field]] plugin. + +The pages to report on are selected by a PageSpec given by the "pages" +parameter. The template is given by the "template" parameter. +The template expects the data from a single page; it is applied +to each matching page separately, one after the other. + +Additional parameters can be used to fill out the template, in +addition to the "field" values. Passed-in values override the +"field" values. + +There are two places where template files can live. One is in the +/templates directory on the wiki. These templates are wiki pages, and +can be edited from the web like other wiki pages. + +The second place where template files can live is in the global +templates directory (the same place where the page.tmpl template lives). +This is a useful place to put template files if you want to prevent +them being edited from the web, and you don't want to have to make +them work as wiki pages. + +## OPTIONS + +**template**: The template to use for the report. + +**pages**: A PageSpec to determine the pages to report on. + +**pagenames**: If given instead of pages, this is interpreted as a +space-separated list of links to pages, and they are shown in exactly the order +given: the sort and pages parameters cannot be used in conjunction with this +one. If they are used, they will be ignored. + +**trail**: A page or pages to use as a "trail" page. + +When a trail page is used, the matching pages are limited to (a subset +of) the pages which that page links to; the "pages" pagespec in this +case, rather than selecting pages from the entire wiki, will select +pages from within the set of pages given by the trail page. + +Additional space-separated trail pages can be given in this option. +For example: + + trail="animals/cats animals/dogs" + +This will take the links from both the "animals/cats" page and the +"animals/dogs" page as the set of pages to apply the PageSpec to. + +**start**: Start the report at the given page-index; the index starts +from zero. + +**count**: Report only on N pages where count=N. + +**sort**: A SortSpec to determine how the matching pages should be sorted. + +**here_only**: Report on the current page only. + +This is useful in combination with "prev_" and "next_" variables to +make a navigation trail. +If the current page doesn't match the pagespec, then no pages will +be reported on. + +### Headers + +An additional option is the "headers" option. This is a space-separated +list of field names which are to be used as headers in the report. This +is a way of getting around one of the limitations of HTML::Template, that +is, not being able to do tests such as +"if this-header is not equal to previous-header". 
+ +Instead, that logic is performed inside the plugin. The template is +given parameters "HEADER1", "HEADER2" and so on, for each header. +If the value of a header field is the same as the previous value, +then HEADER**N** is set to be empty, but if the value of the header +field is new, then HEADER**N** is given that value. + +#### Example + +Suppose you're writing a blog in which you record "moods", and you +want to display your blog posts by mood. + + \[[!report template="mood_summary" + pages="blog/*" + sort="Mood Date title" + headers="Mood"]] + +The "mood_summary" template might be like this: + + <TMPL_IF NAME="HEADER1"> + ## <TMPL_VAR NAME="HEADER1"> + </TMPL_IF> + ### <TMPL_VAR NAME="TITLE"> + (<TMPL_VAR NAME="DATE">) \[[<TMPL_VAR NAME="PAGE">]] + <TMPL_VAR NAME="DESCRIPTION"> + +### Multi-page Reports + +Reports can now be split over multiple pages, so that there aren't +too many items per report-page. + +**per_page**: how many items to show per report-page. + +**first_page_is_index**: If true, the first page of the report is just +an index which contains links to the other report pages. +If false, the first page will contain report-content as well as links +to the other pages. + +### Advanced Options + +The following options are used to improve efficiency when dealing +with large numbers of pages; most people probably won't need them. + +**doscan**: + +Whether this report should be called in "scan" mode; if it is, then +the pages which match the pagespec are added to the list of links from +this page. This can be used by *another* report by setting this +page to be a "trail" page in *that* report. +It is not possible to use "trail" and "doscan" at the same time. +By default, "doscan" is false. + +## TEMPLATE PARAMETERS + +The templates are in HTML::Template format, just as [[plugins/template]] and +[[ftemplate]] are. The parameters passed in to the template are as follows: + +### Fields + +The structured data from the current matching page. This includes +"title" and "description" if they are defined. + +### Common values + +Values known for all pages: "page", "destpage". Also "basename" (the +base name of the page). + +### Passed-in values + +Any additional parameters to the report directive are passed to the +template; a parameter will override the matching "field" value. +For example, if you have a "Mood" field, and you pass Mood="bad" to +the report, then that will be the Mood which is given for the whole +report. + +Generally this is useful if one wishes to make a more generic +template and hide or show portions of it depending on what +values are passed in the report directive call. + +For example, one could have a "hide_mood" parameter which would hide +the "Mood" section of your template when it is true, which one could +use when the Mood is one of the headers. + +### Prev_ And Next_ Items + +Any of the above variables can be prefixed with "prev_" or "next_" +and that will give the previous or next value of that variable; that is, +the value from the previous or next page that this report is reporting on. +This is mainly useful for a "here_only" report. + +### Headers + +See the section on Headers. + +### First and Last + +If this is the first page-record in the report, then "first" is true. +If this is the last page-record in the report, then "last" is true. 
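As a rough sketch of how the options above combine (reusing the `mood_summary` template and the "blog" pages from the earlier example, and assuming a "blog" page that links to each post so it can act as the trail page):

    \[[!report template="mood_summary"
       trail="blog"
       pages="* and !*/Discussion"
       sort="Mood Date title"
       headers="Mood"
       per_page="10"]]

Here the candidate pages are the ones linked from "blog", narrowed by the pagespec, sorted and grouped under "Mood" headers, and split into report-pages of ten items each.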
diff --git a/doc/plugins/contrib/screenplay.pm.mdwn b/doc/plugins/contrib/screenplay.pm.mdwn new file mode 100644 index 000000000..5ff082da5 --- /dev/null +++ b/doc/plugins/contrib/screenplay.pm.mdwn @@ -0,0 +1,320 @@ +This plugin works for me. It follows the standard for a movie screenplay pretty closely, I am not aware of any errors in format. Please let me know if you find any. + +Right now all it does is display your pages properly in a web browser. What I would like to add is the ability to output a file that could easily be printed once the screenplay is finished. We keep all the scenes we work on in one folder and eventually we will want to print a script out of that folder. It would be great if an up to date PDF or TXT script could be put in the folder when a scene is saved. I will do it, it just isn't a priority yet. + +I am not a published writer and not an authority on script formatting. I got what I know out of a book. + +Briefly, you type a command on a line, like ".d", then on the next line (for the dialog command) you type a person's name. Then you hit return again and write the words he is supposed to speak out all on one line. When you save your document this simple text will become a properly formatted script. + +Thank you Joey for having me here. + +###Headings: + Most headings should begin with a transition. The list of valid commands is: + .fi => FADE IN: a gradual transition from a solid color to an image + .fo => FADE OUT. + .ftb => FADE TO BLACK. + .ftw => FADE TO WHITE. + .ct => CUT TO: indicates an instantaneous shift from one shot to the next + .shot => lack of an explicit transition assumes a cut + .hct => HARD CUT TO: describes a jarring transition + .qct => QUICK CUT TO: describes a cut sooner than expected + .tct => TIME CUT TO: emphasizes time passing + .mct => MATCH CUT TO: image in first shot visually or thematically matches image in second + .dt => DISSOLVE TO: gradual transition from image to another implies passage of time. + .rdt => RIPPLE DISSOLVE TO: indicates transition into daydream or imagination + .wt => WIPE TO: new image slides over top of last one + + Example transition: + + .fi (or any transition command) <= Writes a transition line, except .shot which omits it. + type shot heading here <= this line will be capitalized + First direction. <= these lines are not capitalized. + Second direction. + Third direction, etc... + + Direction without a shot heading: + .dir + First direction. + Second direction. + Third direction, etc... + + Some items aren't implemented in dialogue yet: + 1) you must watch that you don't leave a " -- " dangling on a line by itself, + instead, carry the last word onto the line with a dash + 2) observe lyrical line endings in dialogue by indenting wrapped lines by two spaces + 3) you must watch that the four line limit for parenthetical direction is not exceeded + + Example dialogue: + + .d + char name <= this line will be capitalized + this is what he's saying <= Dialogue + raises hand to wave <= Parenthetical direction + this is more of what he's saying <= Dialogue + this is going to be in parenthesis <= Parenthetical direction + this is more of what he's saying, etc... <= Dialogue + + .note + Allows you to add a temporary note to a script without getting an error. + All notes need to be removed eventually because they are a format violation. + + + + ###name this file screenplay.pm and pop it in your Plugin folder. Then you need to add the plugin to your Ikiwiki setup file. 
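For example, enabling it could look like this (a minimal sketch, following the same `add_plugins` convention used by the other contrib plugins documented on this wiki):

    add_plugins => [qw{goodstuff screenplay}],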
+ + #!/usr/bin/perl + # Screenplay markup language + package IkiWiki::Plugin::screenplay; + + use warnings; + use strict; + use IkiWiki 3.00; + use Text::Format; + use Log::Log4perl qw(:easy); + Log::Log4perl->easy_init($INFO); + #Log::Log4perl->easy_init($ERROR); + + sub import { + hook(type => "getsetup", id => "screenplay", call => \&getsetup); + hook(type => "htmlize", id => "screenplay", call => \&htmlize, longname => "Screenplay"); + } + + sub getsetup () { + return + plugin => { + safe => 1, + rebuild => 1, # format plugin + section => "format", + }, + } + + sub htmlize (@) { + #set up variables and fill with defaults + my %params=@_; + my $content = $params{content}; + my @lines = split(/\r\n|\r|\n/, $content); + my @chunk; + my @formatted; + my $current_line = shift(@lines); + my $current_command = ""; + my $current_chunk = ""; + + while (scalar(@lines) > 0) { + until ( &dot_command($current_line) || scalar(@lines) == 0 ) { + #skip spaces; mark bad lines + unless ( &blank_line($current_line) ) { + push(@formatted, "<br />"); + push(@formatted, &no_command($current_line)); + } + $current_line = shift(@lines); + } + + #Exit while loop if we're out of lines + last if (scalar(@lines) == 0); + + #set command for chunk + $current_command = $current_line; + $current_line = shift(@lines); + + #get chunk, i.e. all text up to next blank line or a dot command. + until (substr($current_line,0,1) eq '.' || $current_line =~ m// || $current_line =~ m/^\s*$/) { + push(@chunk,$current_line); + $current_line = shift(@lines); + last unless defined $current_line; + } + + #Start with a blank line unless unneeded. + if (scalar(@formatted) > 0 ) { + push(@formatted, "<br />"); + } + + #remaining lines are not commands. + if (scalar(@chunk)) { + $current_chunk = shift(@chunk); + if ($current_command eq ".shot") { + push(@formatted, &indent(&chunk(uc($current_chunk),57),17)); + while (scalar(@chunk)) { + $current_chunk = shift(@chunk); + push(@formatted, "<br />"); + push(@formatted, &indent(&chunk($current_chunk,57),17)); + } + + } elsif ($current_command eq ".note") { + push(@formatted, "NOTE:<br />"); + push(@formatted, &chunk($current_chunk,75)); + while (scalar(@chunk)) { + $current_chunk = shift(@chunk); + push(@formatted, "<br />"); + push(@formatted, &chunk($current_chunk,75)); + } + + } elsif ($current_command eq ".dir") { + push(@formatted, &indent(&chunk($current_chunk,57),17)); + while (scalar(@chunk)) { + $current_chunk = shift(@chunk); + push(@formatted, "<br />"); + push(@formatted, &indent(&chunk($current_chunk,57),17)); + } + + } elsif ($current_command eq ".d") { + push(@formatted, &indent(&chunk(uc($current_chunk),32),41)); + $current_chunk = shift(@chunk); + push(@formatted, &indent(&chunk($current_chunk,34),27)); + while (scalar(@chunk) / 2 >= 1 ) { + $current_chunk = shift(@chunk); + push(@formatted, &indent(&chunk(&pd($current_chunk),19),34)); + $current_chunk = shift(@chunk); + push(@formatted, &indent(&chunk($current_chunk,34),27)); + } + + } elsif ($current_command eq ".pd") { + push(@formatted, &indent(&chunk(uc($current_chunk),32),41)); + $current_chunk = shift(@chunk); + push(@formatted, &indent(&chunk(&pd($current_chunk),19),34)); + $current_chunk = shift(@chunk); + push(@formatted, &indent(&chunk($current_chunk,34),27)); + while (scalar(@chunk) / 2 >= 1 ) { + $current_chunk = shift(@chunk); + push(@formatted, &indent(&chunk(&pd($current_chunk),19),34)); + $current_chunk = shift(@chunk); + push(@formatted, &indent(&chunk($current_chunk,34),27)); + } + + } elsif 
($current_command =~ m/^\.(fi|fo|ct|hct|qct|tct|mct|dt|rdt)$/) { + if ($current_command eq ".fi") { + push(@formatted, &indent(&chunk(uc("FADE IN:"),20),17)); + } elsif ($current_command eq ".fo") { + push(@formatted, &indent(&chunk(uc("FADE OUT:"),20),60)); + } elsif ($current_command eq ".ct") { + push(@formatted, &indent(&chunk(uc("CUT TO:"),20),60)); + } elsif ($current_command eq ".hct") { + push(@formatted, &indent(&chunk(uc("HARD CUT TO:"),20),60)); + } elsif ($current_command eq ".qct") { + push(@formatted, &indent(&chunk(uc("QUICK CUT TO:"),20),60)); + } elsif ($current_command eq ".tct") { + push(@formatted, &indent(&chunk(uc("TIME CUT TO:"),20),60)); + } elsif ($current_command eq ".mct") { + push(@formatted, &indent(&chunk(uc("MATCH CUT TO:"),20),60)); + } elsif ($current_command eq ".dt") { + push(@formatted, &indent(&chunk(uc("DISSOLVE TO:"),20),60)); + } elsif ($current_command eq ".rdt") { + push(@formatted, &indent(&chunk(uc("RIPPLE DISSOLVE TO:"),20),60)); + } elsif ($current_command eq ".wt") { + push(@formatted, &indent(&chunk(uc("WIPE TO:"),20),60)); + } + push(@formatted, &indent(&chunk(uc($current_chunk),57),17)); + while (scalar(@chunk)) { + $current_chunk = shift(@chunk); + push(@formatted, "<br />"); + push(@formatted, &indent(&chunk($current_chunk,57),17)); + } + + } + #mark the rest of the chunk as 'no command' + if (scalar(@chunk)) { + $current_chunk = shift(@chunk); + push(@formatted, &no_command($current_chunk)); + } + + } + } + my @content; + my $i = 0; + $current_line = ""; + while (scalar(@formatted)) { + $i++; + $current_line = shift(@formatted); + if ( $i % 60 == 0 ) { + push(@content, &indent($i/60 . ".<br />",72) ); + } + push(@content, $current_line); + } + $content = join("\r\n",@content); + return $content; + } + + sub blank_line { + my $line = shift(@_); + my $ret = 0; + + if ($line =~ m// || $line =~ m/^\s*$/) { + $ret = 1; + } else { + $ret = 0; + } + + return $ret; + } + + sub chunk () { + my $unchunked = shift(@_); + my $columns = shift(@_); + my $text = new Text::Format; + $text->rightFill(1); + $text->columns($columns); + $text->firstIndent(0); + $text->tabstop(0); + $text->extraSpace(1); + my @chunked = split /\n/, $text->format($unchunked); + my @formatted; + foreach (@chunked) { + push(@formatted, $_ . "<br />"); + } + return @formatted; + } + + sub dot_command { + my $line = shift(@_); + my $ret = 0; + + if ($line =~ m/^\.(ct|dir|dt|d|fi|fo|hct|mct|note|pd|qct|rdt|shot|tct)$/) { + $ret = 1; + } else { + $ret = 0; + } + + return $ret; + } + + sub indent () { + my @unindented = @_; + my $spaces = pop @unindented; + my @indented; + foreach (@unindented) { + push(@indented, " " x $spaces . $_); + } + return @indented; + } + + sub no_command () { + my $line = shift(@_); + my $text = new Text::Format; + $text->rightFill(1); + $text->columns(68); + $text->firstIndent(0); + $text->tabstop(0); + $text->extraSpace(1); + my @chunked = split /\n/, $text->format($line); + my @formatted; + push(@formatted, ("NO COMMAND: ")); + foreach (@chunked) { + push(@formatted, ( $_ . "<br />" )); + } + return @formatted; + } + + sub pd () { + my @chunk = @_; + # add '(' to top item + my $line = "(" . shift(@chunk); + unshift(@chunk, $line); + + # add ')' to bottom item + $line = pop(@chunk) . 
")"; + push(@chunk, $line); + + return @chunk; + } + + 1 + diff --git a/doc/plugins/contrib/texinfo.mdwn b/doc/plugins/contrib/texinfo.mdwn index 595bd27aa..a2769166d 100644 --- a/doc/plugins/contrib/texinfo.mdwn +++ b/doc/plugins/contrib/texinfo.mdwn @@ -8,7 +8,7 @@ This plugin is not neccessarily meant to enable people to write arbitrary wiki pages in the Texinfo format (even though that is possible, of course), but rather to ease collaboration on existing Texinfo documents. -The plugin is available at <http://www.schwinge.homeip.net/~thomas/tmp/texinfo.pm>. +The plugin is available at <http://schwinge.homeip.net/~thomas/tmp/texinfo.pm>. It's very basic at the moment, but will be improved over time. diff --git a/doc/plugins/contrib/tracking.mdwn b/doc/plugins/contrib/tracking.mdwn new file mode 100644 index 000000000..06d4120cd --- /dev/null +++ b/doc/plugins/contrib/tracking.mdwn @@ -0,0 +1,30 @@ +[[!template id=plugin name=tracking author="[[BerndZeimetz]]"]] +[[!toc]] +[[!tag plugins]] [[!tag patch]] [[!tag wishlist]] + +## NAME + +IkiWiki::Plugin::tracking - enable google/piwik visitor tracking + +## SYNOPSIS + + # activate the plugin + add_plugins => [qw{goodstuff tracking ....}], + + # to use Piwik: + piwik_id => '1', + piwik_https_url => "https://ssl.example.com/piwik/", + piwik_http_url => "http://www.example.com/piwik/", + + # to use Google Analytics: + google_analytics_id => "UA-xxxxxx-x" + +## DESCRIPTION + +This plugin includes the necessary tracking codes for Piwik and/or Google Analytics on all pages. Tracking codes will only be included if the necessary config options are set. The plugin could be enhanced to support goals/profiles and similar things, but I do not plan to do so. + +## DOWNLOAD + +* single files: [tracking.pm](http://git.recluse.de/?p=users/bzed/ikiwiki.git;a=blob;f=IkiWiki/Plugin/tracking.pm;hb=refs/heads/tracking) [piwik.tmpl](http://git.recluse.de/?p=users/bzed/ikiwiki.git;a=blob;f=templates/piwik.tmpl;hb=refs/heads/tracking) [google_analytics.tmpl](http://git.recluse.de/?p=users/bzed/ikiwiki.git;a=blob;f=templates/google_analytics.tmpl;hb=refs/heads/tracking) +* browse repository: <http://git.recluse.de/?p=users/bzed/ikiwiki.git;a=shortlog;h=refs/heads/tracking> +* git repo: `git://git.recluse.de/users/bzed/ikiwiki.git` or <http://git.recluse.de/repos/users/bzed/ikiwiki.git> (Use the tracking branch) diff --git a/doc/plugins/contrib/unixauth.mdwn b/doc/plugins/contrib/unixauth.mdwn index 137195139..c97312b59 100644 --- a/doc/plugins/contrib/unixauth.mdwn +++ b/doc/plugins/contrib/unixauth.mdwn @@ -1,7 +1,7 @@ [[!template id=plugin name=unixauth core=0 author="[[schmonz]]"]] [[!tag type/auth]] -[[!template id=gitbranch branch=schmonz author="[[schmonz]]"]] +[[!template id=gitbranch branch=unixauth author="[[schmonz]]"]] This plugin authenticates users against the Unix user database. It presents a similar UI to [[plugins/passwordauth]], but simpler, as there's no need to be able to register or change one's password. 
diff --git a/doc/plugins/contrib/xslt.mdwn b/doc/plugins/contrib/xslt.mdwn new file mode 100644 index 000000000..80c956c58 --- /dev/null +++ b/doc/plugins/contrib/xslt.mdwn @@ -0,0 +1,39 @@ +[[!template id=plugin name=xslt author="[[rubykat]]"]] +[[!tag type/chrome]] +## NAME + +IkiWiki::Plugin::xslt - ikiwiki directive to process an XML file with XSLT + +## SYNOPSIS + +\[[!xslt file="data1.xml" stylesheet="style1.xsl"]] + +## DESCRIPTION + +IkiWiki::Plugin::xslt is an IkiWiki plugin implementing a directive +to process an input XML data file with XSLT, and output the result in +the page where the directive was called. + +It is expected that the XSLT stylesheet will output valid HTML markup. + +## OPTIONS + +There are two arguments to this directive. + +* **file:** + The file which contains XML data to be processed. This file *must* have a `.xml` extension (`filename.xml`). This file is searched for using the usual IkiWiki mechanism, thus finding the file first in the same directory as the page, then in the directory above, and so on. + +* **stylesheet:** + The file which contains XSLT stylesheet to apply to the XML data. This file *must* have a `.xsl` extension (`filename.xsl`). This file is searched for using the usual IkiWiki mechanism, thus finding the file first in the same directory as the page, then in the directory above, and so on. + +## PREREQUISITES + + IkiWiki + XML::LibXML + XML::LibXSLT + +## DOWNLOAD + +* browse at GitHub: <http://github.com/rubykat/ikiplugins/blob/master/IkiWiki/Plugin/xslt.pm> +* git repo at git://github.com/rubykat/ikiplugins.git + diff --git a/doc/plugins/contrib/xslt/discussion.mdwn b/doc/plugins/contrib/xslt/discussion.mdwn new file mode 100644 index 000000000..72cce083c --- /dev/null +++ b/doc/plugins/contrib/xslt/discussion.mdwn @@ -0,0 +1,49 @@ +## security + +I'm curious what the security implications of having this plugin on a +publically writable wiki are. + +First, it looks like the way it looks up the stylesheet file will happily +use a regular .mdwn wiki page as the stylsheet. Which means any user can +create a stylesheet and have it be used, without needing permission to +upload arbitrary files. That probably needs to be fixed; one way would be +to mandate that the `srcfile` has a `.xsl` extension. + +Secondly, if an attacker is able to upload a stylesheet file somehow, could +this be used to attack the server where it is built? I know that xslt is +really a full programming language, so I assume at least DOS attacks are +possible. Can it also read other arbitrary files, run other programs, etc? +--[[Joey]] + +> For the first point, agreed. It should probably check that the data file has a `.xml` extension also. Have now fixed. + +> For the second point, I think the main concern would be resource usage. XSLT is a pretty limited language; it can read other XML files, but it can't run other programs so far as I know. + +> -- [[KathrynAndersen]] + +>> XSLT is, indeed, a Turing-complete programming language. + However, [XML::LibXSLT][] provides a set of functions to help + to minimize the damage that may be caused by running a random + program. + +>> In particular, `max_depth ()` allows for the maximum + recursion depth to be set, while + `read_file ()`, `write_file ()`, `create_dir ()`, + `read_net ()` and `write_net ()` + are the callbacks that allow any of the possible file + operations to be denied. 
+ +>> To be honest, I'd prefer for the `read_file ()` callback to + only grant access to the files below the Ikiwiki source + directory, and for all the `write_`… and + …`_net` callbacks to deny the access unconditionally. + +>> One more wishlist item: allow the set of locations to take + `.xsl` files from to be preconfigured, so that, e. g., + one could allow (preasumably trusted) system stylesheets, + while disallowing any stylesheets that are placed on the Wiki + itself. + +>> — Ivan Shmakov, 2010-03-28Z. + +[XML::LibXSLT]: http://search.cpan.org/~PAJAS/XML-LibXSLT/LibXSLT.pm diff --git a/doc/plugins/contrib/ymlfront.mdwn b/doc/plugins/contrib/ymlfront.mdwn new file mode 100644 index 000000000..2805be04f --- /dev/null +++ b/doc/plugins/contrib/ymlfront.mdwn @@ -0,0 +1,143 @@ +[[!template id=plugin name=ymlfront author="[[rubykat]]"]] +[[!tag type/meta]] +[[!toc]] +## NAME + +IkiWiki::Plugin::ymlfront - add YAML-format data to a page + +## SYNOPSIS + + # activate the plugin + add_plugins => [qw{goodstuff ymlfront ....}], + + # configure the plugin + ymlfront_delim => [qw(--YAML-- --YAML--)], + +## DESCRIPTION + +This plugin provides a way of adding arbitrary meta-data (data fields) to any +page by prefixing the page with a YAML-format document. This also provides +the [[ikiwiki/directive/ymlfront]] directive, which enables one to put +YAML-formatted data inside a standard IkiWiki [[ikiwiki/directive]]. + +This is a way to create per-page structured data, where each page is +treated like a record, and the structured data are fields in that record. This +can include the meta-data for that page, such as the page title. + +This plugin is meant to be used in conjunction with the [[field]] plugin. + +## DETAILS + +There are three formats for adding YAML data to a page. These formats +should not be mixed - the result is undefined. + +1. ymlfront directive + + See [[ikiwiki/directive/ymlfront]] for more information. + +2. default YAML-compatible delimiter + + By default, the YAML-format data in a page is placed at the start of + the page and delimited by lines containing precisely three dashes. + This is what YAML itself uses to delimit multiple documents. + The "normal" content of the page then follows. + + For example: + + --- + title: Foo does not work + Urgency: High + Status: Assigned + AssignedTo: Fred Nurk + Version: 1.2.3 + --- + When running on the Sprongle system, the Foo function returns incorrect data. + + What will normally be displayed is everything following the second line of dashes. That will be htmlized using the page-type of the page-file. + +3. user-defined delimiter + + Instead of using the default "---" delimiter, the user can define, + in the configuration file, the **ymlfront_delim** value, which is an + array containing two strings. The first string defines the markup for + the start of the YAML data, and the second string defines the markip + for the end of the YAML data. These two strings can be the same, or + they can be different. In this case, the YAML data section is not + required to be at the start of the page, but as with the default, it + is expected that only one data section will be on the page. + + For example: + + --YAML-- + title: Foo does not work + Urgency: High + Status: Assigned + AssignedTo: Fred Nurk + Version: 1.2.3 + --YAML-- + When running on the Sprongle system, the Foo function returns incorrect data. + + What will normally be displayed is everything outside the delimiters, + both before and after. 
That will be htmlized using the page-type of the page-file. + +### Accessing the Data + +There are a few ways to access the given YAML data. + +* [[getfield]] plugin + + The **getfield** plugin can display the data as individual variable values. + + For example: + + --- + title: Foo does not work + Urgency: High + Status: Assigned + AssignedTo: Fred Nurk + Version: 1.2.3 + --- + # {{$title}} + + **Urgency:** {{$Urgency}}\\ + **Status:** {{$Status}}\\ + **Assigned To:** {{$AssignedTo}}\\ + **Version:** {{$Version}} + + When running on the Sprongle system, the Foo function returns incorrect data. + +* [[ftemplate]] plugin + + The **ftemplate** plugin is like the [[plugins/template]] plugin, but it is also aware of [[field]] values. + + For example: + + --- + title: Foo does not work + Urgency: High + Status: Assigned + AssignedTo: Fred Nurk + Version: 1.2.3 + --- + \[[!ftemplate id="bug_display_template"]] + + When running on the Sprongle system, the Foo function returns incorrect data. + +* [[report]] plugin + + The **report** plugin is like the [[ftemplate]] plugin, but it reports on multiple pages, rather than just the current page. + +* write your own plugin + + In conjunction with the [[field]] plugin, you can write your own plugin to access the data. + +## PREREQUISITES + + IkiWiki + IkiWiki::Plugin::field + YAML::Any + +## DOWNLOAD + +* browse at GitHub: <http://github.com/rubykat/ikiplugins/blob/master/IkiWiki/Plugin/ymlfront.pm> +* git repo at git://github.com/rubykat/ikiplugins.git diff --git a/doc/plugins/contrib/ymlfront/discussion.mdwn b/doc/plugins/contrib/ymlfront/discussion.mdwn new file mode 100644 index 000000000..b122294bb --- /dev/null +++ b/doc/plugins/contrib/ymlfront/discussion.mdwn @@ -0,0 +1,31 @@ +Now that I have implemented a \[[!ymlfront ...]] directive, I would like to remove support for the old "---" delimited format, because + +* it is fragile (easily breakable) +* it is non-standard + +Any objections? + +> Well, I don't have much standing since I have been too lame to integrate +> ymlfront into ikiwiki yet. Buy, my opinion is, I liked the old +> format of putting the YAML literally at the front of the file. It +> seemed to allow parsing the file as YAML, using any arbitrary YAML +> processer. And it was nice how it avoided boilerplate. --[[Joey]] + +>> The old delimited format also has the advantage of being remarkably similar to the +>> [MultiMarkDown](http://fletcherpenney.net/multimarkdown/users_guide/multimarkdown_syntax_guide/) +>> way of including metadata in documents. The only difference is that MMD doesn't expect the +>> triple-dash separators, but I'm thinking about submitting a patch to MMD to actually support +>> that syntax. --GB + +>>> Yes, the idea was to allow the file to be parsed as YAML, you're right. I just found that I tended to have problems when people used "---" for horizontal rules. However, I have also found that trying to keep it solely as an IkiWiki directive doesn't work either, since sometimes the meta-data I need also contained "]]" which broke the parsing of the directive. +>>> So I have decided to go for a compromise, and make the delimiter configurable, rather than hardcoded as "---"; the triple-dash is the default, but it can be configured to be something else instead. I haven't pushed the change yet, but I have written it, and it seems to work. 
-- [[KathrynAndersen]] + +>>>> I'm not sure about what kind of problems you're meeting with "---" being used +>>>> for horizontal rules: isn't it sufficient to just check that (1) the triple-dash +>>>> is the first thing in the page and (2) there are only YAML-style assignments +>>>> (and no blank lines) between the two markers? Check #2 would also be enough to +>>>> support MMD-style metadata, which means (a) no start marker and (b) empty line +>>>> to mark the end of the metadata block. Would this be supported by the plugin? +>>>> --GB + +>>>>> Since I allow all legal YAML, the only way to check if it is legal YAML is to use the YAML parser, by which time one is already parsing the YAML, so it seems a bit pointless to check before one does so. -- KA diff --git a/doc/plugins/creole/discussion.mdwn b/doc/plugins/creole/discussion.mdwn index 38ee2bd78..7f47c2c97 100644 --- a/doc/plugins/creole/discussion.mdwn +++ b/doc/plugins/creole/discussion.mdwn @@ -12,4 +12,11 @@ I've installed Text::WikiCreole 0.05 and enabled the plugin, but I get an error >>> forgot, done now --[[Joey]] +--- +## External Links + I'm moving over a really stinkingly old UseMod and creole seems the nearest match. I've worked out that Bare /Subpage links need to become \[\[Subpage\]\], and Top/Sub links need to be \[\[Top/Sub\]\] (or \[\[Top/Sub|Top/Sub\]\], to display in exactly the same way), but I'm stuck on generic hyperlinks. The creole cheat sheet says I should be able to do \[\[http://url.path/foo|LinkText\]\], but that comes out as a link to create the "linktext" page, and Markdown-style \[Link Text\](http://url.path/foo) just gets rendered as is. Any suggestions? --[[schmonz]] + +> Was this problem ever solved? -- Thiana + +>> Not by me. If I were looking at the problem now, with fresh eyes, I'd probably bite the bullet and just convert everything to Markdown. --[[schmonz]] diff --git a/doc/plugins/cutpaste.mdwn b/doc/plugins/cutpaste.mdwn index f74f8a269..ea3665c44 100644 --- a/doc/plugins/cutpaste.mdwn +++ b/doc/plugins/cutpaste.mdwn @@ -1,5 +1,5 @@ [[!template id=plugin name=cutpaste author="[[Enrico]]"]] -[[!tag type/chrome]] +[[!tag type/widget]] This plugin provides the [[ikiwiki/directive/cut]], [[ikiwiki/directive/copy]] and [[ikiwiki/directive/paste]] diff --git a/doc/plugins/date.mdwn b/doc/plugins/date.mdwn new file mode 100644 index 000000000..2a33f014c --- /dev/null +++ b/doc/plugins/date.mdwn @@ -0,0 +1,6 @@ +[[!template id=plugin name=date author="[[Joey]]"]] +[[!tag type/widget]] + +This plugin provides the [[ikiwiki/directive/date]] +[[ikiwiki/directive]], which provides a way to display an arbitrary date +in a page. diff --git a/doc/plugins/ddate.mdwn b/doc/plugins/ddate.mdwn index 741606a6e..17bb16cff 100644 --- a/doc/plugins/ddate.mdwn +++ b/doc/plugins/ddate.mdwn @@ -1,6 +1,7 @@ [[!template id=plugin name=ddate author="[[Joey]]"]] [[!tag type/fun]] [[!tag type/date]] +[[!tag type/chrome]] Enables use of Discordian dates. `--timeformat` can be used to change the date format; see `ddate(1)`. diff --git a/doc/plugins/discussion.mdwn b/doc/plugins/discussion.mdwn index 854307a98..d47fa4718 100644 --- a/doc/plugins/discussion.mdwn +++ b/doc/plugins/discussion.mdwn @@ -34,3 +34,9 @@ Any objections to listing plugins alphabetically rather than by creation date? >> "recently changed" list with the 10 most recently changed plugins >> at the top. That would allow what you suggested, but still allow >> the main list to be alphabetical. 
-- [[Will]] + +### `themes.pm` instead of `themes.mdwn` + +Could someone please change the filename. I cannot fix this using the Web interface. Somebody step in please. --[[PaulePanter]] + +> Oops, not the first time I've made that mistake! --[[Joey]] diff --git a/doc/plugins/editpage.mdwn b/doc/plugins/editpage.mdwn index b830e51aa..346ee7c78 100644 --- a/doc/plugins/editpage.mdwn +++ b/doc/plugins/editpage.mdwn @@ -1,4 +1,5 @@ [[!template id=plugin name=editpage core=1 author="[[Joey]]"]] +[[!tag type/web]] This plugin allows editing wiki pages in the web interface. It's enabled by default if [[cgi]] is enabled; disable it if you want cgi for other things diff --git a/doc/plugins/edittemplate.mdwn b/doc/plugins/edittemplate.mdwn index 85dfdfc2d..c19ecd858 100644 --- a/doc/plugins/edittemplate.mdwn +++ b/doc/plugins/edittemplate.mdwn @@ -2,5 +2,5 @@ [[!tag type/web]] This plugin provides the [[ikiwiki/directive/edittemplate]] [[ikiwiki/directive]]. -This directive allows registering template pages, that provide default -content for new pages created using the web frontend. +This directive allows registering [[template|templates]] pages, that +provide default content for new pages created using the web frontend. diff --git a/doc/plugins/filecheck.mdwn b/doc/plugins/filecheck.mdwn index f4563d58e..b038bc433 100644 --- a/doc/plugins/filecheck.mdwn +++ b/doc/plugins/filecheck.mdwn @@ -1,5 +1,5 @@ [[!template id=plugin name=filecheck core=0 author="[[Joey]]"]] -[[!tag type/useful]] +[[!tag type/special-purpose]] This plugin enhances the regular [[ikiwiki/PageSpec]] syntax with some additional tests, for things like file size, mime type, and virus @@ -7,7 +7,8 @@ status. These tests are mostly useful for the [[attachment]] plugin, and are documented [[here|ikiwiki/pagespec/attachment]]. This plugin will use the [[!cpan File::MimeInfo::Magic]] perl module, if -available, for mimetype checking. +available, for mimetype checking. It falls back to using the `file` command +if necessary for hard to detect files. The `virusfree` [[PageSpec|ikiwiki/pagespec/attachment]] requires that ikiwiki be configured with a virus scanner program via the `virus_checker` diff --git a/doc/plugins/filecheck/discussion.mdwn b/doc/plugins/filecheck/discussion.mdwn index f91950b7d..f3f3c4ffd 100644 --- a/doc/plugins/filecheck/discussion.mdwn +++ b/doc/plugins/filecheck/discussion.mdwn @@ -15,3 +15,71 @@ if ::magic() returns undef? --[[DavidBremner]] >> for ::default >>> Applied + +--- + +At first I need to thank you for ikiwiki - it is what I was always looking +for - coming from a whole bunch of wiki engines, this is the most +intelligent and least bloated one. + +My question is about the [[plugins/attachment]] plugin in conjunction with +[[plugins/filecheck]]: I am using soundmanger2 js-library for having +attached media files of all sorts played inline a page. + +To achieve this soundmanager2 asks for an id inside a ul-tag surrounding +the a-tag. I was wondering if the Insert Link button could be provided with +a more elegant solution than to have this code snippet to be filled in by +hand every time you use it to insert links for attached media files. And in +fact there apparently is a way in attachment.pm. + +While I can see that it is not needed for everyone inserting links to +attached media files to have ul- and li-tags surrounding the link itself as +well as being supplied with an id fill in, for me it would be the most +straight forward solution. 
Pitty is I don't have the time to wrap my head +around perl to write a patch myself. Is there any way to have this made an +option which can be called via templates? + +For sure I would like to donate for such a patch as well as I will do it +for ikiwiki anyway, because it is such a fine application. + +If you are not familiar with soundmanager2: It is a very straight forward +solution to inline mediafiles, using the usual flash as well as html5 +solutions (used by soundcloud.com, freesound.org and the like). Worth a +look anyway [schillmania.com](http://www.schillmania.com/) + +Boris + +> The behavior of "Insert Links" is currently hardcoded to support images +> and has a fallback for other files. What you want is a +> [[todo/generic_insert_links]] that can insert a template directive. +> Then you could make a template that generates the html needed for +> soundmanager2. I've written down a design at +> [[todo/generic_insert_links]]; I am currently very busy and not sure +> when I will get around to writing it, but with it on the todo list +> I shouldn't forget. --[[Joey]] +> +> You could make a [[ikiwiki/directive/template]] for soundmanager2 +> now, and manually insert the template directive for now +> when you want to embed a sound file. Something like this: + + \[[!template id=embed_mp3 file=your.mp3]] + +> Then in templates/embed_mp3.mdwn, something vaguely like this: + + <ul id="foo"> + <a href="<TMPL_VAR FILE>">mp3</a> + </ul> + +>> Thanks a lot - looking forward to [[todo/generic_insert_links]] - I am using the [[ikiwiki/directive/template]] variant also adding a name vaiable, it looks like this and is working fine: + + <ul class="playlist"> + <li> + <a href="<TMPL_VAR FILE>"><TMPL_VAR NAME></a> + </li> + </ul> + +>> Calling it: + + \[[!template id=embedmedia.tmpl file=../Tinas_Gonna_Have_A_Baby.mp3 name="Tina's Gonna Have A Baby" ]] + +>> BTW your Flattr button doesn't seem to work properly - or it is Flattr itself that doesn't- clicking it won't let ikiwiki show up on my Dashboard. diff --git a/doc/plugins/flattr.mdwn b/doc/plugins/flattr.mdwn new file mode 100644 index 000000000..5da279518 --- /dev/null +++ b/doc/plugins/flattr.mdwn @@ -0,0 +1,9 @@ +[[!template id=plugin name=flattr author="[[Joey]]"]] +[[!tag type/web]] + +[Flattr](http://flattr.com/) is a social micropayment platform. +This plugin allows easily adding Flattr buttons to pages, +using the [[ikiwiki/directive/flattr]] directive. + +This plugin has a configuration setting. `flattr_userid` can be set +to either your numeric flatter userid, or your flattr username. diff --git a/doc/plugins/format.mdwn b/doc/plugins/format.mdwn index 91e707fcf..b41d365aa 100644 --- a/doc/plugins/format.mdwn +++ b/doc/plugins/format.mdwn @@ -1,5 +1,5 @@ [[!template id=plugin name=format core=0 author="[[Joey]]"]] -[[!tag type/format]] +[[!tag type/widget]] This plugin allows mixing different page formats together, by embedding text formatted one way inside a page formatted another way. This is done diff --git a/doc/plugins/fortune.mdwn b/doc/plugins/fortune.mdwn index 9966f456d..3cb125ac1 100644 --- a/doc/plugins/fortune.mdwn +++ b/doc/plugins/fortune.mdwn @@ -1,5 +1,6 @@ [[!template id=plugin name=fortune author="[[Joey]]"]] [[!tag type/fun]] +[[!tag type/widget]] This plugin implements the [[ikiwiki/directive/fortune]] [[ikiwiki/directive]]. This directive uses the `fortune` program to insert a fortune into the page. 
diff --git a/doc/plugins/getsource.mdwn b/doc/plugins/getsource.mdwn index 20040ccee..d5404a628 100644 --- a/doc/plugins/getsource.mdwn +++ b/doc/plugins/getsource.mdwn @@ -1,4 +1,5 @@ [[!template id=plugin name=getsource author="[[Will_Uther|Will]]"]] +[[!tag type/web]] This plugin adds a "Source" link to the top of each page that uses the CGI to display the page's source. diff --git a/doc/plugins/getsource/discussion.mdwn b/doc/plugins/getsource/discussion.mdwn new file mode 100644 index 000000000..3e985948b --- /dev/null +++ b/doc/plugins/getsource/discussion.mdwn @@ -0,0 +1,3 @@ +It would be very cool if this plugin was enabled by default. One of the best ways to learn how to do various advanced things is to be able to "view source" on other wiki's which do things you like. -- [[AdamShand]] + +This plugin requires the cgi plugin. If you run a static site, you may check the [[repolist]] plugin. -- [[weakish]] diff --git a/doc/plugins/goodstuff/discussion.mdwn b/doc/plugins/goodstuff/discussion.mdwn new file mode 100644 index 000000000..4ccea4ad4 --- /dev/null +++ b/doc/plugins/goodstuff/discussion.mdwn @@ -0,0 +1,8 @@ +### What is the syntax for enabling plugins in the setup file? + +Here is an example snippet from a working setup file: + + <pre> + # plugins to add to the default configuration + add_plugins => ['goodstuff'], +</pre> diff --git a/doc/plugins/google.mdwn b/doc/plugins/google.mdwn index 7c61e637b..349c278ee 100644 --- a/doc/plugins/google.mdwn +++ b/doc/plugins/google.mdwn @@ -5,8 +5,7 @@ This plugin adds a search form to the wiki, using google's site search. Google is asked to search for pages in the domain specified in the wiki's `url` configuration parameter. Results will depend on whether google has -indexed the site, and how recently. Also, if the same domain has other -content, outside the wiki's content, it will be searched as well. +indexed the site, and how recently. The [[search]] plugin offers full text search of only the wiki, but requires that a search engine be installed on your site. diff --git a/doc/plugins/google/discussion.mdwn b/doc/plugins/google/discussion.mdwn index babc919d2..e664f5723 100644 --- a/doc/plugins/google/discussion.mdwn +++ b/doc/plugins/google/discussion.mdwn @@ -4,3 +4,22 @@ This is not very good since the default ikiwiki templates produce XHTML instead of HTML. > Fixed, thanks for the patch! --[[Joey]] + +It works to pass the whole wiki baseurl to Google, not just the +domain, and appears to be legal. I've got a wiki that'd benefit +(it's a few directories down from the root). Can the plugin be +tweaked to do this? --[[schmonz]] + +> Done. --[[Joey]] + +The main page said: + +> Also, if the same domain has other content, outside the wiki's +> content, it will be searched as well. + +Is it still true now? (Or this statement is out of date?) --[weakish] + +[weakish]: http://weakish.pigro.net + +> I checked, and it's never been true; google is given the url to the top +> of the wiki and only searches things in there. --[[Joey]] diff --git a/doc/plugins/goto.mdwn b/doc/plugins/goto.mdwn index 9c401c5d2..8e1de7a10 100644 --- a/doc/plugins/goto.mdwn +++ b/doc/plugins/goto.mdwn @@ -1,5 +1,5 @@ [[!template id=plugin name=goto author="[[Simon_McVittie|smcv]]"]] -[[!tag type/useful]] +[[!tag type/web]] This plugin adds a `do=goto` mode for the IkiWiki CGI script. 
It's mainly for internal use by the [[404]], [[comments]] and [[recentchanges]] diff --git a/doc/plugins/graphviz.mdwn b/doc/plugins/graphviz.mdwn index b89f16b59..d57d7dc94 100644 --- a/doc/plugins/graphviz.mdwn +++ b/doc/plugins/graphviz.mdwn @@ -1,5 +1,5 @@ [[!template id=plugin name=graphviz author="[[JoshTriplett]]"]] -[[!tag type/chrome type/format]] +[[!tag type/widget]] This plugin provides the [[ikiwiki/directive/graph]] [[ikiwiki/directive]]. This directive allows embedding [graphviz](http://www.graphviz.org/) graphs in a @@ -22,4 +22,4 @@ Some example graphs: [[!graph src="a -- b -- c -- a;" prog="circo" type="graph"]] """]] -This plugin uses the [[!cpan Digest::SHA1]] perl module. +This plugin uses the [[!cpan Digest::SHA]] perl module. diff --git a/doc/plugins/haiku.mdwn b/doc/plugins/haiku.mdwn index 74eac1c29..448733d95 100644 --- a/doc/plugins/haiku.mdwn +++ b/doc/plugins/haiku.mdwn @@ -1,5 +1,6 @@ [[!template id=plugin name=haiku author="[[Joey]]"]] [[!tag type/fun]] +[[!tag type/widget]] This plugin provides a [[ikiwiki/directive/haiku]] [[ikiwiki/directive]]. The directive allows inserting a randomly generated haiku into a wiki page. diff --git a/doc/plugins/htmlscrubber.mdwn b/doc/plugins/htmlscrubber.mdwn index c59b46e14..080575c46 100644 --- a/doc/plugins/htmlscrubber.mdwn +++ b/doc/plugins/htmlscrubber.mdwn @@ -32,10 +32,11 @@ other HTML-related functionality, such as whether [[meta]] allows potentially unsafe HTML tags. The `htmlscrubber_skip` configuration setting can be used to skip scrubbing -of some pages. Set it to a [[ikiwiki/PageSpec]], such as "!*/Discussion", -and pages matching that can have all the evil CSS, JavsScript, and unsafe -html elements you like. One safe way to use this is to use [[lockedit]] to -lock those pages, so only admins can edit them. +of some pages. Set it to a [[ikiwiki/PageSpec]], such as +"posts/* and !comment(*) and !*/Discussion", and pages matching that can have +all the evil CSS, JavsScript, and unsafe html elements you like. One safe +way to use this is to use [[lockedit]] to lock those pages, so only admins +can edit them. ---- diff --git a/doc/plugins/httpauth.mdwn b/doc/plugins/httpauth.mdwn index 11ed223e7..0eda5554f 100644 --- a/doc/plugins/httpauth.mdwn +++ b/doc/plugins/httpauth.mdwn @@ -2,8 +2,34 @@ [[!tag type/auth]] This plugin allows HTTP basic authentication to be used to log into the -wiki. To use the plugin, your web server should be set up to perform HTTP -basic authentiation for at least the directory containing `ikiwiki.cgi`. -The authenticated user will be automatically signed into the wiki. +wiki. -This plugin is included in ikiwiki, but is not enabled by default. +## fully authenticated wiki + +One way to use the plugin is to configure your web server to require +HTTP basic authentication for any access to the directory containing the +wiki (and `ikiwiki.cgi`). The authenticated user will be automatically +signed into the wiki. This method is suitable only for private wikis. + +## separate cgiauthurl + +To use httpauth for a wiki where the content is public, and where +the `ikiwiki.cgi` needs to be usable without authentication (for searching, +or logging in using other methods, and so on), you can configure a separate +url that is used for authentication, via the `cgiauthurl` option in the setup +file. This url will then be redirected to when a user chooses to log in using +httpauth. + +A typical setup is to make an `auth` subdirectory, and symlink `ikiwiki.cgi` +into it. 
Then configure the web server to require authentication only for +access to the `auth` subdirectory. Then `cgiauthurl` is pointed at this +symlink. + +## using only httpauth for some pages + +If you want to only use httpauth for editing some pages, while allowing +other authentication methods to be used for other pages, you can +configure `httpauth_pagespec` in the setup file. This makes Edit +links on pages that match the [[ikiwiki/PageSpec]] automatically use +the `cgiauthurl`, and prevents matching pages from being edited by +users authentication via other methods. diff --git a/doc/plugins/img.mdwn b/doc/plugins/img.mdwn index 114438765..a6cd90f28 100644 --- a/doc/plugins/img.mdwn +++ b/doc/plugins/img.mdwn @@ -1,5 +1,5 @@ [[!template id=plugin name=img author="Christian Mock"]] -[[!tag type/chrome]] +[[!tag type/widget]] This plugin provides the [[ikiwiki/directive/img]] [[ikiwiki/directive]]. While ikiwiki supports inlining full-size images by making a diff --git a/doc/plugins/inline.mdwn b/doc/plugins/inline.mdwn index 6c3282576..3eb849fdb 100644 --- a/doc/plugins/inline.mdwn +++ b/doc/plugins/inline.mdwn @@ -1,4 +1,5 @@ [[!template id=plugin name=inline core=1 author="[[Joey]]"]] +[[!tag type/widget]] This plugin provides the [[ikiwiki/directive/inline]] [[ikiwiki/directive]], which allows including one wiki page diff --git a/doc/plugins/link.mdwn b/doc/plugins/link.mdwn index 6adbf3eae..7dfa50de4 100644 --- a/doc/plugins/link.mdwn +++ b/doc/plugins/link.mdwn @@ -1,4 +1,5 @@ [[!template id=plugin name=link core=1 author="[[Joey]]"]] [[!tag type/link]] -This plugin implements standard [[WikiLinks|ikiwiki/wikilink]]. +This plugin implements standard [[WikiLinks|ikiwiki/wikilink]] and links to +external pages. diff --git a/doc/plugins/linkmap.mdwn b/doc/plugins/linkmap.mdwn index 89cb9d8ae..7e51cd935 100644 --- a/doc/plugins/linkmap.mdwn +++ b/doc/plugins/linkmap.mdwn @@ -1,5 +1,6 @@ [[!template id=plugin name=linkmap author="[[Joey]]"]] [[!tag type/meta]] +[[!tag type/widget]] [[!tag type/slow]] This plugin provides the [[ikiwiki/directive/linkmap]] [[ikiwiki/directive]]. diff --git a/doc/plugins/listdirectives.mdwn b/doc/plugins/listdirectives.mdwn index 2d9bce01d..df854de52 100644 --- a/doc/plugins/listdirectives.mdwn +++ b/doc/plugins/listdirectives.mdwn @@ -1,5 +1,6 @@ [[!template id=plugin name=listdirectives author="Will"]] [[!tag type/meta]] +[[!tag type/widget]] This plugin provides the [[ikiwiki/directive/listdirectives]] [[ikiwiki/directive]], which inserts a list of currently available diff --git a/doc/plugins/localstyle.mdwn b/doc/plugins/localstyle.mdwn new file mode 100644 index 000000000..70a909d68 --- /dev/null +++ b/doc/plugins/localstyle.mdwn @@ -0,0 +1,12 @@ +[[!template id=plugin name=localstyle author="[[Joey]]"]] +[[!tag type/chrome]] + +This plugin allows styling different sections of a wiki using different +versions of the local.css [[CSS]] file. Normally this file is read from the +top level of the wiki, but with this plugin enabled, standard +[[ikiwiki/subpage/LinkingRules]] are used to find the closest local.css +file to each page. + +So, for example, to use different styling for page `foo`, as well as all +of its [[SubPages|ikiwiki/subpage]], such as `foo/bar`, create a +`foo/local.css`. diff --git a/doc/plugins/lockedit.mdwn b/doc/plugins/lockedit.mdwn index c8f64ea47..681163203 100644 --- a/doc/plugins/lockedit.mdwn +++ b/doc/plugins/lockedit.mdwn @@ -12,14 +12,9 @@ to lock. 
For example, you could choose to lock all pages created before 2006, or all pages that are linked to from the page named "locked". More usually though, you'll just list some names of pages to lock. -One handy thing to do if you're using ikiwiki for your blog is to lock -"* and !*/Discussion". This prevents others from adding to or modifying -posts in your blog, while still letting them comment via the Discussion -pages. - -Alternatively, if you're using the [[comments]] plugin, you can lock -"!postcomment(*)" to allow users to comment on pages, but not edit anything -else. +If you want to lock down a blog so only you can post to it, you can just +lock "*", and enable the [[opendiscussion]] plugin, so readers can still post +[[comments]]. Wiki administrators can always edit locked pages. The [[ikiwiki/PageSpec]] can specify that some pages are not locked for some users. For example, diff --git a/doc/plugins/lockedit/discussion.mdwn b/doc/plugins/lockedit/discussion.mdwn index b058b2b07..867fc6a51 100644 --- a/doc/plugins/lockedit/discussion.mdwn +++ b/doc/plugins/lockedit/discussion.mdwn @@ -1,21 +1,18 @@ -This plugin not only locks pages but ensures too a user is logged in. This seems to me redundant with signedit. I propose : +This plugin not only locks pages but ensures too a user is logged in. This +seems to me redundant with signedit. I propose [removing the if block that +calls needsignin ]. - sub canedit ($$) { - my $page=shift; - my $cgi=shift; - my $session=shift; - - my $user=$session->param("name"); - return undef if defined $user && IkiWiki::is_admin($user); - - if (defined $config{locked_pages} && length $config{locked_pages} && - pagespec_match($page, $config{locked_pages}, - user => $session->param("name"), - ip => $ENV{REMOTE_ADDR}, - )) { - return sprintf(gettext("%s is locked and cannot be edited"), - htmllink("", "", $page, noimageinline => 1)); - } - - return undef; - } +> That was added because the most typical reason for being unable to edit a +> page is that you are not logged in. And without the jump to logging the +> user in, there is no way for the user to log in, without navigating away +> from the page they were trying to edit. --[[Joey]] + +>> Ok, but the problem is that when you don't want any signin form you end up +>> with a lone login button. That might happend if you lock pages only on IP +>> adresses, if you use another cookie from another webapp... + +>> That happends to me and I had to reimplement lockedit in my private auth +>> plugin. + +>> Perhaps you could return undef on that case and let another plugin do the +>> needsignin call ? -- [[Jogo]] diff --git a/doc/plugins/map.mdwn b/doc/plugins/map.mdwn index 8f5a9f15e..b164d5ca8 100644 --- a/doc/plugins/map.mdwn +++ b/doc/plugins/map.mdwn @@ -1,5 +1,5 @@ [[!template id=plugin name=map author="Alessandro Dotti Contra"]] -[[!tag type/meta]] +[[!tag type/meta type/widget]] This plugin provides the [[ikiwiki/directive/map]] [[ikiwiki/directive]], which generates a hierarchical page map for the wiki. diff --git a/doc/plugins/map/discussion.mdwn b/doc/plugins/map/discussion.mdwn index 2f7b140d6..54c921b0f 100644 --- a/doc/plugins/map/discussion.mdwn +++ b/doc/plugins/map/discussion.mdwn @@ -1,7 +1,7 @@ I'm wanting a [[map]] (with indentation levels) showing page _titles_ instead of page 'names'. 
As far as I can see, this is not an option with existing plugins - I can get a list of pages using [[inline]] and -appropriate [[wikitemplates]], but that has no indentation and therefore +appropriate [[templates]], but that has no indentation and therefore doesn't show structure well. The quick way is to modify the map plugin to have a 'titles' option. The diff --git a/doc/plugins/mirrorlist.mdwn b/doc/plugins/mirrorlist.mdwn index b371e8eb7..aedc1f4a0 100644 --- a/doc/plugins/mirrorlist.mdwn +++ b/doc/plugins/mirrorlist.mdwn @@ -1,5 +1,5 @@ [[!template id=plugin name=mirror author="[[Joey]]"]] -[[!tag type/special-purpose]] +[[!tag type/web]] This plugin allows adding links a list of mirrors to each page in the wiki. For each mirror, a name and an url should be specified. Pages are diff --git a/doc/plugins/moderatedcomments.mdwn b/doc/plugins/moderatedcomments.mdwn new file mode 100644 index 000000000..f9466e833 --- /dev/null +++ b/doc/plugins/moderatedcomments.mdwn @@ -0,0 +1,12 @@ +[[!template id=plugin name=moderatedcomments author="[[Joey]]"]] +[[!tag type/auth]] + +This plugin causes [[comments]] to be held for manual moderation. +Admins can access the comment moderation queue via their preferences page. + +By default, all comments made by anyone who is not an admin will be held +for moderation. The `moderate_pagespec` setting can be used to specify a +[[ikiwiki/PageSpec]] to match comments and users who should be moderated. +For example, to avoid moderating comments from logged-in users, set +`moderate_pagespec` to "`!user(*)`". Or to moderate everyone except for +admins, set it to "`!admin(*)`". diff --git a/doc/plugins/more.mdwn b/doc/plugins/more.mdwn index e9a971289..a0664e843 100644 --- a/doc/plugins/more.mdwn +++ b/doc/plugins/more.mdwn @@ -1,5 +1,5 @@ [[!template id=plugin name=more author="Ben"]] -[[!tag type/format]] +[[!tag type/widget]] This plugin provides the [[ikiwiki/directive/more]] [[ikiwiki/directive]], which is a way to have a "more" link on a post in a blog, that leads to the diff --git a/doc/plugins/opendiscussion.mdwn b/doc/plugins/opendiscussion.mdwn index b2ba68bf7..3b5ab4858 100644 --- a/doc/plugins/opendiscussion.mdwn +++ b/doc/plugins/opendiscussion.mdwn @@ -1,5 +1,6 @@ [[!template id=plugin name=opendiscussion author="[[Joey]]"]] [[!tag type/auth]] -This plugin allows editing of Discussion pages by anonymous users who have -not logged into the wiki. +This plugin allows editing of Discussion pages, and posting of comments, +even when the [[lockedit]] plugin has been configured to otherwise prevent +editing. diff --git a/doc/plugins/openid.mdwn b/doc/plugins/openid.mdwn index 91fc7cddc..f3b3abfbb 100644 --- a/doc/plugins/openid.mdwn +++ b/doc/plugins/openid.mdwn @@ -11,17 +11,22 @@ The [[!cpan LWPx::ParanoidAgent]] perl module is used if available, for added security. Finally, the [[!cpan Crypt::SSLeay]] perl module is needed to support users entering "https" OpenID urls. -This plugin has a configuration option. You can set `--openidsignup` -to the url of a third-party site where users can sign up for an OpenID. If -it's set, the signin page will link to that site. - -This plugin supports the -[myopenid.com affiliate program](http://myopenid.com/affiliate_welcome), -which can be used to help users sign up for an OpenID and log into your -site in a single, unified process. When you create the affiliate, specify a -login url like `http://example.com/ikiwiki.cgi?do=continue`. 
Once the -affiliate is created, set `openidsignup` to point to the affiliate's signup -url. - This plugin is enabled by default, but can be turned off if you want to only use some other form of authentication, such as [[passwordauth]]. + +## options + +These options do not normally need to be set, but can be useful in +certian setups. + +* `openid_realm` can be used to control the scope of the openid request. + It defaults to the `cgiurl` (or `openid_cgiurl` if set); only allowing + ikiwiki's [[CGI]] to authenticate. If you have multiple ikiwiki instances, + or other things using openid on the same site, you may choose to put them + all in the same realm to improve the user's openid experience. It is an + url pattern, so can be set to eg "http://*.example.com/" + +* `openid_cgiurl` can be used to cause a different than usual `cgiurl` + to be used when doing openid authentication. The `openid_cgiurl` must + point to an ikiwiki [[CGI]], and it will need to match the `openid_realm` + to work. diff --git a/doc/plugins/openid/discussion.mdwn b/doc/plugins/openid/discussion.mdwn index 39e947b82..a88da8b9d 100644 --- a/doc/plugins/openid/discussion.mdwn +++ b/doc/plugins/openid/discussion.mdwn @@ -19,3 +19,8 @@ It looks like OpenID 2.0 (the only supported by Yahoo) is not supported in ikiwi -- Ivan Z. They have more on OpenID 2.0 in [their FAQ](http://developer.yahoo.com/openid/faq.html). --Ivan Z. + +---- +I'm trying to add a way to query the data saved by the OpenID plugin from outside of ikiwiki, to see what identity the user has been authenticated as, if any. I'm thinking of designating some directories as internal pages and check the identity against a list in a mod_perl access hook. I would also write a CGI script that would return a JSON formatted reply to tell if the user is authenticated for those pages and query it with AJAX and only render links to the internal pages if the user would have access to them. That's just a couple of ideas I'm working on first, but I can imagine that there's any number of other tricks that people could implement with that sort of a thing. + +Also, this isn't really specific to OpenID but to all auth plugins, but I'm going to use only OpenID for authentication so that's what I'm targeting right now. I suppose that would be worth its own TODO item. 
--[[kaol]] diff --git a/doc/plugins/orphans.mdwn b/doc/plugins/orphans.mdwn index e403c2d18..09ad0a51d 100644 --- a/doc/plugins/orphans.mdwn +++ b/doc/plugins/orphans.mdwn @@ -1,5 +1,6 @@ [[!template id=plugin name=orphans author="[[Joey]]"]] [[!tag type/meta]] +[[!tag type/widget]] This plugin provides the [[ikiwiki/directive/orphans]] [[ikiwiki/directive]], which generates a list of possibly orphaned pages -- diff --git a/doc/plugins/pagecount.mdwn b/doc/plugins/pagecount.mdwn index a56027e60..71872fae8 100644 --- a/doc/plugins/pagecount.mdwn +++ b/doc/plugins/pagecount.mdwn @@ -1,5 +1,6 @@ [[!template id=plugin name=pagecount author="[[Joey]]"]] [[!tag type/meta]] +[[!tag type/widget]] This plugin provides the [[ikiwiki/directive/pagecount]] [[ikiwiki/directive]], which displays the number of pages diff --git a/doc/plugins/pagestats.mdwn b/doc/plugins/pagestats.mdwn index c3eba6363..347e39a89 100644 --- a/doc/plugins/pagestats.mdwn +++ b/doc/plugins/pagestats.mdwn @@ -1,5 +1,5 @@ [[!template id=plugin name=pagestats author="Enrico Zini"]] -[[!tag type/meta type/tags]] +[[!tag type/meta type/tags type/widget]] This plugin provides the [[ikiwiki/directive/pagestats]] [[ikiwiki/directive]], which can generate stats about how pages link to diff --git a/doc/plugins/pagetemplate.mdwn b/doc/plugins/pagetemplate.mdwn index 53f069d0d..8254e14c5 100644 --- a/doc/plugins/pagetemplate.mdwn +++ b/doc/plugins/pagetemplate.mdwn @@ -3,8 +3,4 @@ This plugin provides the [[ikiwiki/directive/pagetemplate]] [[ikiwiki/directive]], which allows a page to be displayed -using a different [[template|wikitemplates]] than the default. - -This plugin can only use templates that are already installed in -`/usr/share/ikiwiki/templates` (or wherever ikiwiki is configured to look for -them). You can choose to use any .tmpl files in that directory. +using a different [[template|templates]] than the default. diff --git a/doc/plugins/parentlinks.mdwn b/doc/plugins/parentlinks.mdwn index ef262a30c..c2d364bef 100644 --- a/doc/plugins/parentlinks.mdwn +++ b/doc/plugins/parentlinks.mdwn @@ -1,5 +1,5 @@ [[!template id=plugin name=parentlinks core=1 author="[[intrigeri]]"]] -[[!tag type/link]] +[[!tag type/link type/chrome]] This plugin generates the links to a page's parents that typically appear at the top of a wiki page. diff --git a/doc/plugins/po.mdwn b/doc/plugins/po.mdwn index f3b70b5f7..f91e44ea3 100644 --- a/doc/plugins/po.mdwn +++ b/doc/plugins/po.mdwn @@ -49,15 +49,15 @@ Supported languages `po_master_language` is used to set the "master" language in `ikiwiki.setup`, such as: - po_master_language => { 'code' => 'en', 'name' => 'English' } + po_master_language => 'en|English' `po_slave_languages` is used to set the list of supported "slave" languages, such as: - po_slave_languages => { 'fr' => 'Français', - 'es' => 'Español', - 'de' => 'Deutsch', - } + po_slave_languages => [ 'fr|Français', + 'es|Español', + 'de|Deutsch', + ] Decide which pages are translatable ----------------------------------- @@ -117,23 +117,25 @@ serve any page in the client's preferred language, if available. Add 'Options MultiViews' to the wiki directory's configuration in Apache. -When `usedirs` is enabled, one has to set `DirectoryIndex index` for -the wiki context. +When `usedirs` is enabled, you should also set `DirectoryIndex index`. -Setting `DefaultLanguage LL` (replace `LL` with your default MIME -language code) for the wiki context can help to ensure -`bla/page/index.en.html` is served as `Content-Language: LL`. 
+These settings are also recommended, in order to avoid serving up rss files +as index pages: + + AddType application/rss+xml;qs=0.8 .rss + AddType application/atom+xml;qs=0.8 .atom For details, see [Apache's documentation](http://httpd.apache.org/docs/2.2/content-negotiation.html). lighttpd -------- -lighttpd unfortunately does not support content negotiation. +Recent versions of lighttpd should be able to use +`$HTTP["language"]` to configure the translated pages to be served. -**FIXME**: does `mod_magnet` provide the functionality needed to - emulate this? +See [Lighttpd Issue](http://redmine.lighttpd.net/issues/show/1119) +TODO: Example Usage ===== @@ -197,7 +199,7 @@ enabled, "slave" pages therefore link to the "master" page's discussion page. Likewise, "slave" pages are not supposed to have sub-pages; -[[WikiLinks|wikilink]] that appear on a "slave" page therefore link to +[[WikiLinks|ikiwiki/wikilink]] that appear on a "slave" page therefore link to the master page's sub-pages. Translating @@ -212,16 +214,16 @@ preferred `$EDITOR`, without needing to be online. Markup languages support ------------------------ -[[Markdown|mdwn]] is well supported. Some other markup languages supported -by ikiwiki mostly work, but some pieces of syntax are not rendered -correctly on the slave pages: +[[Markdown|mdwn]] and [[html]] are well supported. Some other markup +languages supported by ikiwiki mostly work, but some pieces of syntax +are not rendered correctly on the slave pages: * [[reStructuredText|rst]]: anonymous hyperlinks and internal cross-references * [[wikitext]]: conversion of newlines to paragraphs * [[creole]]: verbatim text is wrapped, tables are broken -* [[html]] and LaTeX: not supported yet; the dedicated po4a modules - could be used to support them, but they would need a security audit +* LaTeX: not supported yet; the dedicated po4a module + could be used to support it, but it would need a security audit * other markup languages have not been tested. Security @@ -234,95 +236,14 @@ When using po4a older than 0.35, it is recommended to uninstall `Text::WrapI18N` (Debian package `libtext-wrapi18n-perl`), in order to avoid a potential denial of service. -TODO +BUGS ==== -Better links ------------- - -Once the fix to -[[bugs/pagetitle_function_does_not_respect_meta_titles]] from -[[intrigeri]]'s `meta` branch is merged into ikiwiki upstream, the -generated links' text will be optionally based on the page titles set -with the [[meta|plugins/meta]] plugin, and will thus be translatable. -It will also allow displaying the translation status in links to slave -pages. Both were implemented, and reverted in commit -ea753782b222bf4ba2fb4683b6363afdd9055b64, which should be reverted -once [[intrigeri]]'s `meta` branch is merged. - -An integration branch, called `meta-po`, merges [[intrigeri]]'s `po` -and `meta` branches, and thus has this additional features. - -Language display order ----------------------- - -Jonas pointed out that one might want to control the order that links to -other languages are listed, for various reasons. Currently, there is no -order, as `po_slave_languages` is a hash. It would need to be converted -to an array to support this. (If twere done, twere best done quickly.) ---[[Joey]] - -Pagespecs ---------- - -I was suprised that, when using the map directive, a pagespec of "*" -listed all the translated pages as well as regular pages. That can -make a big difference to an existing wiki when po is turned on, -and seems generally not wanted. 
-(OTOH, you do want to match translated pages by -default when locking pages.) --[[Joey]] - -Edit links on untranslated pages --------------------------------- - -If a page is not translated yet, the "translated" version of it -displays wikilinks to other, existing (but not yet translated?) -pages as edit links, as if those pages do not exist. - -That's really confusing, especially as clicking such a link -brings up an edit form to create a new, english page. - -This is with po_link_to=current or negotiated. With default, it doesn't -happen.. - -Also, this may only happen if the page being linked to is coming from an -underlay, and the underlays lack translation to a given language. ---[[Joey]] +[[!inline pages="bugs/po:* and !bugs/done and !link(bugs/done) and !bugs/*/*" +feeds=no actions=no archive=yes show=0]] -Double commits of po files --------------------------- - -When adding a new english page, the po files are created, committed, -and then committed again. The second commit makes this change: - - -"Content-Type: text/plain; charset=utf-8\n" - -"Content-Transfer-Encoding: ENCODING" - +"Content-Type: text/plain; charset=UTF-8\n" - +"Content-Transfer-Encoding: ENCODING\n" - -Same thing happens when a change to an existing page triggers a po file -update. --[[Joey]] - -Ugly messages with empty files ------------------------------- - -If there are empty .mdwn files, the po plugin displays some ugly messages. - -Translation of directives -------------------------- - -If a translated page contains a directive, it may expand to some english -text, or text in whatever single language ikiwiki is configured to "speak". - -Maybe there could be a way to switch ikiwiki to speaking another language -when building a non-english page? Then the directives would get translated. - -(We also will need this in order to use translated templates, when they are -available.) - -Documentation -------------- +TODO +==== -Maybe write separate documentation depending on the people it targets: -translators, wiki administrators, hackers. This plugin may be complex -enough to deserve this. +[[!inline pages="todo/po:* and !todo/done and !link(todo/done) and !todo/*/*" +feeds=no actions=no archive=yes show=0]] diff --git a/doc/plugins/po/discussion.mdwn b/doc/plugins/po/discussion.mdwn index ab822e76c..50998e822 100644 --- a/doc/plugins/po/discussion.mdwn +++ b/doc/plugins/po/discussion.mdwn @@ -150,6 +150,23 @@ The following analysis was done with his help. variables; according to [[Joey]], this is "Freaky code, but seems ok due to use of `quotementa`". +##### Locale::Po4a::Xhtml + +* does not run any external program +* does not build regexp's from untrusted variables + +=> Seems safe as far as the `includessi` option is disabled; the po +plugin explicitly disables it. + +Relies on Locale::Po4a::Xml` to do most of the work. + +##### Locale::Po4a::Xml + +* does not run any external program +* the `includeexternal` option makes it able to read external files; + the po plugin explicitly disables it +* untrusted variables are escaped when used to build regexp's + ##### Text::WrapI18N `Text::WrapI18N` can cause DoS @@ -513,7 +530,7 @@ finish it at some point in the first quarter of 2009. --[[intrigeri]] >>>> >>>>> Done. --[[intrigeri]] >>> -> * I'm very fearful of the `add_depends` in `postscan`. +> * I'm very fearful of the `add_depends` in `indexhtml`. > Does this make every page depend on every page that links > to it? Won't this absurdly bloat the dependency pagespecs > and slow everything down? 
And since nicepagetitle is given @@ -627,28 +644,6 @@ daring a timid "please pull"... or rather, please review again :) >>> need improvements to the deletion UI to de-confuse that. It's fine to >>> put that off until needed --[[Joey]] >> -> * Re the meta title escaping issue worked around by `change`. -> I suppose this does not only affect meta, but other things -> at scan time too. Also, handling it only on rebuild feels -> suspicious -- a refresh could involve changes to multiple -> pages and trigger the same problem, I think. Also, exposing -> this rebuild to the user seems really ugly, not confidence inducing. -> -> So I wonder if there's a better way. Such as making po, at scan time, -> re-run the scan hooks, passing them modified content (either converted -> from po to mdwn or with the escaped stuff cheaply de-escaped). (Of -> course the scan hook would need to avoid calling itself!) -> -> (This doesn't need to block the merge, but I hope it can be addressed -> eventually..) -> -> --[[Joey]] ->> ->> I'll think about it soon. ->> ->> --[[intrigeri]] ->> ->>> Did you get a chance to? --[[Joey]] * As discussed at [[todo/l10n]] the templates needs to be translatable too. They should be treated properly by po4a using the markdown option - at least with my diff --git a/doc/plugins/poll.mdwn b/doc/plugins/poll.mdwn index 510f67798..099cb399c 100644 --- a/doc/plugins/poll.mdwn +++ b/doc/plugins/poll.mdwn @@ -1,5 +1,5 @@ [[!template id=plugin name=poll author="[[Joey]]"]] -[[!tag type/web]] +[[!tag type/widget]] This plugin provides the [[ikiwiki/directive/poll]] [[ikiwiki/directive]], which allows inserting an online poll into a page. diff --git a/doc/plugins/polygen.mdwn b/doc/plugins/polygen.mdwn index 6045c1ec9..f9cea1f4d 100644 --- a/doc/plugins/polygen.mdwn +++ b/doc/plugins/polygen.mdwn @@ -1,5 +1,6 @@ [[!template id=plugin name=polygen author="Enrico Zini"]] [[!tag type/fun]] +[[!tag type/widget]] This plugin provides the [[ikiwiki/directive/polygen]] [[ikiwiki/directive]], which allows inserting text generated by polygen into a wiki page. diff --git a/doc/plugins/postsparkline.mdwn b/doc/plugins/postsparkline.mdwn index c81f91bdc..b0733e343 100644 --- a/doc/plugins/postsparkline.mdwn +++ b/doc/plugins/postsparkline.mdwn @@ -1,5 +1,5 @@ [[!template id=plugin name=postsparkline author="[[Joey]]"]] -[[!tag type/chrome]] +[[!tag type/widget]] This plugin provides the [[ikiwiki/directive/postsparkline]] [[ikiwiki/directive]]. It uses the [[sparkline]] plugin to create a sparkline of diff --git a/doc/plugins/prettydate.mdwn b/doc/plugins/prettydate.mdwn index 11ad4252f..149b7c29c 100644 --- a/doc/plugins/prettydate.mdwn +++ b/doc/plugins/prettydate.mdwn @@ -1,5 +1,6 @@ [[!template id=plugin name=prettydate author="[[Joey]]"]] [[!tag type/date]] +[[!tag type/chrome]] Enabling this plugin changes the dates displayed on pages in the wiki to a format that is nice and easy to read. Examples: "late Wednesday evening, diff --git a/doc/plugins/progress.mdwn b/doc/plugins/progress.mdwn index e1b560cc8..20736d18c 100644 --- a/doc/plugins/progress.mdwn +++ b/doc/plugins/progress.mdwn @@ -1,5 +1,5 @@ [[!template id=plugin name=progress author="[[Will]]"]] -[[!tag type/meta]] +[[!tag type/widget]] This plugin provides the [[ikiwiki/directive/progress]] [[ikiwiki/directive]], which generates a progress bar. 
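A minimal usage sketch (the value shown is arbitrary; the `percent`
parameter used here is one way of giving the directive a fixed value):

    \[[!progress percent=50]]
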
diff --git a/doc/plugins/rawhtml/discussion.mdwn b/doc/plugins/rawhtml/discussion.mdwn index e63e4acb9..9ed8230ba 100644 --- a/doc/plugins/rawhtml/discussion.mdwn +++ b/doc/plugins/rawhtml/discussion.mdwn @@ -2,4 +2,6 @@ Is there anyway to allow this only on locked pages? I'd like to be able to do r > Not at the moment. Long-term, ikiwiki needs some general permission mechanisms that encompass this sort of issue. --[[JoshTriplett]] ->> Thanks. Bummer though, looking forward to when this is possible. :-) -- Adam.
\ No newline at end of file +>> Thanks. Bummer though, looking forward to when this is possible. :-) -- Adam. + +> Well, this plugin is different from the [[html]] plugin. It **copies** html files. So users cannot do raw HTML via cgi. Thus it is safe in most cases. -- weakish diff --git a/doc/plugins/recentchanges.mdwn b/doc/plugins/recentchanges.mdwn index 9375296a4..6fff18e8a 100644 --- a/doc/plugins/recentchanges.mdwn +++ b/doc/plugins/recentchanges.mdwn @@ -1,10 +1,13 @@ [[!template id=plugin name=recentchanges core=1 author="[[Joey]]"]] +[[!tag type/meta]] This plugin examines the [[revision_control_system|rcs]] history and generates a page describing each recent change made to the wiki. These pages can be joined together with [[inline]] to generate the [[RecentChanges]] page. +This plugin also currently handles web-based reversion of changes. + Typically only the RecentChanges page will use the pages generated by this plugin, but you can use it elsewhere too if you like. It's used like this: diff --git a/doc/plugins/recentchangesdiff.mdwn b/doc/plugins/recentchangesdiff.mdwn index a7b113ade..57299f92d 100644 --- a/doc/plugins/recentchangesdiff.mdwn +++ b/doc/plugins/recentchangesdiff.mdwn @@ -1,4 +1,5 @@ [[!template id=plugin name=recentchangesdiff core=0 author="[[Joey]]"]] +[[!tag type/meta]] This plugin extends the [[recentchanges]] plugin, adding a diff for each change. The diffs are by default hidden from display on the recentchanges diff --git a/doc/plugins/relativedate.mdwn b/doc/plugins/relativedate.mdwn index 3ada0864b..d6e8eb08b 100644 --- a/doc/plugins/relativedate.mdwn +++ b/doc/plugins/relativedate.mdwn @@ -1,5 +1,6 @@ [[!template id=plugin name=relativedate author="[[Joey]]"]] [[!tag type/date]] +[[!tag type/chrome]] This plugin lets dates be displayed in relative form. Examples: "2 days ago", "1 month and 3 days ago", "30 minutes ago". Hovering over the date will @@ -8,9 +9,3 @@ cause a tooltip to pop up with the absolute date. This only works in browsers with javascript enabled; other browsers will show the absolute date instead. Also, this plugin can be used with other plugins like [[prettydate]] that change how the absolute date is displayed. - -If this plugin is enabled, you may also add relative dates to pages in the -wiki, by using html elements in the "relativedate" class. For example, this -will display as a relative date: - - <span class="relativedate">Tue Jan 20 12:00:00 EDT 2009</span> diff --git a/doc/plugins/rename.mdwn b/doc/plugins/rename.mdwn index ddaede8b0..abb361329 100644 --- a/doc/plugins/rename.mdwn +++ b/doc/plugins/rename.mdwn @@ -2,7 +2,8 @@ [[!tag type/web]] This plugin allows pages or other files to be renamed using the web -interface. +interface. Following Unix tradition, renaming also allows moving to a +different directory. Users can only rename things that they are allowed to edit or upload. diff --git a/doc/plugins/repolist.mdwn b/doc/plugins/repolist.mdwn index 9b3a7575e..efd9c9352 100644 --- a/doc/plugins/repolist.mdwn +++ b/doc/plugins/repolist.mdwn @@ -1,5 +1,5 @@ [[!template id=plugin name=repolist author="[[Joey]]"]] -[[!tag type/useful]] +[[!tag type/web]] This plugin allows you to configure ikiwiki with the location of [[rcs]] repositories for your wiki's source. This is done via the diff --git a/doc/plugins/rst/discussion.mdwn b/doc/plugins/rst/discussion.mdwn index 38fbed6d6..c84a6218e 100644 --- a/doc/plugins/rst/discussion.mdwn +++ b/doc/plugins/rst/discussion.mdwn @@ -61,8 +61,8 @@ but the backlinks don't show up. 
I converted one of my pages to rst: -Before: http://kaizer.se/wiki/kupfer-mdwn -After: http://kaizer.se/wiki/kupfer-rst +Before: <http://kaizer.se/wiki/kupfer-mdwn> +After: <http://kaizer.se/wiki/kupfer-rst> I need help on a couple of points @@ -71,3 +71,11 @@ I need help on a couple of points * Can we include this in ikiwiki's rst if it is not too hairy? --ulrik + + +---- + +> The main problem with more sophisticated RST support is that ikiwiki turns +preprocessor directives into raw HTML and reST hates inline HTML. + +Is it possible for ikiwiki to store preprocessor directives in memory, and replace them with place holders, then do the rst process. After the rst processing, process the preprocessor directives and replace place holders. --[[weakish]] diff --git a/doc/plugins/rsync.mdwn b/doc/plugins/rsync.mdwn index db7fcb4f7..e48886168 100644 --- a/doc/plugins/rsync.mdwn +++ b/doc/plugins/rsync.mdwn @@ -1,4 +1,5 @@ [[!template id=plugin name=rsync author="[[schmonz]]"]] +[[!tag type/special-purpose]] This plugin allows ikiwiki to push generated pages to another host by running a command such as `rsync`. @@ -7,7 +8,7 @@ The command to run is specified by setting `rsync_command` in your setup file. The command will be run in your destdir, so something like this is a typical command: - rsync => 'rsync -qa --delete . user@host:/path/to/docroot/', + rsync_command => 'rsync -qa --delete . user@host:/path/to/docroot/', If using rsync over ssh, you will need to enable noninteractive ssh login to the remote host. It's also a good idea to specify the exact command line diff --git a/doc/plugins/rsync/discussion.mdwn b/doc/plugins/rsync/discussion.mdwn index 6bf7a3826..ef0fa9967 100644 --- a/doc/plugins/rsync/discussion.mdwn +++ b/doc/plugins/rsync/discussion.mdwn @@ -47,6 +47,8 @@ The wiki now lives on (1), and clicking "edit" just works. --[[schmonz]] >> a DVCS (of which I've got at least one other), and possibly for >> other uses not yet imagined. ;-) --[[schmonz]] +>>> I'm now using this plugin for an additional purpose. One of the aforementioned wikis (there are actually two) can only be read by trusted users, the list of which is kept in an `.htaccess` file. I added it to git as `htaccess.txt`, enabled the [[plugins/txt]] plugin, and in my `rsync_command` script, have it copied to the destdir as `.htaccess` before calling `rsync`. Now my users (who aren't tech-savvy, but are trustworthy) can edit the access list directly in the wiki. This idea might also be useful for wikis not using `rsync` at all. --[[schmonz]] + ---- Revew: --[[Joey]] diff --git a/doc/plugins/search.mdwn b/doc/plugins/search.mdwn index 92cc5945a..e95739cf3 100644 --- a/doc/plugins/search.mdwn +++ b/doc/plugins/search.mdwn @@ -4,7 +4,7 @@ This plugin adds full text search to ikiwiki, using the [xapian](http://xapian.org/) engine, its [omega](http://xapian.org/docs/omega/overview.html) frontend, and the -[[!cpan Search::Xapian]], [[!cpan Digest::SHA1]], and [[!cpan HTML::Scrubber]] +[[!cpan Search::Xapian]], [[!cpan Digest::SHA]], and [[!cpan HTML::Scrubber]] perl modules. The [[ikiwiki/searching]] page describes how to write search queries. diff --git a/doc/plugins/shortcut.mdwn b/doc/plugins/shortcut.mdwn index cca1f4bdd..1e8e85ed8 100644 --- a/doc/plugins/shortcut.mdwn +++ b/doc/plugins/shortcut.mdwn @@ -1,5 +1,5 @@ [[!template id=plugin name=shortcut author="[[Joey]]"]] -[[!tag type/format]] +[[!tag type/widget]] This plugin provides the [[ikiwiki/directive/shortcut]] [[ikiwiki/directive]]. 
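For reference, a shortcut is defined on the [[shortcuts]] page with a line
along these lines (the name and url are merely illustrative):

    \[[!shortcut name=wikipedia url="http://en.wikipedia.org/wiki/%s"]]

after which a page can link with `\[[!wikipedia Ikiwiki]]`.
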
It allows external links to commonly linked to sites to be made diff --git a/doc/plugins/shortcut/discussion.mdwn b/doc/plugins/shortcut/discussion.mdwn index 4e11ce08c..2e2b1b281 100644 --- a/doc/plugins/shortcut/discussion.mdwn +++ b/doc/plugins/shortcut/discussion.mdwn @@ -9,4 +9,10 @@ Maybe use the `default_pageext` is better than hardcode .mdwn? > done, it will use `default_pageext` now --[[Joey]] +--- +Instead of modifying the [[basewiki]]'s [[shortcuts]] file for local needs -- +thus copying it at some point and losing continuity with upstream enhancements -- +what about handling a `shortcuts-local.mdwn` or `shortcuts/local.mdwn` (if such +a file exists in the wiki), and additionally process that one. Possibily a +conditional `\[[!inline]]` could be used. --[[tschwinge]] diff --git a/doc/plugins/sidebar.mdwn b/doc/plugins/sidebar.mdwn index 4e356d65a..012733456 100644 --- a/doc/plugins/sidebar.mdwn +++ b/doc/plugins/sidebar.mdwn @@ -1,24 +1,27 @@ [[!template id=plugin name=sidebar author="Tuomo Valkonen"]] [[!tag type/chrome]] -If this plugin is enabled, then a sidebar is added to pages in the wiki. -The content of the sidebar is simply the content of a page named -"sidebar" (ie, create a "sidebar.mdwn"). +This plugin allows adding a sidebar to pages in the wiki. + +By default, and unless the `global_sidebars` setting is turned off, +a sidebar is added to all pages in the wiki. The content of the sidebar +is simply the content of a page named "sidebar" (ie, create a "sidebar.mdwn"). Typically this will be a page in the root of the wiki, but it can also be a [[ikiwiki/SubPage]]. In fact, this page, [[plugins/sidebar|plugins/sidebar]], will be treated as a sidebar for the [[plugins]] page, and of all of its SubPages, if the plugin is enabled. -Note that to disable a sidebar for a [[ikiwiki/SubPage]] of a page that has -a sidebar, you can create a sidebar page that is completely empty. This -will turn off the sidebar altogether. +There is also a [[ikiwiki/directive/sidebar]] directive that can be used +to provide a custom sidebar content for a page. + +---- -Warning: Any change to the sidebar will cause a rebuild of the whole wiki, -since every page includes a copy that has to be updated. This can -especially be a problem if the sidebar includes an [[ikiwiki/directive/inline]] -directive, since any changes to pages inlined into the sidebar -will change the sidebar and cause a full wiki rebuild. +Warning: Any change to the sidebar page will cause a rebuild of the whole +wiki, since every page includes a copy that has to be updated. This can +especially be a problem if the sidebar includes an +[[ikiwiki/directive/inline]] directive, since any changes to pages inlined +into the sidebar will change the sidebar and cause a full wiki rebuild. Instead, if you include a [[ikiwiki/directive/map]] directive on the sidebar, and it does not use the `show` parameter, only adding or removing pages diff --git a/doc/plugins/sortnaturally.mdwn b/doc/plugins/sortnaturally.mdwn new file mode 100644 index 000000000..a16381946 --- /dev/null +++ b/doc/plugins/sortnaturally.mdwn @@ -0,0 +1,6 @@ +[[!template id=plugin name=sortnaturally core=1 author="[[chrysn]], [[smcv]]"]] +[[!tag type/meta]] + +This plugin provides the `title_natural` [[ikiwiki/pagespec/sorting]] +order, which uses [[!cpan Sort::Naturally]] to sort numbered pages in a +more natural order. 
diff --git a/doc/plugins/sparkline.mdwn b/doc/plugins/sparkline.mdwn index bcc5daec6..83e24a27d 100644 --- a/doc/plugins/sparkline.mdwn +++ b/doc/plugins/sparkline.mdwn @@ -1,5 +1,5 @@ [[!template id=plugin name=sparkline author="[[Joey]]"]] -[[!tag type/chrome]] +[[!tag type/widget]] This plugin provides the [[ikiwiki/directive/sparkline]] [[ikiwiki/directive]], which allows for easily embedding sparklines into @@ -16,7 +16,7 @@ to use the plugin, you will need: php can find it when `sparkline/Sparkline.php` is required. * The GD PHP module used by the Sparkline library. * A "php" program in the path, that can run standalone php programs. -* [[!cpan Digest::SHA1]] +* [[!cpan Digest::SHA]] On a Debian system, this can be accomplished by installing these packages: `libsparkline-php` `php5-gd` `php5-cli` `libdigest-sha1-perl` diff --git a/doc/plugins/table.mdwn b/doc/plugins/table.mdwn index 7b080acda..fe66f90a8 100644 --- a/doc/plugins/table.mdwn +++ b/doc/plugins/table.mdwn @@ -1,8 +1,8 @@ [[!template id=plugin name=table author="[[VictorMoral]]"]] -[[!tag type/format]] +[[!tag type/widget]] This plugin provides the [[ikiwiki/directive/table]] [[ikiwiki/directive]]. It can build HTML tables from data in CSV (comma-separated values) -or DSV (delimiter-separated values) format. +or DSV ([delimiter-separated values](http://en.wikipedia.org/wiki/Delimiter-separated_values)) format. It needs the perl module [[!cpan Text::CSV]] for the CSV data. diff --git a/doc/plugins/tag.mdwn b/doc/plugins/tag.mdwn index 8ff70a069..1d6bfbdd9 100644 --- a/doc/plugins/tag.mdwn +++ b/doc/plugins/tag.mdwn @@ -8,6 +8,16 @@ These directives allow tagging pages. It also provides the `tagged()` [[ikiwiki/PageSpec]], which can be used to match pages that are tagged with a specific tag. +The `tagbase` setting can be used to make tags default to being put in a +particular subdirectory. + +The `tag_autocreate` setting can be used to control whether new tag pages +are created as needed. It defaults to being done only if a `tagbase` is +set. + +The `tag_autocreate_commit` setting is enabled by default, and causes +new tag pages to be checked into version control. + [[!if test="enabled(tag)" then=""" This wiki has the tag plugin enabled, so you'll see a note below that this page is tagged with the "tags" tag. diff --git a/doc/plugins/tag/discussion.mdwn b/doc/plugins/tag/discussion.mdwn index 03dcb7b2f..dfd749252 100644 --- a/doc/plugins/tag/discussion.mdwn +++ b/doc/plugins/tag/discussion.mdwn @@ -28,3 +28,4 @@ See [[todo/auto-create tag pages according to a template]] -- Jeremy Schultz <jeremy.schultz@uleth.ca> +`tag_autocreate` can now enable this. --[[Joey]] diff --git a/doc/plugins/template.mdwn b/doc/plugins/template.mdwn index 3485fe64c..8d17e2825 100644 --- a/doc/plugins/template.mdwn +++ b/doc/plugins/template.mdwn @@ -1,7 +1,7 @@ [[!template id=plugin name=template author="[[Joey]]"]] -[[!tag type/format]] +[[!tag type/widget]] This plugin provides the [[ikiwiki/directive/template]] [[ikiwiki/directive]]. With this plugin, you can set up templates, and cause them to be filled out -and inserted into pages in the wiki. It's documented and existing templates -are listed in the [[templates]] page. +and inserted into pages in the wiki. Existing templates are listed in the +[[templates]] page. 
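For instance, assuming a template named `note` exists in the wiki, a page
could fill it out like this:

    \[[!template id=note text="""This is worth remembering."""]]
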
diff --git a/doc/plugins/testpagespec.mdwn b/doc/plugins/testpagespec.mdwn index dabcb0bec..8180d5d4b 100644 --- a/doc/plugins/testpagespec.mdwn +++ b/doc/plugins/testpagespec.mdwn @@ -1,5 +1,5 @@ [[!template id=plugin name=testpagespec author="[[Joey]]"]] -[[!tag type/useful]] +[[!tag type/special-purpose]] This plugin provides a [[ikiwiki/directive/testpagespec]] [[ikiwiki/directive]]. The directive allows testing a [[ikiwiki/PageSpec]] to see if it matches a diff --git a/doc/plugins/teximg.mdwn b/doc/plugins/teximg.mdwn index ae052837f..f3cade85f 100644 --- a/doc/plugins/teximg.mdwn +++ b/doc/plugins/teximg.mdwn @@ -1,5 +1,5 @@ [[!template id=plugin name=teximg author="[[PatrickWinnertz]]"]] -[[!tag type/chrome type/slow]] +[[!tag type/widget type/slow]] This plugin provides a [[ikiwiki/directive/teximg]] [[ikiwiki/directive]], that renders LaTeX formulas into images. diff --git a/doc/plugins/theme.mdwn b/doc/plugins/theme.mdwn new file mode 100644 index 000000000..ebbb0be8e --- /dev/null +++ b/doc/plugins/theme.mdwn @@ -0,0 +1,11 @@ +[[!template id=plugin name=theme author="[[Joey]]"]] +[[!tag type/web]] + +The theme plugin allows easily applying a theme to your wiki, by +configuring the `theme` setting in the setup file with the name of a theme +to use. The themes you can choose from are all subdirectories, typically +inside `/usr/share/ikiwiki/themes/`. See [[themes]] for an overview +of the themes included in ikiwiki. + +You can set the theme via the **theme** option in your config file (after +enabling the plugin). Refresh the wiki after changing it to see the changes. diff --git a/doc/plugins/theme/discussion.mdwn b/doc/plugins/theme/discussion.mdwn new file mode 100644 index 000000000..67a2bf46a --- /dev/null +++ b/doc/plugins/theme/discussion.mdwn @@ -0,0 +1,26 @@ +### What license do themes need to have for distribution? + +Could someone specify what license the themes need to have to get +distributed in ikiwiki or Debian? The current included theme seem to be +under the GPLv2. Does the [Creative Commons Attribution 3.0 Unported +License](http://creativecommons.org/licenses/by/3.0/) also work. This way a +lot of free CSS templates could be included, e. g. from +[freecsstemplates.org](http://www.freecsstemplates.org/). --PaulePanter + +> Paule, I'd love it if you did that! The only hard requirement on themes +> included in ikiwiki is that they need to be licensed with a [DFSG +> compatable license](https://wiki.debian.org/DFSGLicenses). CC-BY-SA 3.0 +> is DFSG; CC-BY is apparently being accepted by Debian too. +> +> As a soft requirement, I may exersise some discretion about themes that +> require obtrusive attributions links be included on every page of a +> site using the theme. While probably DFSG, that adds a requirement +> that ikiwiki itself does not require. --[[Joey]] + +### Once one has enabled the 'theme' plugin in the setup file, how does one use themes? + +Choose one of the [[themes]] which are bundled with ikiwiki and configure ikiwiki to use it by setting this in your setup file, eg. + + theme => 'blueview', + +-- [[AdamShand]] diff --git a/doc/plugins/toc.mdwn b/doc/plugins/toc.mdwn index 2b7686681..a0ad3a5d0 100644 --- a/doc/plugins/toc.mdwn +++ b/doc/plugins/toc.mdwn @@ -1,5 +1,5 @@ [[!template id=plugin name=toc author="[[Joey]]"]] -[[!tag type/chrome]] +[[!tag type/widget]] This plugin provides the [[ikiwiki/directive/toc]] [[ikiwiki/directive]], which adds a table of contents to a page. 
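A typical invocation limits how deeply nested the headings shown are, for
example:

    \[[!toc levels=2]]
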
diff --git a/doc/plugins/toggle.mdwn b/doc/plugins/toggle.mdwn index 69ac613e0..d1500eba0 100644 --- a/doc/plugins/toggle.mdwn +++ b/doc/plugins/toggle.mdwn @@ -1,5 +1,5 @@ [[!template id=plugin name=toggle author="[[Joey]]"]] -[[!tag type/chrome]] +[[!tag type/widget]] This plugin provides the [[ikiwiki/directive/toggle]] and [[ikiwiki/directive/toggleable]] [[directives|ikiwiki/directive]]. diff --git a/doc/plugins/transient.mdwn b/doc/plugins/transient.mdwn new file mode 100644 index 000000000..b7dd11906 --- /dev/null +++ b/doc/plugins/transient.mdwn @@ -0,0 +1,24 @@ +[[!template id=plugin name=transient author="[[Simon_McVittie|smcv]]" core=yes]] +[[!tag type/special-purpose]] + +The `transient` plugin adds an underlay in `.ikiwiki/transient`, which is +intended for pages that are automatically created and should not be committed +to the [[RCS]]. It works in the same way as the [[basewiki]] and the underlays +set up by the [[plugins/underlay]] plugin, so if a page in the transient +underlay is edited via the web, the edited version is committed to the RCS +as usual. Unlike other underlays, if a page in the transient underlay is +superseded by an edited version in the RCS, the old transient version +is deleted automatically. + +This plugin is mostly useful as something that other plugins can depend on: + +* [[plugins/aggregate]] writes aggregated posts into the transient underlay +* [[plugins/autoindex]] can be configured to auto-create missing + pages that have a [[ikiwiki/subpage]] or an [[plugins/attachment]], but not + commit them, in which case they go in the transient underlay +* [[plugins/comments]] can be configured to not commit comments: if so, it + puts them in the transient underlay +* [[plugins/recentchanges]] writes new changes into the transient underlay +* [[plugins/tag]] can be configured to auto-create missing + tag pages but not commit them, in which case they go in the transient + underlay diff --git a/doc/plugins/txt.mdwn b/doc/plugins/txt.mdwn index 420898d09..a3087c9e0 100644 --- a/doc/plugins/txt.mdwn +++ b/doc/plugins/txt.mdwn @@ -12,3 +12,8 @@ The only exceptions are that [[WikiLinks|ikiwiki/WikiLink]] and [[directives|ikiwiki/directive]] are still expanded by ikiwiki, and that, if the [[!cpan URI::Find]] perl module is installed, URLs in the txt file are converted to hyperlinks. + +---- + +As a special case, a file `robots.txt` will be copied intact into the +`destdir`, as well as creating a wiki page named "robots". diff --git a/doc/plugins/type/chrome.mdwn b/doc/plugins/type/chrome.mdwn index d3f0eb3d3..a1c6d0728 100644 --- a/doc/plugins/type/chrome.mdwn +++ b/doc/plugins/type/chrome.mdwn @@ -1 +1 @@ -These plugins affect the look and feel of the wiki. +These plugins affect the look and feel of the overall wiki. diff --git a/doc/plugins/type/useful.mdwn b/doc/plugins/type/useful.mdwn deleted file mode 100644 index 92fcf5af1..000000000 --- a/doc/plugins/type/useful.mdwn +++ /dev/null @@ -1 +0,0 @@ -These plugins perform various miscellaneous useful functions. diff --git a/doc/plugins/type/widget.mdwn b/doc/plugins/type/widget.mdwn new file mode 100644 index 000000000..875829d0b --- /dev/null +++ b/doc/plugins/type/widget.mdwn @@ -0,0 +1,2 @@ +These plugins allow inserting various things into pages via a +[[ikiwiki/directive]]. 
diff --git a/doc/plugins/typography.mdwn b/doc/plugins/typography.mdwn index 030ef8052..9ff6c4ffd 100644 --- a/doc/plugins/typography.mdwn +++ b/doc/plugins/typography.mdwn @@ -1,5 +1,5 @@ [[!template id=plugin name=typography author="[[Roktas]]"]] -[[!tag type/format]] +[[!tag type/chrome]] This plugin, also known as [SmartyPants](http://daringfireball.net/projects/smartypants/), translates diff --git a/doc/plugins/underlay.mdwn b/doc/plugins/underlay.mdwn index f7eafee7c..0cf819472 100644 --- a/doc/plugins/underlay.mdwn +++ b/doc/plugins/underlay.mdwn @@ -1,20 +1,14 @@ [[!template id=plugin name=underlay author="[[Simon_McVittie|smcv]]"]] -[[!tag type/useful]] +[[!tag type/special-purpose]] -This plugin adds an `add_underlays` option to the setup file. -Its value is a list of underlay directories whose content is added to the wiki. +This plugin adds an `add_underlays` option to the setup file. Its value is +a list of underlay directories whose content is added to the wiki. Multiple underlays are normally set up automatically by other plugins (for -instance, the images used by the [[plugins/smiley]] plugin), but they can also be -used as a way to pull in external files that you don't want in revision control, -like photos or software releases. +instance, the images used by the [[plugins/smiley]] plugin), but they can +also be used as a way to pull in external files that you don't want in +revision control, like photos or software releases. -Directories in `add_underlays` should usually be absolute. If relative, they're -interpreted as relative to the parent directory of the basewiki underlay, which -is probably not particularly useful in this context. - --- - -This plugin also adds an `add_templates` option to the setup file. -Its value is a list of template directories to look for template files in, -if they are not present in the `templatedir`. +Directories in `add_underlays` should usually be absolute. If relative, +they're interpreted as relative to the parent directory of the basewiki +underlay, which is probably not particularly useful in this context. diff --git a/doc/plugins/version.mdwn b/doc/plugins/version.mdwn index 43027bdd7..326a2e7ce 100644 --- a/doc/plugins/version.mdwn +++ b/doc/plugins/version.mdwn @@ -1,5 +1,6 @@ [[!template id=plugin name=version author="[[Joey]]"]] [[!tag type/meta]] +[[!tag type/widget]] This plugin provides the [[ikiwiki/directive/version]] [[ikiwiki/directive]], which inserts the current version diff --git a/doc/plugins/websetup.mdwn b/doc/plugins/websetup.mdwn index f1756ba8f..a20a32489 100644 --- a/doc/plugins/websetup.mdwn +++ b/doc/plugins/websetup.mdwn @@ -2,7 +2,7 @@ [[!tag type/web]] This plugin allows wiki admins to configure the wiki using a web interface, -rather than editing the setup file directly. A "Wiki Setup" button is added +rather than editing the setup file directly. A "Setup" button is added to the admins' preferences page. Warning: This plugin rewrites your setup file. Any comments or unusual @@ -16,7 +16,8 @@ enabled and disabled using it too. Some settings are not considered safe enough to be manipulated over the web; these are still shown, by default, but cannot be modified. To hide them, set `websetup_show_unsafe` to false in the setup file. A few settings have too complex a data type to be -configured via the web. +configured via the web. To mark additional settings as unsafe, you can +list them in `websetup_unsafe`. 
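A setup-file sketch of these options (the option names come from the text
above; the values, and the settings marked as unsafe, are only examples):

    websetup_show_unsafe => 0,
    websetup_unsafe => [ 'cgiurl', 'url' ],
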
Plugins that should not be enabled/disabled via the web interface can be listed in `websetup_force_plugins` in the setup file. diff --git a/doc/plugins/wmd.mdwn b/doc/plugins/wmd.mdwn index dc9a30703..96c6e2e6c 100644 --- a/doc/plugins/wmd.mdwn +++ b/doc/plugins/wmd.mdwn @@ -1,5 +1,5 @@ [[!template id=plugin name=wmd author="[[Will]]"]] -[[!tag type/chrome]] +[[!tag type/web]] [WMD](http://wmd-editor.com/) is a What You See Is What You Mean editor for [[mdwn]]. This plugin makes WMD be used for editing pages in the wiki. diff --git a/doc/plugins/write.mdwn b/doc/plugins/write.mdwn index c72418c3c..f0f79ebc7 100644 --- a/doc/plugins/write.mdwn +++ b/doc/plugins/write.mdwn @@ -1,10 +1,86 @@ -Ikiwiki's plugin interface allows all kinds of useful [[plugins]] to be +lkiwiki's plugin interface allows all kinds of useful [[plugins]] to be written to extend ikiwiki in many ways. Despite the length of this page, it's not really hard. This page is a complete reference to everything a plugin might want to do. There is also a quick [[tutorial]]. +[[!template id="note" text=""" +Ikiwiki is a compiler + +One thing to keep in mind when writing a plugin is that ikiwiki is a wiki +*compiler*. So plugins influence pages when they are built, not when they +are loaded. A plugin that inserts the current time into a page, for +example, will insert the build time. + +Also, as a compiler, ikiwiki avoids rebuilding pages unless they have +changed, so a plugin that prints some random or changing thing on a page +will generate a static page that won't change until ikiwiki rebuilds the +page for some other reason, like the page being edited. + +The [[tutorial]] has some other examples of ways that ikiwiki being a +compiler may trip up the unwary. +"""]] + [[!toc levels=2]] +## Highlevel view of ikiwiki + +Ikiwiki mostly has two modes of operation. It can either be running +as a compiler, building or updating a wiki; or as a cgi program, providing +user interface for editing pages, etc. Almost everything ikiwiki does +is accomplished by calling various hooks provided by plugins. + +### compiler + +As a compiler, ikiwiki starts by calling the `refresh` hook. Then it checks +the wiki's source to find new or changed pages. The `needsbuild` hook is +then called to allow manipulation of the list of pages that need to be +built. + +Now that it knows what pages it needs to build, ikiwiki runs two +compile passes. First, it runs `scan` hooks, which collect metadata about +the pages. Then it runs a page rendering pipeline, by calling in turn these +hooks: `filter`, `preprocess`, `linkify`, `htmlize`, `indexhtml`, +`pagetemplate`, `sanitize`, `format`. + +After all necessary pages are built, it calls the `change` hook. Finally, +if a page is was deleted, the `delete` hook is called, and the files that +page had previously produced are removed. + +### cgi + +The flow between hooks when ikiwiki is run as a cgi is best illustrated by +an example. + +Alice browses to a page and clicks Edit. + +* Ikiwiki is run as a cgi. It assigns Alice a session cookie, and, + by calling the `auth` hooks, sees that she is not yet logged in. +* The `sessioncgi` hooks are then called, and one of them, + from the [[editpage]] plugin, notices that the cgi has been told "do=edit". +* The [[editpage]] plugin calls the `canedit` hook to check if this + page edit is allowed. The [[signinedit]] plugin has a hook that says not: + Alice is not signed in. +* The [[signinedit]] plugin then launches the signin process. 
A signin + page is built by calling the `formbuilder_setup` hook. + +Alice signs in with her openid. + +* The [[openid]] plugin's `formbuilder` hook sees that an openid was + entered in the signin form, and redirects to Alice's openid provider. +* Alice's openid provider calls back to ikiwiki. The [[openid]] plugin + has an `auth` hook that finishes the openid signin process. +* Signin complete, ikiwiki returns to what Alice was doing before; editing + a page. +* Now all the `canedit` hooks are happy. The [[editpage]] plugin calls + `formbuilder_setup` to display the page editing form. + +Alice saves her change to the page. + +* The [[editpage]] plugin's `formbuilder` hook sees that the Save button + was pressed, and calls the `checkcontent` and `editcontent` hooks. + Then it saves the page to disk, and branches into the compiler part + of ikiwiki to refresh the wiki. + ## Types of plugins Most ikiwiki [[plugins]] are written in perl, like ikiwiki. This gives the @@ -31,16 +107,20 @@ they're the same as far as how they hook into ikiwiki. This document will explain how to write both sorts of plugins, albeit with an emphasis on perl plugins. -## Considerations +## Plugin interface -One thing to keep in mind when writing a plugin is that ikiwiki is a wiki -*compiler*. So plugins influence pages when they are built, not when they -are loaded. A plugin that inserts the current time into a page, for -example, will insert the build time. Also, as a compiler, ikiwiki avoids -rebuilding pages unless they have changed, so a plugin that prints some -random or changing thing on a page will generate a static page that won't -change until ikiwiki rebuilds the page for some other reason, like the page -being edited. +To import the ikiwiki plugin interface: + + use IkiWiki '3.00'; + +This will import several variables and functions into your plugin's +namespace. These variables and functions are the ones most plugins need, +and a special effort will be made to avoid changing them in incompatible +ways, and to document any changes that have to be made in the future. + +Note that IkiWiki also provides other variables and functions that are not +exported by default. No guarantee is made about these in the future, so if +it's not exported, the wise choice is to not use it. ## Registering plugins @@ -68,20 +148,21 @@ In roughly the order they are called. This allows for plugins to perform their own processing of command-line options and so add options to the ikiwiki command line. It's called during -command line processing, with @ARGV full of any options that ikiwiki was +command line processing, with `@ARGV` full of any options that ikiwiki was not able to process on its own. The function should process any options it -can, removing them from @ARGV, and probably recording the configuration -settings in %config. It should take care not to abort if it sees +can, removing them from `@ARGV`, and probably recording the configuration +settings in `%config`. It should take care not to abort if it sees an option it cannot process, and should just skip over those options and -leave them in @ARGV. +leave them in `@ARGV`. ### checkconfig hook(type => "checkconfig", id => "foo", call => \&checkconfig); This is useful if the plugin needs to check for or modify ikiwiki's -configuration. It's called early in the startup process. The -function is passed no values. It's ok for the function to call +configuration. It's called early in the startup process. 
`%config` +is populated at this point, but other state has not yet been loaded. +The function is passed no values. It's ok for the function to call `error()` if something isn't configured right. ### refresh @@ -96,10 +177,15 @@ function is passed no values. hook(type => "needsbuild", id => "foo", call => \&needsbuild); -This allows a plugin to manipulate the list of files that need to be -built when the wiki is refreshed. The function is passed a reference to an -array of files that will be rebuilt, and can modify the array, either -adding or removing files from it. +This allows a plugin to observe or even manipulate the list of files that +need to be built when the wiki is refreshed. + +As its first parameter, the function is passed a reference to an array of +files that will be built. It should return an array reference that is a +modified version of its input. It can add or remove files from it. + +The second parameter passed to the function is a reference to an array of +files that have been deleted. ### scan @@ -117,8 +203,8 @@ value is ignored. hook(type => "filter", id => "foo", call => \&filter); -Runs on the raw source of a page, before anything else touches it, and can -make arbitrary changes. The function is passed named parameters "page", +Runs on the full raw source of a page, before anything else touches it, and +can make arbitrary changes. The function is passed named parameters "page", "destpage", and "content". It should return the filtered content. ### preprocess @@ -201,11 +287,22 @@ like `Makefile` that have no extension. If `hook` is passed an optional "longname" parameter, this value is used when prompting a user to choose a page type on the edit page form. +### indexhtml + + hook(type => "indexhtml", id => "foo", call => \&indexhtml); + +This hook is called once the page has been converted to html (but before +the generated html is put in a template). The most common use is to +update search indexes. Added in ikiwiki 2.54. + +The function is passed named parameters "page", "destpage", and "content". +Its return value is ignored. + ### pagetemplate hook(type => "pagetemplate", id => "foo", call => \&pagetemplate); -[[Templates|wikitemplates]] are filled out for many different things in +[[Templates]] are filled out for many different things in ikiwiki, like generating a page, or part of a blog page, or an rss feed, or a cgi. This hook allows modifying the variables available on those templates. The function is passed named parameters. The "page" and @@ -221,11 +318,20 @@ a new custom parameter to the template. hook(type => "templatefile", id => "foo", call => \&templatefile); -This hook allows plugins to change the [[template|wikitemplates]] that is +This hook allows plugins to change the [[template|templates]] that is used for a page in the wiki. The hook is passed a "page" parameter, and -should return the name of the template file to use, or undef if it doesn't -want to change the default ("page.tmpl"). Template files are looked for in -/usr/share/ikiwiki/templates by default. +should return the name of the template file to use (relative to the +template directory), or undef if it doesn't want to change the default +("page.tmpl"). + +### pageactions + + hook(type => "pageactions", id => "foo", call => \&pageactions); + +This hook allows plugins to add arbitrary actions to the action bar on a +page (next to Edit, RecentChanges, etc). The hook is passed a "page" +parameter, and can return a list of html fragments to add to the action +bar. 
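A small sketch of such a hook (the action added here, and its link target,
are invented for illustration):

    hook(type => "pageactions", id => "foo", call => \&pageactions);

    sub pageactions {
        my %params=@_;
        my $page=$params{page};
        # each returned fragment becomes one entry in the action bar
        return '<a href="'.$config{cgiurl}.'?do=foo;page='.$page.'">Foo</a>';
    }

Anything returned is simply displayed alongside the existing actions, such
as Edit and RecentChanges.
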
### sanitize @@ -237,17 +343,6 @@ modify the body of a page after it has been fully converted to html. The function is passed named parameters: "page", "destpage", and "content", and should return the sanitized content. -### postscan - - hook(type => "postscan", id => "foo", call => \&postscan); - -This hook is called once the full page body is available (but before the -format hook). The most common use is to update search indexes. Added in -ikiwiki 2.54. - -The function is passed named parameters "page" and "content". Its return -value is ignored. - ### format hook(type => "format", id => "foo", call => \&format); @@ -455,7 +550,13 @@ The data returned is a list of `%config` options, followed by a hash describing the option. There can also be an item named "plugin", which describes the plugin as a whole. For example: - return + return + plugin => { + description => "description of this plugin", + safe => 1, + rebuild => 1, + section => "misc", + }, option_foo => { type => "boolean", description => "enable foo?", @@ -470,11 +571,6 @@ describes the plugin as a whole. For example: safe => 1, rebuild => 0, }, - plugin => { - description => "description of this plugin", - safe => 1, - rebuild => 1, - }, * `type` can be "boolean", "string", "integer", "pagespec", or "internal" (used for values that are not user-visible). The type is @@ -495,36 +591,38 @@ describes the plugin as a whole. For example: the plugin) will require a wiki rebuild, false if no rebuild is needed, and undef if a rebuild could be needed in some circumstances, but is not strictly required. +* `section` can optionally specify which section in the config file + the plugin fits in. The convention is to name the sections the + same as the tags used for [[plugins|plugin]] on this wiki. ### genwrapper hook(type => "genwrapper", id => "foo", call => \&genwrapper); This hook is used to inject C code (which it returns) into the `main` -function of the ikiwiki wrapper when it is being generated. +function of the ikiwiki wrapper when it is being generated. -## Plugin interface +The code runs before anything else -- in particular it runs before +the suid wrapper has sanitized its environment. -To import the ikiwiki plugin interface: +### disable - use IkiWiki '3.00'; + hook(type => "disable", id => "foo", call => \&disable); -This will import several variables and functions into your plugin's -namespace. These variables and functions are the ones most plugins need, -and a special effort will be made to avoid changing them in incompatible -ways, and to document any changes that have to be made in the future. +This hook is only run when a previously enabled plugin gets disabled +during ikiwiki setup. Plugins can use this to perform cleanups. -Note that IkiWiki also provides other variables and functions that are not -exported by default. No guarantee is made about these in the future, so if -it's not exported, the wise choice is to not use it. +## Exported variables + +Several variables are exported to your plugin when you `use IkiWiki;` -### %config +### `%config` A plugin can access the wiki's configuration via the `%config` hash. The best way to understand the contents of the hash is to look at your ikiwiki setup file, which sets the hash content to configure the wiki. -### %pagestate +### `%pagestate` The `%pagestate` hash can be used by plugins to save state that they will need next time ikiwiki is run. 
The hash holds per-page state, so to set a value, @@ -542,7 +640,7 @@ When pages are deleted, ikiwiki automatically deletes their pagestate too. Note that page state does not persist across wiki rebuilds, only across wiki updates. -### %wikistate +### `%wikistate` The `%wikistate` hash can be used by a plugin to store persistant state that is not bound to any one page. To set a value, use @@ -551,23 +649,53 @@ serialize, `$key` is any string you like, and `$id` must be the same as the "id" parameter passed to `hook()` when registering the plugin, so that the state can be dropped if the plugin is no longer used. -### Other variables +### `%links` + +The `%links` hash can be used to look up the names of each page that +a page links to. The name of the page is the key; the value is an array +reference. Do not modify this hash directly; call `add_link()`. + + $links{"foo"} = ["bar", "baz"]; -If your plugin needs to access data about other pages in the wiki. It can -use the following hashes, using a page name as the key: +### `%typedlinks` -* `%links` lists the names of each page that a page links to, in an array - reference. -* `%destsources` contains the name of the source file used to create each - destination file. -* `%pagesources` contains the name of the source file for each page. +The `%typedlinks` hash records links of specific types. Do not modify this +hash directly; call `add_link()`. The keys are page names, and the values +are hash references. In each page's hash reference, the keys are link types +defined by plugins, and the values are hash references with link targets +as keys, and 1 as a dummy value, something like this: -Also, the `%IkiWiki::version` variable contains the version number for the -ikiwiki program. + $typedlinks{"foo"} = { + tag => { short_word => 1, metasyntactic_variable => 1 }, + next_page => { bar => 1 }, + }; -### Library functions +Ordinary [[WikiLinks|ikiwiki/WikiLink]] appear in `%links`, but not in +`%typedlinks`. -#### `hook(@)` +### `%pagesources` + +The `%pagesources` has can be used to look up the source filename +of a page. So the key is the page name, and the value is the source +filename. Do not modify this hash. + + $pagesources{"foo"} = "foo.mdwn"; + +### `%destsources` + +The `%destsources` hash records the name of the source file used to +create each destination file. The key is the output filename (ie, +"foo/index.html"), and the value is the source filename that it was built +from (eg, "foo.mdwn"). Note that a single source file may create multiple +destination files. Do not modify this hash directly; call `will_render()`. + + $destsources{"foo/index.html"} = "foo.mdwn"; + +## Library functions + +Several functions are exported to your plugin when you `use IkiWiki;` + +### `hook(@)` Hook into ikiwiki's processing. See the discussion of hooks above. @@ -576,12 +704,12 @@ named `no_override` is supported, If it's set to a true value, then this hook will not override any existing hook with the same id. This is useful if the id can be controled by the user. -#### `debug($)` +### `debug($)` Logs a debugging message. These are supressed unless verbose mode is turned on. -#### `error($;$)` +### `error($;$)` Aborts with an error message. If the second parameter is passed, it is a function that is called after the error message is printed, to do any final @@ -595,37 +723,42 @@ In other hooks, error() is a fatal error, so use with care. 
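In a directive's `preprocess` function, for instance, the error message is normally displayed in place of the directive rather than stopping the build, so parameter validation is often written along these lines (a minimal sketch; the `text` parameter is made up):

    sub preprocess {
        my %params=@_;
        if (! exists $params{text}) {
            # Reject the directive with a message shown on the page.
            error gettext("missing text parameter");
        }
        return $params{text};
    }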
Try to avoid dying on bad input when building a page, as that will halt the entire wiki build and make the wiki unusable. -#### `template($;@)` +### `template($;@)` -Creates and returns a [[!cpan HTML::Template]] object. The first parameter -is the name of the file in the template directory. The optional remaining +Creates and returns a [[!cpan HTML::Template]] object. (In a list context, +returns the parameters needed to construct the obhect.) + +The first parameter is the name of the template file. The optional remaining parameters are passed to `HTML::Template->new`. -#### `htmlpage($)` +Normally, the template file is first looked for in the templates/ subdirectory +of the srcdir. Failing that, it is looked for in the templatedir. -Passed a page name, returns the base name that will be used for a the html -page created from it. (Ie, it appends ".html".) +Wiki pages can be used as templates. This should be done only for templates +which it is safe to let wiki users edit. Enable it by passing a filename +with no ".tmpl" extension. Template pages are normally looked for in +the templates/ directory. If the page name starts with "/", a page +elsewhere in the wiki can be used. -Use this when constructing the filename of a html file. Use `urlto` when -generating a link to a page. +If the template is not found, or contains a syntax error, an error is thrown. -### `deptype(@)` +### `template_depends($$;@)` -Use this function to generate ikiwiki's internal representation of a -dependency type from one or more of these keywords: +Use this instead of `template()` if the content of a template is being +included into a page. This causes the page to depend on the template, +so it will be updated if the template is modified. -* `content` is the default. Any change to the content - of a page triggers the dependency. -* `presence` is only triggered by a change to the presence - of a page. -* `links` is only triggered by a change to the links of a page. - This includes when a link is added, removed, or changes what - it points to due to other changes. It does not include the - addition or removal of a duplicate link. +Like `template()`, except the second parameter is the page. -If multiple types are specified, they are combined. +### `htmlpage($)` -#### `pagespec_match_list($$;@)` +Passed a page name, returns the base name that will be used for a the html +page created from it. (Ie, it appends ".html".) + +Use this when constructing the filename of a html file. Use `urlto` when +generating a link to a page. + +### `pagespec_match_list($$;@)` Passed a page name, and [[ikiwiki/PageSpec]], returns a list of pages in the wiki that match the [[ikiwiki/PageSpec]]. @@ -646,7 +779,10 @@ Additional named parameters can be specified: * `filter` is a reference to a function, that is called and passed a page, and returns true if the page should be filtered out of the list. * `sort` specifies a sort order for the list. See - [[ikiwiki/PageSpec/sorting]] for the avilable sort methods. + [[ikiwiki/PageSpec/sorting]] for the avilable sort methods. Note that + if a sort method is specified that depends on the + page content (such as 'meta(foo)'), the deptype needs to be set to + a content dependency. * `reverse` if true, sorts in reverse. * `num` if nonzero, specifies the maximum number of matching pages that will be returned. @@ -656,7 +792,7 @@ Additional named parameters can be specified: Any other named parameters are passed on to `pagespec_match`, to further limit the match. 
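For example, a directive that lists matching pages might call it like this. A sketch, assuming the directive takes a `pages` argument and is handled in a `preprocess` hook:

    sub preprocess {
        my %params=@_;
        my @matches=pagespec_match_list($params{page}, $params{pages},
            deptype => deptype("presence"),
            sort => "title",
            num => 10,
        );
        # Link to each matching page from the page being built.
        return join(", ",
            map { htmllink($params{page}, $params{destpage}, $_) } @matches);
    }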
-#### `add_depends($$;$)` +### `add_depends($$;$)` Makes the specified page depend on the specified [[ikiwiki/PageSpec]]. @@ -664,7 +800,7 @@ By default, dependencies are full content dependencies, meaning that the page will be updated whenever anything matching the PageSpec is modified. This can be overridden by passing a `deptype` value as the third parameter. -#### `pagespec_match($$;@)` +### `pagespec_match($$;@)` Passed a page name, and [[ikiwiki/PageSpec]], returns a true value if the [[ikiwiki/PageSpec]] matches the page. @@ -678,7 +814,23 @@ The most often used is "location", which specifies the location the PageSpec should match against. If not passed, relative PageSpecs will match relative to the top of the wiki. -#### `bestlink($$)` +### `deptype(@)` + +Use this function to generate ikiwiki's internal representation of a +dependency type from one or more of these keywords: + +* `content` is the default. Any change to the content + of a page triggers the dependency. +* `presence` is only triggered by a change to the presence + of a page. +* `links` is only triggered by a change to the links of a page. + This includes when a link is added, removed, or changes what + it points to due to other changes. It does not include the + addition or removal of a duplicate link. + +If multiple types are specified, they are combined. + +### `bestlink($$)` Given a page and the text of a link on the page, determine which existing page that link best points to. Prefers pages under a @@ -686,7 +838,7 @@ subdirectory with the same name as the source page, failing that goes down the directory tree to the base looking for matching pages, as described in [[ikiwiki/SubPage/LinkingRules]]. -#### `htmllink($$$;@)` +### `htmllink($$$;@)` Many plugins need to generate html links and add them to a page. This is done by using the `htmllink` function. The usual way to call @@ -712,8 +864,9 @@ control some options. These are: * anchor - set to make the link include an anchor * rel - set to add a rel attribute to the link * class - set to add a css class to the link +* title - set to add a title attribute to the link -#### `readfile($;$)` +### `readfile($;$)` Given a filename, reads and returns the entire file. @@ -722,7 +875,7 @@ in binary mode. A failure to read the file will result in it dying with an error. -#### `writefile($$$;$$)` +### `writefile($$$;$$)` Given a filename, a directory to put it in, and the file's content, writes a file. @@ -750,7 +903,7 @@ generally the directory parameter is a trusted toplevel directory like the srcdir or destdir, and any subdirectories of this are included in the filename parameter. -#### `will_render($$)` +### `will_render($$)` Given a page name and a destination file name (not including the base destination directory), register that the page will result in that file @@ -766,34 +919,34 @@ Ikiwiki uses this information to automatically clean up rendered files when the page that rendered them goes away or is changed to no longer render them. will_render also does a few important security checks. -#### `pagetype($)` +### `pagetype($)` Given the name of a source file, returns the type of page it is, if it's a type that ikiwiki knowns how to htmlize. Otherwise, returns undef. -#### `pagename($)` +### `pagename($)` Given the name of a source file, returns the name of the wiki page that corresponds to that file. -#### `pagetitle($)` +### `pagetitle($)` Give the name of a wiki page, returns a version suitable to be displayed as the page's title. 
This is accomplished by de-escaping escaped characters in the page name. "_" is replaced with a space, and '__NN__' is replaced by the UTF character with code NN. -#### `titlepage($)` +### `titlepage($)` This performs the inverse of `pagetitle`, ie, it converts a page title into a wiki page name. -#### `linkpage($)` +### `linkpage($)` This converts text that could have been entered by the user as a [[ikiwiki/WikiLink]] into a wiki page name. -#### `srcfile($;$)` +### `srcfile($;$)` Given the name of a source file in the wiki, searches for the file in the source directory and the underlay directories (most recently added @@ -803,7 +956,7 @@ Normally srcfile will fail with an error message if the source file cannot be found. The second parameter can be set to a true value to make it return undef instead. -#### `add_underlay($)` +### `add_underlay($)` Adds a directory to the set of underlay directories that ikiwiki will search for files. @@ -811,33 +964,48 @@ search for files. If the directory name is not absolute, ikiwiki will assume it is in the parent directory of the configured underlaydir. -#### `displaytime($;$)` +### `displaytime($;$$)` Given a time, formats it for display. The optional second parameter is a strftime format to use to format the time. -#### `gettext` +If the third parameter is true, this is the publication time of a page. +(Ie, set the html5 pubdate attribute.) + +### `gettext` This is the standard gettext function, although slightly optimised. -#### `urlto($$;$)` +### `ngettext` + +This is the standard ngettext function, although slightly optimised. + +### `urlto($;$$)` Construct a relative url to the first parameter from the page named by the second. The first parameter can be either a page name, or some other destination file, as registered by `will_render`. -If the third parameter is passed and is true, an absolute url will be -constructed instead of the default relative url. +Provide a second parameter whenever possible, since this leads to better +behaviour for the [[plugins/po]] plugin and `file:///` URLs. + +If the second parameter is not specified (or `undef`), the URL will be +valid from any page on the wiki, or from the CGI; if possible it'll +be a path starting with `/`, but an absolute URL will be used if +the wiki and the CGI are on different domains. -#### `newpagefile($$)` +If the third parameter is passed and is true, the url will be a fully +absolute url. This is useful when generating an url to publish elsewhere. + +### `newpagefile($$)` This can be called when creating a new page, to determine what filename to save the page to. It's passed a page name, and its type, and returns the name of the file to create, relative to the srcdir. -#### `targetpage($$;$)` +### `targetpage($$;$)` Passed a page and an extension, returns the filename that page will be rendered to. @@ -846,11 +1014,31 @@ Optionally, a third parameter can be passed, to specify the preferred filename of the page. For example, `targetpage("foo", "rss", "feed")` will yield something like `foo/feed.rss`. -#### `add_link($$)` +### `add_link($$;$)` This adds a link to `%links`, ensuring that duplicate links are not added. Pass it the page that contains the link, and the link text. +An optional third parameter sets the link type. If not specified, +it is an ordinary [[ikiwiki/WikiLink]]. + +### `add_autofile($$$)` + +Sometimes you may want to add a file to the `srcdir` as a result of content +of other pages. For example, [[plugins/tag]] pages can be automatically +created as needed. 
This function can be used to do that. + +The three parameters are the filename to create (relative to the `srcdir`), +the name of the plugin, and a callback function. The callback will be +called if it is appropriate to automatically add the file, and should then +take care of creating it, and doing anything else it needs to (such as +checking it into revision control). Note that the callback may not always +be called. For example, if an automatically added file is deleted by the +user, ikiwiki will avoid re-adding it again. + +This function needs to be called during the scan hook, or earlier in the +build process, in order to add the file early enough for it to be built. + ## Miscellaneous ### Internal use pages @@ -888,16 +1076,20 @@ token, that will be passed into `rcs_commit` when committing. For example, it might return the current revision ID of the file, and use that information later when merging changes. -#### `rcs_commit($$$;$$)` +#### `rcs_commit(@)` + +Passed named parameters: `file`, `message`, `token` (from `rcs_prepedit`), +and `session` (optional). -Passed a file, message, token (from `rcs_prepedit`), user, and ip address. Should try to commit the file. Returns `undef` on *success* and a version of the page with the rcs's conflict markers on failure. -#### `rcs_commit_staged($$$)` +#### `rcs_commit_staged(@)` + +Passed named parameters: `message`, and `session` (optional). -Passed a message, user, and ip address. Should commit all staged changes. -Returns undef on success, and an error message on failure. +Should commit all staged changes. Returns undef on success, and an +error message on failure. Changes can be staged by calls to `rcs_add`, `rcs_remove`, and `rcs_rename`. @@ -940,7 +1132,9 @@ The data structure returned for each change is: { rev => # the RCSs id for this commit - user => # name of user who made the change, + user => # user who made the change (may be an openid), + nickname => # short name for user (optional; not an openid), + committype => # either "web" or the name of the rcs, when => # time when the change was made, message => [ @@ -957,19 +1151,30 @@ The data structure returned for each change is: ], } -#### `rcs_diff($)` +#### `rcs_diff($;$)` + +The first parameter is the rev from `rcs_recentchanges`. +The optional second parameter is how many lines to return (default: all). -The parameter is the rev from `rcs_recentchanges`. Should return a list of lines of the diff (including \n) in list -context, and the whole diff in scalar context. +context, and a string containing the whole diff in scalar context. #### `rcs_getctime($)` This is used to get the page creation time for a file from the RCS, by looking it up in the history. +If the RCS cannot determine a ctime for the file, return 0. + +#### `rcs_getmtime($)` + +This is used to get the page modification time for a file from the RCS, by +looking it up in the history. + It's ok if this is not implemented, and throws an error. +If the RCS cannot determine a mtime for the file, return 0. + #### `rcs_receive()` This is called when ikiwiki is running as a pre-receive hook (or @@ -979,9 +1184,9 @@ sense to implement for all RCSs. It should examine the incoming changes, and do any sanity checks that are appropriate for the RCS to limit changes to safe file adds, -removes, and changes. If something bad is found, it should exit -nonzero, to abort the push. Otherwise, it should return a list of -files that were changed, in the form: +removes, and changes. 
If something bad is found, it should die, to abort +the push. Otherwise, it should return a list of files that were changed, +in the form: { file => # name of file that was changed @@ -994,6 +1199,28 @@ files that were changed, in the form: The list will then be checked to make sure that each change is one that is allowed to be made via the web interface. +#### `rcs_preprevert($)` + +This is called by the revert web interface. It is passed a RCS-specific +change ID, and should determine what the effects would be of reverting +that change, and return the same data structure as `rcs_receive`. + +Like `rcs_receive`, it should do whatever sanity checks are appropriate +for the RCS to limit changes to safe changes, and die if a change would +be unsafe to revert. + +#### `rcs_revert($)` + +This is called by the revert web interface. It is passed a named +parameter rev that is the RCS-specific change ID to revert. + +It should try to revert the specified rev, and leave the reversion staged +so `rcs_commit_staged` will complete it. It should return undef on _success_ +and an error message on failure. + +This hook and `rcs_preprevert` are optional, if not implemented, no revert +web interface will be available. + ### PageSpec plugins It's also possible to write plugins that add new functions to @@ -1017,6 +1244,24 @@ For example, "backlink(foo)" is influenced by the contents of page foo; they match; "created_before(foo)" is influenced by the metadata of foo; while "glob(*)" is not influenced by the contents of any page. +### Sorting plugins + +Similarly, it's possible to write plugins that add new functions as +[[ikiwiki/pagespec/sorting]] methods. To achieve this, add a function to +the IkiWiki::SortSpec package named `cmp_foo`, which will be used when sorting +by `foo` or `foo(...)` is requested. + +The names of pages to be compared are in the global variables `$a` and `$b` +in the IkiWiki::SortSpec package. The function should return the same thing +as Perl's `cmp` and `<=>` operators: negative if `$a` is less than `$b`, +positive if `$a` is greater, or zero if they are considered equal. It may +also raise an error using `error`, for instance if it needs a parameter but +one isn't provided. + +The function will also be passed one or more parameters. The first is +`undef` if invoked as `foo`, or the parameter `"bar"` if invoked as `foo(bar)`; +it may also be passed additional, named parameters. + ### Setup plugins The ikiwiki setup file is loaded using a pluggable mechanism. If you look diff --git a/doc/plugins/write/external.mdwn b/doc/plugins/write/external.mdwn index e30bf2ff3..a3fbe8a2c 100644 --- a/doc/plugins/write/external.mdwn +++ b/doc/plugins/write/external.mdwn @@ -1,7 +1,7 @@ External plugins are standalone, executable programs, that can be written in any language. When ikiwiki starts up, it runs the program, and -communicates with it using [XML RPC][xmlrpc]. If you want to [[write]] an external -plugin, read on.. +communicates with it using [XML RPC][xmlrpc]. If you want to [[write]] an +external plugin, read on.. [xmlrpc]: http://www.xmlrpc.com/ @@ -85,8 +85,8 @@ language as part of their XML RPC interface. XML RPC has a limitation that it does not have a way to pass undef/NULL/None. There is an extension to the protocol that supports this, -but it is not yet available in the [[!cpan XML::RPC]] library used by -ikiwiki. +but it is not yet available in all versions of the [[!cpan XML::RPC]] library +used by ikiwiki. 
Until the extension is available, ikiwiki allows undef to be communicated over XML RPC by passing a sentinal value, a hash with a single key "null" diff --git a/doc/quotes.mdwn b/doc/quotes.mdwn new file mode 100644 index 000000000..22f3a28d8 --- /dev/null +++ b/doc/quotes.mdwn @@ -0,0 +1,3 @@ +Collecting some happy quotes about ikiwiki here. + +[[!inline pages="quotes/* and !*/Discussion"]] diff --git a/doc/quotes/pizza.mdwn b/doc/quotes/pizza.mdwn new file mode 100644 index 000000000..34899edfb --- /dev/null +++ b/doc/quotes/pizza.mdwn @@ -0,0 +1,4 @@ +> Best. Wiki. Ever. Now a wiki that I can't "git clone" and "git push" to is +> like a pizza that I have to eat with a knife and fork. + +-- [Don Marti](http://lwn.net/Articles/360888/) diff --git a/doc/quotes/pizza/discussion.mdwn b/doc/quotes/pizza/discussion.mdwn new file mode 100644 index 000000000..ecf8c44a6 --- /dev/null +++ b/doc/quotes/pizza/discussion.mdwn @@ -0,0 +1 @@ +It would be cool to know where this was written (assuming it was somewhere public :-)) Googling around, I've found a few places Don has recommended ikiwiki, but not in such glowing terms. -- [[Jon]] diff --git a/doc/quotes/sold.mdwn b/doc/quotes/sold.mdwn new file mode 100644 index 000000000..4bd021f74 --- /dev/null +++ b/doc/quotes/sold.mdwn @@ -0,0 +1,3 @@ +I'm totally sold on ikiwiki now. + +-- Anna Hess diff --git a/doc/rcs.mdwn b/doc/rcs.mdwn index f66b85495..4d75d6325 100644 --- a/doc/rcs.mdwn +++ b/doc/rcs.mdwn @@ -6,11 +6,39 @@ histories. Ikiwiki started out supporting only [[Subversion|svn]], but the interface ikiwiki uses to a revision control system is sufficiently simple and generic that it can be adapted to work with many systems by writing a -[[plugin|plugins/write]]. [[Subversion|svn]] is still a recommended choice; -[[git]] is another well-tested option. +[[plugin|plugins/write]]. These days, most people use [[git]]. -These are all the supported revision control systems: -[[!inline pages="rcs/* and !*/Discussion and !rcs/details" archive=yes]] +While all supported revision control systems work well enough for basic +use, some advanced or special features are not supported in all of them. +The table below summarises this for each revision control system and +links to more information about each. + +[[!table data=""" +feature |[[git]]|[[svn]]|[[bzr]] |[[monotone]]|[[mercurial]]|[[darcs]]|[[tla]] |[[cvs]] +[[ikiwiki-makerepo]]|yes |yes |yes |yes |yes |yes |no |yes +auto.setup |yes |yes |incomplete|yes |incomplete |yes |incomplete|yes +`rcs_commit_staged` |yes |yes |yes |yes |no |yes |no |yes +`rcs_rename` |yes |yes |yes |yes |no |yes |no |yes +`rcs_remove` |yes |yes |yes |yes |no |yes |no |yes +`rcs_diff` |yes |yes |yes |yes |no |yes |yes |yes +`rcs_getctime` |fast |slow |slow |slow |slow |slow |slow |slow +`rcs_getmtime` |fast |slow |slow |slow |no |no |no |no +`rcs_preprevert` |yes |no |no |no |no |no |no |no +`rcs_revert` |yes |no |no |no |no |no |no |no +anonymous push |yes |no |no |no |no |no |no |no +conflict handling |yes |yes |yes |buggy |yes |yes |yes |yes +openid username |yes |no |no |no |no |no |no |no +"""]] + +Notes: + +* Lack of support in [[ikiwiki-makerepo]] or auto.setup can make it harder to + set up a wiki using that revision control system. +* The `rcs_commit_staged` hook is needed to use [[attachments|plugins/attachment]] + or [[plugins/comments]]. +* `rcs_getctime` and `rcs_getmtime` may be implemented in a fast way (ie, one log + lookup for all files), or very slowly (one lookup per file). 
+* Openid username support allows avoiding display of Google's ugly openids. There is a page with [[details]] about how the different systems work with ikiwiki, for the curious. diff --git a/doc/rcs/cvs.mdwn b/doc/rcs/cvs.mdwn index f0bd0f6f0..9beb08ece 100644 --- a/doc/rcs/cvs.mdwn +++ b/doc/rcs/cvs.mdwn @@ -20,8 +20,9 @@ Consider creating `$HOME/.cvsrc` if you don't have one already; the plugin doesn * creates a repository, * imports `$SRCDIR` into top-level module `ikiwiki` (vendor tag IKIWIKI, release tag PRE_CVS), * configures the post-commit hook in `CVSROOT/loginfo`. -* CVS multi-directory commits happen separately; the post-commit hook sees only the first directory's changes in time for [[recentchanges|plugins/recentchanges]]. The next run of `ikiwiki --setup` will correctly re-render such a recentchanges entry. It should be possible to solve this problem with NetBSD's `commit_prep` and `log_accum` scripts (see below). ### To do -* Instead of resource-intensively scraping changesets with `cvsps`, have `ikiwiki-makerepo` set up NetBSD-like `log_accum` and `commit_prep` scripts that coalesce and keep records of commits. `cvsps` can be used as a fallback for repositories without such records. +* Have `ikiwiki-makerepo` set up NetBSD-like `log_accum` and `commit_prep` scripts that coalesce commits into changesets. Reasons: + 7. Obviates the need to scrape the repo's complete history to determine the last N changesets. (Repositories without such records can fall back on the `cvsps` and `File::ReadBackwards` code.) + 7. Arranges for ikiwiki to be run once per changeset, rather than CVS's once per committed file (!), which is a waste at best and bug-inducing at worst. (Currently, on multi-directory commits, only the first directory's changes get mentioned in [[recentchanges|plugins/recentchanges]].) * Perhaps prevent web edits from attempting to create `.../CVS/foo.mdwn` (and `.../cvs/foo.mdwn` on case-insensitive filesystems); thanks to the CVS metadata directory, the attempt will fail anyway (and much more confusingly) if we don't. diff --git a/doc/rcs/git.mdwn b/doc/rcs/git.mdwn index 000eb0b3c..1b66493dd 100644 --- a/doc/rcs/git.mdwn +++ b/doc/rcs/git.mdwn @@ -28,12 +28,7 @@ updates the published wiki itself. The other (optional) leaf node repositories are meant for you to work on, and commit to, changes should then be pushed to the bare root -repository. In theory, you could work on the same leaf node repository -that ikiwiki uses to compile the wiki from, and the [[cgi]] commits -to, as long as you ensure that permissions and ownership don't hinder -the working of the [[cgi]]. This can be done, for example, by using -ACL's, in practice, it is easier to just setup separate clones for -yourself. +repository. So, to reiterate, when using Git, you probably want to set up three repositories: @@ -41,9 +36,9 @@ repositories: * The root repository. This should be a bare repository (meaning that it does not have a working tree checked out), which the other repositories will push to/pull from. It is a bare repository, since - there are problems pushing to a repository that has a working + git does not support pushing to a repository that has a working directory. This is called _repository_ in [[ikiwiki-makerepo]]'s - manual page. Nominally, this bare repository has a `post-update` hook + manual page. 
This bare repository has a `post-update` hook that either is or calls ikiwiki's git wrapper, which changes to the working directory for ikiwiki, does a `git pull`, and refreshes ikiwiki to regenerate the wiki with any new content. The [[setup]] page describes @@ -51,17 +46,18 @@ repositories: * The second repository is a clone of the bare root repository, and has a working tree which is used as ikiwiki's srcdir for compiling - the wiki. **Never** push to this repository. When running as a - [[cgi]], the changes are committed to this repository, and pushed to - the master repository above. This is called _srcdir_ in - [[ikiwiki-makerepo]]'s manual page. + the wiki. **Never** push to this repository. It is wise to not make + changes or commits directly to this repository, to avoid conflicting + with ikiwiki's own changes. When running as a [[cgi]], the changes + are committed to this repository, and pushed to the master repository + above. This is called _srcdir_ in [[ikiwiki-makerepo]]'s manual page. * The other (third, fourth, fifth, sixth -- however many pleases you) repositories are also clones of the bare root repository above -- and these have a working directory for you to work on. Use either the `git` transport (if available), or `ssh`. These repositories may - be on remote machines, your laptop, whereever you find convenient to - hack on your wiki. you can commit local changes to the version on + be on remote machines, your laptop, wherever you find convenient to + hack on your wiki. You can commit local changes to the version on the laptop, perhaps while offline. Any new content should be pushed to the bare master repository when you are ready to publish it, and then the post-update hook of the bare repository will ensure that the @@ -87,8 +83,8 @@ It can be tricky to get the permissions right to allow multiple people to commit to an ikiwiki git repository. As the [[security]] page mentions, for a secure ikiwiki installation, only one person should be able to write to ikiwiki's srcdir. When other committers make commits, their commits -should go to the bare repository, which has a `post-update` hook that uses -ikiwiki to pull the changes to the srcdir. +should be pushed to the bare repository, which has a `post-update` hook +that uses ikiwiki to pull the changes to the srcdir. One setup that will work is to put all committers in a group (say, "ikiwiki"), and use permissions to allow that group to commit to the bare git diff --git a/doc/rcs/svn.mdwn b/doc/rcs/svn.mdwn index f8c44b6eb..7aa682978 100644 --- a/doc/rcs/svn.mdwn +++ b/doc/rcs/svn.mdwn @@ -1,4 +1,4 @@ -[Subversion](http://subversion.tigris.org/) is a revision control system. While ikiwiki is relatively +[Subversion](http://subversion.tigris.org/) is a [[revision control system|rcs]]. While ikiwiki is relatively independent of the underlying revision control system, and can easily be used without one, using it with Subversion or another revision control system is recommended. diff --git a/doc/rcs/tla.mdwn b/doc/rcs/tla.mdwn index cad5d51f4..79eecd627 100644 --- a/doc/rcs/tla.mdwn +++ b/doc/rcs/tla.mdwn @@ -2,6 +2,9 @@ [GNU](http://www.gnu.org/) [Arch](http://www.gnuarch.org/) revision control system. Ikiwiki supports storing a wiki in tla. +Warning: Since tla is not being maintained, neither is this plugin, and +using ikiwiki with tla is not recommended. + Ikiwiki can run as a [[post-commit]] hook to update a wiki whenever commits come in. 
When running as a [[cgi]] with tla, ikiwiki automatically commits edited pages to the Arch repostory, and uses the Arch diff --git a/doc/reviewed.mdwn b/doc/reviewed.mdwn new file mode 100644 index 000000000..14772a369 --- /dev/null +++ b/doc/reviewed.mdwn @@ -0,0 +1,7 @@ +This page lists [[branches]] that have been reviewed. If your branch +shows up here, the ball is back in your court, to respond to the review and +deal with whatever is preventing it from being merged into ikiwiki. Once +you do, remove the "reviewed" tag. + +[[!inline pages="(todo/* or bugs/*) and link(/branches) and !link(bugs/done) +and !link(todo/done) and !*/*/* and link(.)" show=0 archive=yes]] diff --git a/doc/roadmap.mdwn b/doc/roadmap.mdwn index a701a2685..134ebcb7b 100644 --- a/doc/roadmap.mdwn +++ b/doc/roadmap.mdwn @@ -69,6 +69,26 @@ backwards compatability. ---- +# compatability breaking changes + +Probably incomplete list: + +* Drop old `--getctime` option. +* Remove compatability code in `loadindex` to handle old index data layouts. +* Make pagespecs match relative by default? (see [[discussion]]) +* Flip wikilinks? (see [[todo/link_plugin_perhaps_too_general?]]) +* YADA format setup files per default? +* Enable tagbase by default (so that tag autocreation will work by default). + Note that this is already done for wikis created by `auto-blog.setup`. +* [[tips/html5]] on by default (some day..) +* Remove support for old `.ikiwiki/comments_pending` from comment plugin. +* Use yaml formatted setup files by default. (Not too compatability breaking + really.) + +In general, we try to use [[ikiwiki-transition]] or forced rebuilds on +upgrade to deal with changes that break compatability. Some things that +can't help with. + # future goals * Conversion support for existing other wikis. diff --git a/doc/roadmap/discussion.mdwn b/doc/roadmap/discussion.mdwn index 0b69867bf..8233b1990 100644 --- a/doc/roadmap/discussion.mdwn +++ b/doc/roadmap/discussion.mdwn @@ -3,6 +3,7 @@ backwards compatibility problems. Should this be marked as a future plan, perhap major version number like 2.0? --Ethan Yes, I'm looking at making this kind of change at 2.0, added to the list. +(Update: Didn't make it in 2.0 or 3.0...) However, I have doubts that it makes good sense to go relative by default. While it's not consitent with links, it seems to work better overall to have pagespecs be absolute by default, IMHO. --[[Joey]] diff --git a/doc/sandbox.mdwn b/doc/sandbox.mdwn index 8aedcbb9e..a5d686908 100644 --- a/doc/sandbox.mdwn +++ b/doc/sandbox.mdwn @@ -1,21 +1,30 @@ -This is the [[SandBox]], a page anyone can edit to try out ikiwiki (version [[!version ]]). +# Sandbox +This is the [[SandBox]], a page anyone can edit to try out ikiwiki +(version [[!version ]]). -# Header +[[!toc levels=1 startlevel=2 ]] -## Subheader +Testing this sandbox thing. + +## Blockquotes > This is a blockquote. > > This is the first level of quoting. > -> > This is nested blockquote. +> > This is a nested blockquote. > >> Without a space works too. >>> to three levels > > Back to the first level. +> It's kinda like e-mail... +>> ...but without the cool colored lines... +>>> ...and different font colors. +>>>> ...but it's nothing a little CSS can't fix. + Numbered list 1. First item. @@ -24,6 +33,7 @@ Numbered list 1. And another.. 1. foo 2. bar + 3. quz Bulleted list @@ -31,7 +41,10 @@ Bulleted list * *item* * item * one - * two + * footballs; runner; unices + * Cool ! + * Indeed. 
+ ---- @@ -47,15 +60,36 @@ Bulleted list * [[different_name_for_a_WikiLink|ikiwiki/WikiLink]] * <http://www.gnu.org/> * [GNU](http://www.gnu.org/) +* <a href="http://kitenet.net/~joey/">Joey's blog</a> + +---- + +# header1 + +## header2 + +### header3 ------ +#### header4 -[[!progress percent=27]] +##### header 5 -[[!progress percent=78]] +**bold** ------ +_italic_ + +---- This **SandBox** is also a [[blog]]! [[!inline pages="sandbox/* and !*/Discussion" rootpage="sandbox" show="4" archive="yes"]] +lkj;kj; + + +how do + +This is super cool Joey! + +Testing a change! + +Testing multilanguage support via utf-8: Ελληνικά. 日本語。 diff --git a/doc/sandbox/Fantasia.mdwn b/doc/sandbox/Fantasia.mdwn new file mode 100644 index 000000000..8845ec967 --- /dev/null +++ b/doc/sandbox/Fantasia.mdwn @@ -0,0 +1,10 @@ +>> Block +>>> Two Block + +[[blog]] blog + +* one +* two + +# one +# two diff --git a/doc/sandbox/Hey__33__.mdwn b/doc/sandbox/Hey__33__.mdwn new file mode 100644 index 000000000..6902ee32d --- /dev/null +++ b/doc/sandbox/Hey__33__.mdwn @@ -0,0 +1 @@ +Don't you love it... diff --git a/doc/sandbox/Just_a_new_post_with_non-latin_characters:_日本語.mdwn b/doc/sandbox/Just_a_new_post_with_non-latin_characters:_日本語.mdwn new file mode 100644 index 000000000..44b139a92 --- /dev/null +++ b/doc/sandbox/Just_a_new_post_with_non-latin_characters:_日本語.mdwn @@ -0,0 +1 @@ +Ελληνικά diff --git a/doc/sandbox/Mooooo.mdwn b/doc/sandbox/Mooooo.mdwn new file mode 100644 index 000000000..6f11d357c --- /dev/null +++ b/doc/sandbox/Mooooo.mdwn @@ -0,0 +1 @@ +Hrm. diff --git a/doc/sandbox/Nur_so..mdwn b/doc/sandbox/Nur_so..mdwn new file mode 100644 index 000000000..32c9f2397 --- /dev/null +++ b/doc/sandbox/Nur_so..mdwn @@ -0,0 +1 @@ +Das ist ein Test. diff --git a/doc/sandbox/Testing_blog_entry.mdwn b/doc/sandbox/Testing_blog_entry.mdwn new file mode 100644 index 000000000..aa5fa5b20 --- /dev/null +++ b/doc/sandbox/Testing_blog_entry.mdwn @@ -0,0 +1,7 @@ +# Be cool, this is a test! + +Hello guys, this is *just a test* entry. + +* Did I say +* that I love +* bulleted lists? diff --git a/doc/sandbox/adding_a_new_post.mdwn b/doc/sandbox/adding_a_new_post.mdwn new file mode 100644 index 000000000..b42ae703e --- /dev/null +++ b/doc/sandbox/adding_a_new_post.mdwn @@ -0,0 +1,3 @@ +Bob has many drives to archive his data, most of them kept offline, in a safe place. + +With git-annex, Bob has a single directory tree that includes all his files, even if their content is being stored offline. He can reorganize his files using that tree, committing new versions to git, without worry about accidentally deleting anything. diff --git a/doc/sandbox/bullet_list_and_code_test.mdwn b/doc/sandbox/bullet_list_and_code_test.mdwn new file mode 100644 index 000000000..a17729c90 --- /dev/null +++ b/doc/sandbox/bullet_list_and_code_test.mdwn @@ -0,0 +1,12 @@ +paragraph. + + code + + * bullet list + * bullet list + + more code + + * bullet list continued + + tailing code diff --git a/doc/sandbox/danc.mdwn b/doc/sandbox/danc.mdwn new file mode 100644 index 000000000..9766475a4 --- /dev/null +++ b/doc/sandbox/danc.mdwn @@ -0,0 +1 @@ +ok diff --git a/doc/sandbox/dateenumeration.mdwn b/doc/sandbox/dateenumeration.mdwn new file mode 100644 index 000000000..adc40bd23 --- /dev/null +++ b/doc/sandbox/dateenumeration.mdwn @@ -0,0 +1,4 @@ +* 1. January +* 23. February +* 99. March +* 7. 
November diff --git a/doc/sandbox/hey.mdwn b/doc/sandbox/hey.mdwn new file mode 100644 index 000000000..a955185ef --- /dev/null +++ b/doc/sandbox/hey.mdwn @@ -0,0 +1 @@ +* Hello diff --git a/doc/sandbox/plop.mdwn b/doc/sandbox/plop.mdwn new file mode 100644 index 000000000..e8b7c915c --- /dev/null +++ b/doc/sandbox/plop.mdwn @@ -0,0 +1 @@ +plop diff --git a/doc/sandbox/revert_me.mdwn b/doc/sandbox/revert_me.mdwn new file mode 100644 index 000000000..2b1cd2f94 --- /dev/null +++ b/doc/sandbox/revert_me.mdwn @@ -0,0 +1 @@ +this looks good diff --git a/doc/sandbox/sidebar.mdwn b/doc/sandbox/sidebar.mdwn new file mode 100644 index 000000000..9daeafb98 --- /dev/null +++ b/doc/sandbox/sidebar.mdwn @@ -0,0 +1 @@ +test diff --git a/doc/security.mdwn b/doc/security.mdwn index 3924186c2..353854656 100644 --- a/doc/security.mdwn +++ b/doc/security.mdwn @@ -22,8 +22,8 @@ this would be to limit web commits to those done by a certain user. ## other stuff to look at -I need to audit the git backend a bit, and have been meaning to -see if any CRLF injection type things can be done in the CGI code. +I have been meaning to see if any CRLF injection type things can be +done in the CGI code. ---- @@ -162,10 +162,11 @@ closed though. ## HTML::Template security -If the [[plugins/template]] plugin is enabled, users can modify templates -like any other part of the wiki. This assumes that HTML::Template is secure +If the [[plugins/template]] plugin is enabled, all users can modify templates +like any other part of the wiki. Some trusted users can modify templates +without it too. This assumes that HTML::Template is secure when used with untrusted/malicious templates. (Note that includes are not -allowed, so that's not a problem.) +allowed.) ---- @@ -427,3 +428,49 @@ enabling TeX configuration options that disallow unsafe TeX commands. The fix was released on 30 Aug 2009 in version 3.1415926, and was backported to stable in version 2.53.4. If you use the teximg plugin, I recommend upgrading. ([[!cve CVE-2009-2944]]) + +## javascript insertion via svg uris + +Ivan Shmakov pointed out that the htmlscrubber allowed `data:image/*` urls, +including `data:image/svg+xml`. But svg can contain javascript, so that is +unsafe. + +This hole was discovered on 12 March 2010 and fixed the same day +with the release of ikiwiki 3.20100312. +A fix was also backported to Debian etch, as version 2.53.5. I recommend +upgrading to one of these versions if your wiki can be edited by third +parties. + +## javascript insertion via insufficient htmlscrubbing of comments + +Kevin Riggle noticed that it was not possible to configure +`htmlscrubber_skip` to scrub comments while leaving unscubbed the text +of eg, blog posts. Confusingly, setting it to "* and !comment(*)" did not +scrub comments. + +Additionally, it was discovered that comments' html was never scrubbed during +preview or moderation of comments with such a configuration. + +These problems were discovered on 12 November 2010 and fixed the same +hour with the release of ikiwiki 3.20101112. ([[!cve CVE-2010-1673]]) + +## javascript insertion via insufficient checking in comments + +Dave B noticed that attempting to comment on an illegal page name could be +used for an XSS attack. + +This hole was discovered on 22 Jan 2011 and fixed the same day with +the release of ikiwiki 3.20110122. A fix was backported to Debian squeeze, +as version 3.20100815.5. An upgrade is recommended for sites +with the comments plugin enabled. 
([[!cve CVE-2011-0428]]) + +## possible javascript insertion via insufficient htmlscrubbing of alternate stylesheets + +Giuseppe Bilotta noticed that 'meta stylesheet` directives allowed anyone +who could upload a malicious stylesheet to a site to add it to a +page as an alternate stylesheet, or replacing the default stylesheet. + +This hole was discovered on 28 Mar 2011 and fixed the same hour with +the release of ikiwiki 3.20110328. An upgrade is recommended for sites +that have untrusted committers, or have the attachments plugin enabled. +([[!cve CVE-2011-1401]]) diff --git a/doc/setup.mdwn b/doc/setup.mdwn index 89444c9a8..ce51faa6d 100644 --- a/doc/setup.mdwn +++ b/doc/setup.mdwn @@ -4,7 +4,7 @@ This tutorial will walk you through setting up a wiki with ikiwiki. ## Install ikiwiki -If you're using Debian or Ubuntu, ikiwiki is an `apt-get install ikiwiki` away. +If you're using Debian or Ubuntu, ikiwiki is an <code><a href="http://www.debian.org/doc/manuals/debian-reference/ch02.en.html#_basic_package_management_operations">apt-get</a> install ikiwiki</code> away. If you're not, see the [[download]] and [[install]] pages. ## Create your wiki @@ -16,11 +16,13 @@ For more control, advanced users may prefer to set up a wiki [[by_hand|byhand]]. """]] - % ikiwiki -setup /etc/ikiwiki/auto.setup + % ikiwiki --setup /etc/ikiwiki/auto.setup Or, set up a blog with ikiwiki, run this command instead. - % ikiwiki -setup /etc/ikiwiki/auto-blog.setup + % ikiwiki --setup /etc/ikiwiki/auto-blog.setup + +`librpc-xml-perl` and `python-docutils` dependencies are needed. Either way, it will ask you a couple of questions. @@ -37,7 +39,7 @@ Then, wait for it to tell you an url for your new site.. destdir: ~/public_html/foo repository: ~/foo.git To modify settings, edit ~/foo.setup and then run: - ikiwiki -setup ~/foo.setup + ikiwiki --setup ~/foo.setup Done! @@ -47,7 +49,7 @@ Now you can go to the url it told you, and edit pages in your new wiki using the web interface. (If the web interface doesn't seem to allow editing or login, you may -need to configure [[configure_the_web_server|tips/dot_cgi]].) +need to [[configure_the_web_server|tips/dot_cgi]].) ## Checkout and edit wiki source @@ -65,8 +67,10 @@ source. (Remember to replace "foo" with the real directory name.) git clone foo.git foo.src svn checkout file://`pwd`/foo.svn/trunk foo.src + cvs -d `pwd`/foo get -P ikiwiki bzr clone foo foo.src hg clone foo foo.src + darcs get foo.darcs foo.src # TODO monotone, tla Now to edit pages by hand, go into the directory you checked out (ie, @@ -88,7 +92,7 @@ These range from changing the wiki's name, to enabling [[plugins]], to banning users and locking pages. If you log in as the admin user you configured earlier, and go to -your Preferences page, you can click on "Wiki Setup" to customize many +your Preferences page, you can click on "Setup" to customize many wiki settings and plugins. Some settings cannot be configured on the web, for security reasons or @@ -99,7 +103,13 @@ and gives a brief description of each. After making changes to this file, you need to tell ikiwiki to use it: - % ikiwiki -setup foo.setup + % ikiwiki --setup foo.setup + +Alternatively, you can ask ikiwiki to change settings in the file for you: + + % ikiwiki --changesetup foo.setup --plugin goodstuff + +See [[usage]] for more options. ## Customizing file locations @@ -122,11 +132,11 @@ old location won't work, and the easiest way to deal with this is to delete them and re-checkout from the new repository location. 
% rm -rf foo - % git clone /src/git/foo.git + % git clone /srv/git/foo.git Finally, edit the setup file. Modify the settings for `srcdir`, `destdir`, `url`, `cgiurl`, `cgi_wrapper`, `git_wrapper`, etc to reflect where -you moved things. Remember to run `ikiwiki -setup` after editing the +you moved things. Remember to run `ikiwiki --setup` after editing the setup file. ## Enjoy your new wiki! diff --git a/doc/setup/byhand.mdwn b/doc/setup/byhand.mdwn index 53f8d18bb..75a5648d5 100644 --- a/doc/setup/byhand.mdwn +++ b/doc/setup/byhand.mdwn @@ -104,7 +104,7 @@ is ok, run `ikiwiki --setup ikiwiki.setup`, and you're done! There are lots of other configuration options in ikiwiki.setup that you can uncomment, configure, and enable by re-running `ikiwiki --setup ikiwiki.setup`. Be sure to browse through all the -[[plugins]].. +[[plugins]]. ## Put your wiki in revision control. @@ -124,6 +124,12 @@ revision control. ikiwiki-makerepo svn $SRCDIR $REPOSITORY """]] +[[!toggle id=cvs text="CVS"]] +[[!toggleable id=cvs text=""" + REPOSITORY=~/wikirepo + ikiwiki-makerepo cvs $SRCDIR $REPOSITORY +"""]] + [[!toggle id=git text="Git"]] [[!toggleable id=git text=""" REPOSITORY=~/wiki.git @@ -171,7 +177,7 @@ about using the git repositories. Once your wiki is checked in to the revision control system, you should configure ikiwiki to use revision control. Edit your ikiwiki.setup, set -`rcs` to the the revision control system you chose to use. Be careful, +`rcs` to the revision control system you chose to use. Be careful, you may need to use the 'fullname'. For example, 'hg' doesn't work, you should use mercurial. Be sure to set `svnrepo` to the directory for your repository, if using subversion. Uncomment the configuration for the wrapper diff --git a/doc/setup/byhand/discussion.mdwn b/doc/setup/byhand/discussion.mdwn new file mode 100644 index 000000000..941976789 --- /dev/null +++ b/doc/setup/byhand/discussion.mdwn @@ -0,0 +1,7 @@ +What directory is the 'working copy'? There can be two interpretations: the current dir and the .git dir. + +> It is fairly common terminology amoung all version control systems to use +> "working copy" to refer to a checkout from version control, including +> copies of all the versioned files, and whatever VCS-specific cruft that +> entails. So, a working copy is everything you get when you `git clone` +> a repository. --[[Joey]] diff --git a/doc/setup/discussion.mdwn b/doc/setup/discussion.mdwn index 0501f443a..388d5a49c 100644 --- a/doc/setup/discussion.mdwn +++ b/doc/setup/discussion.mdwn @@ -1,3 +1,7 @@ +I have copied over the ikiwiki.setup file from /usr/share/doc/ikiwiki/ to /etc/ikiwiki/ and run it after editing. My site gets built but when I click on the 'edit' button, firefox and google chrome download the cgi file instead of creating a way to edit it. The permissions on my ikiwiki.cgi script look like this: -rwsr-sr-x 1 root root 13359 2009-10-13 19:21 ikiwiki.cgi. Is there something I should do, i.e. change permissions, so I can get it to run correctly? (jeremiah) + +> Have a look [[here|tips/dot_cgi]]. --[[Jogo]] + I just went through the standard procedure described for setup, copied the blog directory from examples into my source directory, ran ikiwiki, and everything seems to have worked, except that none of the [[!meta ... ]] tags get converted. They simply show up in the html files unformatted, with no exclamation point, and with p tags around them. Any ideas? using ikiwiki version 2.40 on freebsd --mjg @@ -238,3 +242,28 @@ Thank you! 
I'm not a Perl programmer, so what's your opinion: is this behavior a > That is not entirely clear to me from the documentation. It doesn't > say the path has to exist, but doesn't say it cannot either. --[[Joey]] + +I am experiencing the same problem "/etc/ikiwiki/custom: failed to set up the repository with ikiwiki-makerepo +" on Debian squeeze with perl5.10.0. Upgrading to ikiwiki 3.10 fixes it. -- [Albert](http://www.docunext.com/) + +---- + +Just a note, perl 5.10 isn't packaged as part of RHEL or thus CentOS nor EPEL, +so it's not especially trivial to satisfy that requirement for ikiwiki on +those platforms, without backporting it from Fedora or building from source. +However, I have an ikiwiki 3.20100403 running on RHEL-4 supplied 5.8.8 without +(seemingly too much) complaint. How strong is the 5.10 requirement? what +precicely breaks without it? -- [[Jon]] + +> I don't remember what was the specific problem with perl 5.8.8. All I can +> find is some taint checking bugs, which are currently worked around by +> taint checking being disabled. --[[Joey]] + +--- + +Did anyone tried to install ikiwiki under a vhost setup ? +ikiwiki is installed under a debian lenny system. but without write acces to /etc/ikiwiki (obvious) i am coming not far. +Or do i miss something which is probably hidden deeper in the documentation ? + +---- +Perhaps it's worth noting that when installing ikiwiki with apt on Debian stable, you need to use the backports version in order to follow the setup instructions. diff --git a/doc/shortcuts.mdwn b/doc/shortcuts.mdwn index b84d71c3d..cafe3f573 100644 --- a/doc/shortcuts.mdwn +++ b/doc/shortcuts.mdwn @@ -11,13 +11,13 @@ Some examples of using shortcuts include: This page controls what shortcut links the wiki supports. -* [[!shortcut name=google url="http://www.google.com/search?q=%s"]] +* [[!shortcut name=google url="https://encrypted.google.com/search?q=%s"]] * [[!shortcut name=archive url="http://web.archive.org/*/%S"]] * [[!shortcut name=gmap url="http://maps.google.com/maps?q=%s"]] * [[!shortcut name=gmsg url="http://groups.google.com/groups?selm=%s"]] -* [[!shortcut name=wikipedia url="http://en.wikipedia.org/wiki/%s"]] -* [[!shortcut name=wikitravel url="http://wikitravel.org/en/%s"]] -* [[!shortcut name=wiktionary url="http://en.wiktionary.org/wiki/%s"]] +* [[!shortcut name=wikipedia url="https://secure.wikimedia.org/wikipedia/en/wiki/%s"]] +* [[!shortcut name=wikitravel url="https://wikitravel.org/en/%s"]] +* [[!shortcut name=wiktionary url="https://secure.wikimedia.org/wiktionary/en/wiki/%s"]] * [[!shortcut name=debbug url="http://bugs.debian.org/%S" desc="Debian bug #%s"]] * [[!shortcut name=deblist url="http://lists.debian.org/debian-%s" desc="debian-%s@lists.debian.org"]] * [[!shortcut name=debpkg url="http://packages.debian.org/%s"]] @@ -60,6 +60,7 @@ This page controls what shortcut links the wiki supports. * [[!shortcut name=man url="http://linux.die.net/man/%s"]] * [[!shortcut name=ohloh url="http://www.ohloh.net/projects/%s"]] * [[!shortcut name=cpanrt url="https://rt.cpan.org/Ticket/Display.html?id=%s" desc="CPAN RT#%s"]] +* [[!shortcut name=novellbug url="https://bugzilla.novell.com/show_bug.cgi?id=%s" desc="bug %s"]] To add a new shortcut, use the `shortcut` [[ikiwiki/directive]]. 
In the url, "%s" is replaced with the diff --git a/doc/shortcuts/discussion.mdwn b/doc/shortcuts/discussion.mdwn new file mode 100644 index 000000000..aac98457e --- /dev/null +++ b/doc/shortcuts/discussion.mdwn @@ -0,0 +1,13 @@ +# Suggestions for multi-language links + +Sites like Wikipedia have different URLs for each language. The shortcut for Wikipedia `!wikipedia` points to `https://secure.wikimedia.org/wikipedia/en/wiki/%s` which is the English version. + +Do you have a suggestion on how to make that shortcut also be used to point to a different language. + +1. The option to just adapt the shortcut (`s/en/de/`) is quite cumbersome for non English speakers and also has the disadvantage of always updating the shortcut links manually after each modification in the upstream ikiwiki shortcut list to stay in sync. +1. Adding an extra shortcut for every language, e. g. `!wikipediade`, with for example the country TLD in it is an option but would make the shortcut list quite big. +1. Adding a `lang` parameter comes also to my mind, but I do not know how feasible that is. + +Thanks. --[[PaulePanter]] + +> Does anyone have an opinion on the shortcuts for google/wikipedia pointing at the HTTPS services? Introduced by [this edit by Paul Panter](http://git.ikiwiki.info/?p=ikiwiki;a=blobdiff;f=doc/shortcuts.mdwn;h=cafe3f573ef5cfd4811bee9688afa1675302aca9;hp=54dd0fdb1eadfac386b31b2dd2f014349a54184a;hb=704038db298c0f3a8039dcfe8d3a801c26fadf15;hpb=2a1077f8869ca65b49e09d99a76b595d90b28499). Personally, I think they should be separate shortcut keys. Most of my google/WP usage is such that I would prefer it over HTTP (faster, less resource usage at client side). However, I never use the shortcuts feature in ikiwiki anyway... -- [[Jon]] diff --git a/doc/smileys/icon-error.png b/doc/smileys/icon-error.png Binary files differindex 53b1055f6..c39e65c33 100644 --- a/doc/smileys/icon-error.png +++ b/doc/smileys/icon-error.png diff --git a/doc/style.css b/doc/style.css index 4770fc942..fcf39be6a 100644 --- a/doc/style.css +++ b/doc/style.css @@ -4,9 +4,17 @@ * local.css and use it to override or change settings in this one. 
*/ +/* html5 compat */ +article, +header, +footer, +nav { + display: block; +} + .header { margin: 0; - font-size: 22px; + font-size: 140%; font-weight: bold; line-height: 1em; display: block; @@ -14,19 +22,20 @@ .inlineheader .author { margin: 0; - font-size: 18px; + font-size: 112%; font-weight: bold; display: block; } .actions ul { margin: 0; - padding: 6px; + padding: 6px .4em; + height: 1em; list-style-type: none; } .actions li { display: inline; - padding: .2em .4em; + padding: .2em; } .pageheader .actions ul { border-bottom: 1px solid #000; @@ -49,29 +58,29 @@ border-bottom: 1px solid #000; } -div.inlinecontent { +.inlinecontent { margin-top: .4em; } -.pagefooter { - clear: both; -} -.inlinefooter { +.pagefooter, +.inlinefooter, +.comments { clear: both; } -.tags { -} - #pageinfo { margin: 1em 0; border-top: 1px solid #000; } -div.tags { +.tags { margin-top: 1em; } +.inlinepage .tags { + display: inline; +} + .mapparent { text-decoration: none; } @@ -82,6 +91,18 @@ div.tags { text-align: center; } +img.img { + margin: 0.5ex; +} + +.align-left { + float:left; +} + +.align-right { + float:right; +} + #backlinks { margin-top: 1em; } @@ -92,19 +113,28 @@ div.tags { } #editcontent { - width: 100%; + width: 98%; +} + +.editcontentdiv { + width: auto; + overflow: auto; } img { border-style: none; } +pre { + overflow: auto; +} + div.recentchanges { border-style: solid; border-width: 1px; overflow: auto; - clear: both; - width: 100%; + width: auto; + clear: none; background: #eee; color: black !important; } @@ -142,23 +172,26 @@ div.recentchanges { width: 35%; font-size: small; } -.recentchanges .pagelinks { +.recentchanges .pagelinks, +.recentchanges .revert { float: right; margin: 0; width: 60%; } -/* Used for adding a blog page. */ -#blogform { +.blogform, #blogform { padding: 10px 10px; border: 1px solid #aaa; background: #eee; color: black !important; + width: auto; + overflow: auto; } .inlinepage { padding: 10px 10px; border: 1px solid #aaa; + overflow: auto; } .pagedate, @@ -173,90 +206,18 @@ div.recentchanges { color: #C00; } -/* Used for invalid form fields. */ -.fb_invalid { - color: red; - background: white !important; -} - -/* Used for required form fields. */ -.fb_required { - font-weight: bold; -} - -/* Orange feed button. */ -.feedbutton { - background: #ff6600; - color: white !important; - border-left: 1px solid #cc9966; - border-top: 1px solid #ccaa99; - border-right: 1px solid #993300; - border-bottom: 1px solid #331100; - padding: 0px 0.5em 0px 0.5em; - font-family: sans-serif; - font-weight: bold; - font-size: small; - text-decoration: none; - margin-top: 1em; -} -.feedbutton:hover { - color: white !important; - background: #ff9900; -} - -/* Tag cloud. 
*/ -.pagecloud { - float: right; - width: 30%; - text-align: center; - padding: 10px 10px; - border: 1px solid #aaa; - background: #eee; - color: black !important; -} -.smallestPC { font-size: 70%; } -.smallPC { font-size: 85%; } -.normalPC { font-size: 100%; } -.bigPC { font-size: 115%; } -.biggestPC { font-size: 130%; } - -#sidebar { - line-height: 3ex; +.sidebar { width: 20ex; float: right; - margin-left: 40px; - margin-bottom: 40px; - padding: 2ex 2ex; + margin-left: 4px; + margin-bottom: 4px; + margin-top: -1px; + padding: 0ex 2ex; background: white; + border: 1px solid black; color: black !important; } -/* outlines */ -li.L1 { - list-style: upper-roman; -} -li.L2 { - list-style: decimal; -} -li.L3 { - list-style: lower-alpha; -} -li.L4 { - list-style: disc; -} -li.L5 { - list-style: square; -} -li.L6 { - list-style: circle; -} -li.L7 { - list-style: lower-roman; -} -li.L8 { - list-style: upper-alpha; -} - hr.poll { height: 10pt; color: white !important; @@ -270,6 +231,30 @@ div.poll { border: 1px solid #aaa; } +span.color { + padding: 2px; +} + +.comment-header, +.microblog-header { + font-style: italic; + margin-top: .3em; +} +.comment .author, +.microblog .author { + font-weight: bold; +} +.comment-subject { + font-weight: bold; +} +.comment-avatar { + float: right; +} +.comment { + border: 1px solid #aaa; + padding: 3px; +} + div.progress { margin-top: 1ex; margin-bottom: 1ex; @@ -286,33 +271,17 @@ div.progress-done { padding: 1px; } -input#openid_url { - background: url(wikiicons/openidlogin-bg.gif) no-repeat; - background-color: #fff; - background-position: 0 50%; - color: #000; - padding-left: 18px; -} - -input#searchbox { - background: url(wikiicons/search-bg.gif) no-repeat; - background-color: #fff; - background-position: 100% 50%; - color: #000; - padding-right: 16px; -} - -/* Things to hide in printouts. */ +/* things to hide in printouts */ @media print { .actions { display: none; } .tags { display: none; } .feedbutton { display: none; } #searchform { display: none; } - #blogform { display: none; } + .blogform, #blogform { display: none; } #backlinks { display: none; } } -/* Provided for use by template plugin for floating info boxes. */ +/* infobox template */ .infobox { float: right; margin-left: 2ex; @@ -324,7 +293,7 @@ input#searchbox { color: black !important; } -/* Provided for use by template plugin for floating note boxes. */ +/* notebox template */ .notebox { float: right; margin-left: 2ex; @@ -337,7 +306,7 @@ input#searchbox { color: black !important; } -/* Used by the popup template and for backlinks hiding. 
*/ +/* popup template and backlinks hiding */ .popup { border-bottom: 1px dotted #366; color: #366; @@ -358,7 +327,7 @@ input#searchbox { color: black; } -/* Formbuilder styling */ +/* form styling */ fieldset { margin: 1ex 0; border: 1px solid black; @@ -370,40 +339,37 @@ legend { float: left; margin: 2px 0; } -#signin_openid_url_label { - float: left; - margin-right: 1ex; +label.block { + display: block; } -#signin_openid { - padding: 10px 10px; - border: 1px solid #aaa; - background: #eee; - color: black !important; +label.inline { + display: inline; } - -span.color { - padding: 2px; +input#openid_identifier { + background: url(wikiicons/openidlogin-bg.gif) no-repeat; + background-color: #fff; + background-position: 0 50%; + color: #000; + padding-left: 18px; } - -.comment-header, -.microblog-header { - font-style: italic; - margin-top: .3em; +input#searchbox { + background: url(wikiicons/search-bg.gif) no-repeat; + background-color: #fff; + background-position: 100% 50%; + color: #000; + padding-right: 16px; } -.comment .author, -.microblog .author { - font-weight: bold; +/* invalid form fields */ +.fb_invalid { + color: red; + background: white !important; } -.comment-subject { +/* required form fields */ +.fb_required { font-weight: bold; } -.comment { - border: 1px solid #aaa; - padding: 3px; -} - -/* Used by the highlight plugin. */ +/* highlight plugin */ pre.hl { color:#000000; background-color:#ffffff; } .hl.num { color:#2928ff; } .hl.esc { color:#ff00ff; } @@ -419,3 +385,111 @@ pre.hl { color:#000000; background-color:#ffffff; } .hl.kwb { color:#830000; } .hl.kwc { color:#000000; font-weight:bold; } .hl.kwd { color:#010181; } + +/* calendar plugin */ +.month-calendar-day-this-day, +.year-calendar-this-month { + background-color: #eee; +} +.month-calendar-day-head, +.month-calendar-day-nolink, +.month-calendar-day-link, +.month-calendar-day-this-day, +.month-calendar-day-future { + text-align: right; +} +.month-calendar-arrow A:link, +.year-calendar-arrow A:link, +.month-calendar-arrow A:visited, +.year-calendar-arrow A:visited { + text-decoration: none; + font-weight: normal; + font-size: 150%; +} + +/* outlines */ +li.L1 { list-style: upper-roman; } +li.L2 { list-style: decimal; } +li.L3 { list-style: lower-alpha; } +li.L4 { list-style: disc; } +li.L5 { list-style: square; } +li.L6 { list-style: circle; } +li.L7 { list-style: lower-roman; } +li.L8 { list-style: upper-alpha; } + +/* tag cloud */ +.pagecloud { + float: right; + width: 30%; + text-align: center; + padding: 10px 10px; + border: 1px solid #aaa; + background: #eee; + color: black !important; +} +.smallestPC { font-size: 70%; } +.smallPC { font-size: 85%; } +.normalPC { font-size: 100%; } +.bigPC { font-size: 115%; } +.biggestPC { font-size: 130%; } + +/* orange feed button */ +.feedbutton { + background: #ff6600; + color: white !important; + border-left: 1px solid #cc9966; + border-top: 1px solid #ccaa99; + border-right: 1px solid #993300; + border-bottom: 1px solid #331100; + padding: 0px 0.5em 0px 0.5em; + font-family: sans-serif; + font-weight: bold; + font-size: small; + text-decoration: none; + margin-top: 1em; +} +.feedbutton:hover { + color: white !important; + background: #ff9900; +} + +.FlattrButton { + display: none; +} + +/* openid selector */ +#openid_choice { + display: none; +} +#openid_input_area { + clear: both; + padding: 10px; +} +#openid_btns, #openid_btns br { + clear: both; +} +#openid_highlight { + background-color: black; + float: left; +} +.openid_large_btn { + padding: 1em 1.5em; + border: 
1px solid #DDD; + margin: 3px; + float: left; +} +.openid_small_btn { + padding: 4px 4px; + border: 1px solid #DDD; + margin: 3px; + float: left; +} +a.openid_large_btn:focus { + outline: none; +} +a.openid_large_btn:focus { + outline-style: none; +} +.openid_selected { + border: 4px solid #DDD; +} diff --git a/doc/templates.mdwn b/doc/templates.mdwn index eff0e15e9..d189fa073 100644 --- a/doc/templates.mdwn +++ b/doc/templates.mdwn @@ -1,87 +1,89 @@ -[[!meta robots="noindex, follow"]] -[[!if test="enabled(template)" -then="This wiki has templates **enabled**." -else="This wiki has templates **disabled**." -]] +[[Ikiwiki]] uses many templates for many purposes. By editing its templates, +you can fully customise its appearance, and avoid duplicate content. -Templates are files that can be filled out and inserted into pages in the -wiki. +Ikiwiki uses the HTML::Template module as its template engine. This +supports things like conditionals and loops in templates and is pretty +easy to learn. All you really need to know to modify templates is this: -[[!if test="enabled(template) and enabled(inline)" then=""" +* To insert the value of a template variable, use `<TMPL_VAR variable>`. +* To make a block of text conditional on a variable being set use + `<TMPL_IF variable>text</TMPL_IF>`. +* To use one block of text if a variable is set and a second if it's not, + use `<TMPL_IF variable>text<TMPL_ELSE>other text</TMPL_IF>` -These templates are available for inclusion onto other pages in this -wiki: +[[!if test="enabled(template) or enabled(edittemplate)" then=""" +## template pages -[[!inline pages="templates/* and !*/discussion" feeds=no archive=yes -sort=title template=titlepage]] +Template pages are regular wiki pages that are used as templates for other +pages. """]] -## Using a template - -Using a template works like this: - - \[[!template id=note text="""Here is the text to insert into my note."""]] - -This fills out the [[note]] template, filling in the `text` field with -the specified value, and inserts the result into the page. - -Generally, a value can include any markup that would be allowed in the wiki -page outside the template. Triple-quoting the value even allows quotes to -be included in it. Combined with multi-line quoted values, this allows for -large chunks of marked up text to be embedded into a template: - - \[[!template id=foo name="Sally" color="green" age=8 notes=""" - * \[[Charley]]'s sister. - * "I want to be an astronaut when I grow up." - * Really 8 and a half. - """]] - -## Creating a template - -To create a template, simply add a template directive to a page, and the -page will provide a link that can be used to create the template. The template -is a regular wiki page, located in the `templates/` subdirectory inside -the source directory of the wiki. - -The template uses the syntax used by the [[!cpan HTML::Template]] perl -module, which allows for some fairly complex things to be done. Consult its -documentation for the full syntax, but all you really need to know are a -few things: +[[!if test="enabled(template)" then=""" +The [[!iki ikiwiki/directive/template desc="template directive"]] allows +template pages to be filled out and inserted into other pages in the wiki. +"""]] -* Each parameter you pass to the template directive will generate a - template variable. There are also some pre-defined variables like PAGE - and BASENAME. -* To insert the value of a variable, use `<TMPL_VAR variable>`. Wiki markup - in the value will first be converted to html. 
-* To insert the raw value of a variable, with wiki markup not yet converted - to html, use `<TMPL_VAR raw_variable>`. -* To make a block of text conditional on a variable being set use - `<TMPL_IF NAME="variable">text</TMPL_IF>`. -* To use one block of text if a variable is set and a second if it's not, - use `<TMPL_IF NAME="variable">text<TMPL_ELSE>other text</TMPL_IF>` +[[!if test="enabled(edittemplate)" then=""" +The [[!iki ikiwiki/directive/edittemplate desc="edittemplate directive"]] can +be used to make new pages default to containing text from a template +page, which can be filled out as the page is edited. +"""]] -Here's a sample template: +[[!if test="(enabled(template) or enabled(edittemplate)) +and enabled(inline)" then=""" +These template pages are currently available: - <span class="infobox"> - Name: \[[<TMPL_VAR raw_name>]]<br /> - Age: <TMPL_VAR age><br /> - <TMPL_IF NAME="color"> - Favorite color: <TMPL_VAR color><br /> - <TMPL_ELSE> - No favorite color.<br /> - </TMPL_IF> - <TMPL_IF NAME="notes"> - <hr /> - <TMPL_VAR notes> - </TMPL_IF> - </span> +[[!inline pages="templates/* and !*.tmpl and !templates/*/* and !*/discussion" +feeds=no archive=yes sort=title template=titlepage +rootpage=templates postformtext="Add a new template page named:"]] +"""]] -The filled out template will be formatted the same as the rest of the page -that contains it, so you can include WikiLinks and all other forms of wiki -markup in the template. Note though that such WikiLinks will not show up as -backlinks to the page that uses the template. +## template files + +Template files are unlike template pages in that they have the extension +`.tmpl`. Template files are used extensively by Ikiwiki to generate html. +They can contain html that would not normally be allowed on a wiki page. + +Template files are located in `/usr/share/ikiwiki/templates` by default; +the `templatedir` setting can be used to make another directory be +searched first. Customised template files can also be placed inside the +"templates/" directory in your wiki's source -- files placed there override +ones in the `templatedir`. + +Here is a full list of the template files used: + +* `page.tmpl` - Used for displaying all regular wiki pages. This is the + key template to customise to change the look and feel of Ikiwiki. + [[!if test="enabled(pagetemplate)" then=""" + (The [[!iki ikiwiki/directive/pagetemplate desc="pagetemplate directive"]] + can be used to make a page use a different template than `page.tmpl`.)"""]] +* `rsspage.tmpl` - Used for generating rss feeds for blogs. +* `rssitem.tmpl` - Used for generating individual items on rss feeds. +* `atompage.tmpl` - Used for generating atom feeds for blogs. +* `atomitem.tmpl` - Used for generating individual items on atom feeds. +* `inlinepage.tmpl` - Used for displaying a post in a blog. +* `archivepage.tmpl` - Used for listing a page in a blog archive page. +* `titlepage.tmpl` - Used for listing a page by title in a blog archive page. +* `microblog.tmpl` - Used for showing a microblogging post inline. +* `blogpost.tmpl` - Used for a form to add a post to a blog (and rss/atom links) +* `feedlink.tmpl` - Used to add rss/atom links if `blogpost.tmpl` is not used. +* `aggregatepost.tmpl` - Used by the aggregate plugin to create + a page for a post. +* `searchform.tmpl`, `googleform.tmpl` - Used by the search plugin + and google plugin to add search forms to wiki pages. +* `searchquery.tmpl` - This is a Omega template, used by the + search plugin. 
+* `comment.tmpl` - Used by the comments plugin to display a comment. +* `change.tmpl` - Used to create a page describing a change made to the wiki. +* `recentchanges.tmpl` - Used for listing a change on the RecentChanges page. +* `autoindex.tmpl` - Filled in by the autoindex plugin to make index pages. +* `autotag.tmpl` - Filled in by the tag plugin to make tag pages. +* `calendarmonth.tmpl`, `calendaryear.tmpl` - Used by ikiwiki-calendar to + make calendar archive pages. +* `editpage.tmpl`, `editconflict.tmpl`, `editcreationconflict.tmpl`, + `editfailedsave.tmpl`, `editpagegone.tmpl`, `pocreatepage.tmpl`, + `editcomment.tmpl` `commentmoderation.tmpl`, `renamesummary.tmpl`, + `passwordmail.tmpl`, `openid-selector.tmpl`, `revert.tmpl` - Parts of ikiwiki's user + interface; do not normally need to be customised. -Note the use of "raw_name" inside the [[ikiwiki/WikiLink]] generator. This -ensures that if the name contains something that might be mistaken for wiki -markup, it's not converted to html before being processed as a -[[ikiwiki/WikiLink]]. +[[!meta robots="noindex, follow"]] diff --git a/doc/templates/discussion.mdwn b/doc/templates/discussion.mdwn index 220d36455..c7115e4d6 100644 --- a/doc/templates/discussion.mdwn +++ b/doc/templates/discussion.mdwn @@ -6,3 +6,22 @@ note and popups are templates? But they're not in the templates directory, and i > your personal wiki sources. The note and popup template pages are > installed there, typically in `/usr/share/ikiwiki/basewiki/templates/` > --[[Joey]] + +> > And how am I able to use e.g. links? It's not listed in `/usr/share/ikiwiki/basewiki/templates`. +> > I intend do (mis)use links for a horizontal navigation. Or may I be better off altering page.tmpl? +> > --z3ttacht + +Is there a list of the TMPL_VAR-Variables that are defined by ikiwiki? + +What I'm trying to achieve is to print the URL of every page on the page itself and therefore I would need the corresponding value in the Template. + +Am I missing something? --[[jwalzer]] + +> If there isn't a suitable variable (I don't think there is a list at +> the moment), a [[plugin|plugins/write]] to add one would be about 10 +> lines of perl - you'd just need to define a `pagetemplate` hook. --[[smcv]] + +Is there a list of all the available variables somewhere, or do I just grep the source for TMPL_VAR? And is there a way to refer to a variable inside of a wiki page or does it have to be done from a template? Thanks. -- [[AdamShand]] + +I pulled a list of variables and posted it, its in the history for [[templates]] under my name. [[justint]] + diff --git a/doc/templates/gitbranch.mdwn b/doc/templates/gitbranch.mdwn index fcce925d9..4fdf937ff 100644 --- a/doc/templates/gitbranch.mdwn +++ b/doc/templates/gitbranch.mdwn @@ -1,9 +1,9 @@ <span class="infobox"> -Available in a [[!taglink /git]] repository.<br /> +Available in a [[!taglink /git]] repository [[!taglink branch|/branches]].<br /> Branch: <TMPL_VAR branch><br /> Author: <TMPL_VAR author><br /> </span> -<TMPL_UNLESS NAME="branch"> +<TMPL_UNLESS branch> This template is used to create an infobox for a git branch. It uses these parameters: @@ -13,6 +13,4 @@ these parameters: (e.g. github/master)</li> <li>author - the author of the branch</li> </ul> - -It also automatically tags the branch with `/git`. 
</TMPL_UNLESS> diff --git a/doc/templates/links.mdwn b/doc/templates/links.mdwn index 2b18bceb2..b7c3028cd 100644 --- a/doc/templates/links.mdwn +++ b/doc/templates/links.mdwn @@ -8,5 +8,9 @@ <li>[[Users|IkiWikiUsers]]</li> <li>[[SiteMap]]</li> <li>[[Contact]]</li> +<li>[[TipJar]]</li> </ul> +<a href="http://flattr.com/thing/39811/ikiwiki"> +<img src="http://api.flattr.com/button/button-compact-static-100x17.png" +alt="Flattr this" title="Flattr this" /></a> </div> diff --git a/doc/templates/note.mdwn b/doc/templates/note.mdwn index 4cc323c0e..9ef5ad942 100644 --- a/doc/templates/note.mdwn +++ b/doc/templates/note.mdwn @@ -1,7 +1,7 @@ <div class="notebox"> <TMPL_VAR text> </div> -<TMPL_UNLESS NAME="text"> +<TMPL_UNLESS text> Use this template to insert a note into a page. The note will be styled to float to the right of other text on the page. This template has one parameter: diff --git a/doc/templates/plugin.mdwn b/doc/templates/plugin.mdwn index c1d1974d6..322c49445 100644 --- a/doc/templates/plugin.mdwn +++ b/doc/templates/plugin.mdwn @@ -8,7 +8,7 @@ Currently enabled: [[!if test="enabled(<TMPL_VAR name>)" then="yes" else="no"]]< </span> [[!if test="sourcepage(plugins/contrib/*)" then="""[[!meta title="<TMPL_VAR name> (third party plugin)"]]"""]] <TMPL_IF core>[[!tag plugins/type/core]]</TMPL_IF> -<TMPL_UNLESS NAME="name"> +<TMPL_UNLESS name> This template is used to create an infobox for an ikiwiki plugin. It uses these parameters: <ul> diff --git a/doc/templates/popup.mdwn b/doc/templates/popup.mdwn index b355daa2e..92455eb21 100644 --- a/doc/templates/popup.mdwn +++ b/doc/templates/popup.mdwn @@ -1,4 +1,4 @@ -<TMPL_UNLESS NAME="mouseover"> +<TMPL_UNLESS mouseover> Use this template to create a popup window that is displayed when the mouse is over part of the page. This template has two parameters: <ul> diff --git a/doc/themes.mdwn b/doc/themes.mdwn new file mode 100644 index 000000000..57f899677 --- /dev/null +++ b/doc/themes.mdwn @@ -0,0 +1,27 @@ +A theme provides a style.css file, and any associated images to give +ikiwiki a nice look and feel. The local.css [[CSS]] file is left +free for you to further customize. + +Ikiwiki now comes with several themes contributed by users. +You can enable the [[theme_plugin|plugins/theme]] to use any of these: + +[[!img actiontabs_small.png align=left]] The **actiontabs** theme, contributed by +[[svend]]. This style sheet displays the action list +(Edit, RecentChanges, etc.) as tabs. + +<br clear="both" /> + +[[!img blueview_small.png align=left]] The **blueview** theme, contributed by +[[BerndZeimetz]], featuring a tiling panoramic photo he took. + +<br clear="both" /> + +[[!img goldtype_small.png align=left]] The **goldtype** theme, based on +blueview and featuring the photography of Lars Wirzenius. + +<br clear="both" /> + +[[!img none_small.png align=left]] For completeness, ikiwiki's default +anti-theme. 
+ +<br clear="both" /> diff --git a/doc/themes/actiontabs_small.png b/doc/themes/actiontabs_small.png Binary files differnew file mode 100644 index 000000000..4b05ad3dc --- /dev/null +++ b/doc/themes/actiontabs_small.png diff --git a/doc/themes/blueview_small.png b/doc/themes/blueview_small.png Binary files differnew file mode 100644 index 000000000..74972c4c3 --- /dev/null +++ b/doc/themes/blueview_small.png diff --git a/doc/themes/discussion.mdwn b/doc/themes/discussion.mdwn new file mode 100644 index 000000000..87333b535 --- /dev/null +++ b/doc/themes/discussion.mdwn @@ -0,0 +1,6 @@ +I would like to contribute a theme I created and posted on github: + +[[https://github.com/AntPortal/ikiwiked]] + +For an example of the theme in action, see: [[https://antportal.com/wiki/]] + diff --git a/doc/themes/goldtype_small.png b/doc/themes/goldtype_small.png Binary files differnew file mode 100644 index 000000000..a011bb200 --- /dev/null +++ b/doc/themes/goldtype_small.png diff --git a/doc/themes/none_small.png b/doc/themes/none_small.png Binary files differnew file mode 100644 index 000000000..8272ae606 --- /dev/null +++ b/doc/themes/none_small.png diff --git a/doc/tipjar.mdwn b/doc/tipjar.mdwn index 787df9bf7..ae612e129 100644 --- a/doc/tipjar.mdwn +++ b/doc/tipjar.mdwn @@ -5,6 +5,9 @@ choose. If you'd like to fund development of a specific feature, see the <a href="https://www.paypal.com/cgi-bin/webscr?cmd=_xclick&business=joey%40kitenet%2enet&item_name=ikiwiki&no_shipping=1&cn=Comments%3f&tax=0¤cy_code=USD&lc=US&bn=PP%2dDonationsBF&charset=UTF%2d8"><img src="https://www.paypal.com/en_US/i/btn/x-click-but04.gif" alt="donate with PayPal" /></a> +<script type="text/javascript">var flattr_url = 'http://ikiwiki.info';</script> +<script src="http://api.flattr.com/button/load.js" type="text/javascript"></script> + Thanks to the following people for their kind contributions: * James Westby @@ -13,6 +16,9 @@ Thanks to the following people for their kind contributions: * Martin Krafft * Paweł Tęcza * Mick Pollard +* Nico Schottelius +* Jon Dowland +* Amitai Schlair (Note that this page is locked to prevent anyone from tampering with the PayPal button. If you prefer your donation *not* be listed here, let [[Joey]] know.) diff --git a/doc/tips/Importing_posts_from_Wordpress.mdwn b/doc/tips/Importing_posts_from_Wordpress.mdwn index 59330caa4..1ea82b862 100644 --- a/doc/tips/Importing_posts_from_Wordpress.mdwn +++ b/doc/tips/Importing_posts_from_Wordpress.mdwn @@ -1,9 +1,13 @@ Use case: You want to move away from Wordpress to Ikiwiki as your blogging/website platform, but you want to retain your old posts. -[This](http://git.chris-lamb.co.uk/?p=ikiwiki-wordpress-import.git) is a simple tool that generates [git-fast-import](http://www.kernel.org/pub/software/scm/git/docs/git-fast-import.html)-compatible data from a WordPress export XML file. It retains creation time of each post, so you can use Ikiwiki's <tt>--getctime</tt> to get the preserve creation times on checkout. +[This](http://git.chris-lamb.co.uk/?p=ikiwiki-wordpress-import.git) is a simple tool that generates [git-fast-import](http://www.kernel.org/pub/software/scm/git/docs/git-fast-import.html)-compatible data from a WordPress export XML file. WordPress categories are mapped onto Ikiwiki tags. The ability to import comments is planned. +The script uses the [BeautifulSoup][] module. + +[BeautifulSoup]: http://www.crummy.com/software/BeautifulSoup/ + ----- I include a modified version of this script. 
This version includes the ability to write \[[!tag foo]] directives, which the original intended, but didn't actually do. @@ -11,3 +15,88 @@ I include a modified version of this script. This version includes the ability t -- [[users/simonraven]] [[ikiwiki-wordpress-import]] + +----- + +Perhaps slightly insane, but here's an XSLT style sheet that handles my pages. It's basic, but sufficient to get started. +Note that I had to break up the ikiwiki meta strings to post this. + +-- JasonRiedy + + <?xml version="1.0" encoding="UTF-8"?> + <xsl:stylesheet version="2.0" + xmlns:xsl="http://www.w3.org/1999/XSL/Transform" + xmlns:content="http://purl.org/rss/1.0/modules/content/" + xmlns:wp="http://wordpress.org/export/1.0/"> + + <xsl:output method="text"/> + <xsl:output method="text" name="txt"/> + + <xsl:variable name='newline'><xsl:text> + </xsl:text></xsl:variable> + + <xsl:template match="channel"> + <xsl:apply-templates select="item[wp:post_type = 'post']"/> + </xsl:template> + + <xsl:template match="item"> + <xsl:variable name="idnum" select="format-number(wp:post_id,'0000')" /> + <xsl:variable name="basename" + select="concat('wp-posts/post-',$idnum)" /> + <xsl:variable name="filename" + select="concat($basename, '.html')" /> + <xsl:text>Creating </xsl:text> + <xsl:value-of select="concat($filename, $newline)" /> + <xsl:result-document href="{$filename}" format="txt"> + <xsl:text>[[</xsl:text><xsl:text>meta title="</xsl:text> + <xsl:value-of select="replace(title, '"', '&ldquo;')"/> + <xsl:text>"]]</xsl:text><xsl:value-of select="$newline"/> + <xsl:text>[[</xsl:text><xsl:text>meta date="</xsl:text> + <xsl:value-of select="pubDate"/> + <xsl:text>"]]</xsl:text><xsl:value-of select="$newline"/> + <xsl:text>[[</xsl:text><xsl:text>meta updated="</xsl:text> + <xsl:value-of select="pubDate"/> + <xsl:text>"]]</xsl:text> <xsl:value-of select="$newline"/> + <xsl:value-of select="$newline"/> + <xsl:value-of select="content:encoded"/> + <xsl:text> + + </xsl:text> + <xsl:apply-templates select="category[@domain='tag' and not(@nicename)]"> + <xsl:sort select="name()"/> + </xsl:apply-templates> + </xsl:result-document> + <xsl:apply-templates select="wp:comment"> + <xsl:sort select="date"/> + <xsl:with-param name="basename">$basename</xsl:with-param> + </xsl:apply-templates> + </xsl:template> + + <xsl:template match="wp:comment"> + <xsl:param name="basename"/> + <xsl:variable name="cnum" select="format-number(wp:comment_id, '000')" /> + <xsl:variable name="filename" select="concat($basename, '/comment_', $cnum, '._comment')"/> + <xsl:variable name="nickname" select="concat(' nickname="', wp:comment_author, '"')" /> + <xsl:variable name="username" select="concat(' username="', wp:comment_author_url, '"')" /> + <xsl:variable name="ip" select="concat(' ip="', wp:comment_author_IP, '"')" /> + <xsl:variable name="date" select="concat(' date="', wp:comment_date_gmt, '"')" /> + <xsl:result-document href="{$filename}" format="txt"> + <xsl:text>[[</xsl:text><xsl:text>comment format=html</xsl:text><xsl:value-of select="$newline"/> + <xsl:value-of select="$nickname"/> + <xsl:value-of select="$username"/> + <xsl:value-of select="$ip"/> + <xsl:value-of select="$date"/> + <xsl:text>subject=""</xsl:text><xsl:value-of select="$newline"/> + <xsl:text>content="""</xsl:text><xsl:value-of select="$newline"/> + <xsl:value-of select="wp:comment_content"/> + <xsl:value-of select="$newline"/> + <xsl:text>"""]]</xsl:text><xsl:value-of select="$newline"/> + </xsl:result-document> + </xsl:template> + + <xsl:template match="category"> 
+ <xsl:text>[</xsl:text><xsl:text>[</xsl:text><xsl:text>!tag "</xsl:text><xsl:value-of select="."/><xsl:text>"]]</xsl:text> + <xsl:value-of select="$newline"/> + </xsl:template> + + </xsl:stylesheet> diff --git a/doc/tips/add_chatterbox_to_blog.mdwn b/doc/tips/add_chatterbox_to_blog.mdwn index aa35b9331..e07e36b07 100644 --- a/doc/tips/add_chatterbox_to_blog.mdwn +++ b/doc/tips/add_chatterbox_to_blog.mdwn @@ -18,4 +18,7 @@ from there, like I have on [my blog](http://kitenet.net/~joey/blog/) show=5 feeds=no]] """]] +* To filter out `@-replies`, append "and !*@*" to the [[ikiwiki/PageSpec]]. + The same technique can be used for other filtering. + Note: Works best with ikiwiki 3.10 or better. diff --git a/doc/tips/comments_feed.mdwn b/doc/tips/comments_feed.mdwn index 6f8137256..3d6a8c449 100644 --- a/doc/tips/comments_feed.mdwn +++ b/doc/tips/comments_feed.mdwn @@ -3,8 +3,15 @@ blog can have comments added to them. Pages with comments even have special feeds that can be used to subscribe to those comments. But you'd like to add a feed that contains all the comments posted to any page. Here's how: - \[[!inline pages="internal(*/comment_*)" template=comment]] + \[[!inline pages="comment(*)" template=comment]] The special [[ikiwiki/PageSpec]] matches all comments. The -[[template|wikitemplates]] causes the comments to be displayed formatted +[[template|templates]] causes the comments to be displayed formatted nicely. + +--- + +It's also possible to make a feed of comments that are held pending +moderation. + + \[[!inline pages="comment_pending(*)" template=comment]] diff --git a/doc/tips/convert_mediawiki_to_ikiwiki.mdwn b/doc/tips/convert_mediawiki_to_ikiwiki.mdwn index f03703b46..38de01109 100644 --- a/doc/tips/convert_mediawiki_to_ikiwiki.mdwn +++ b/doc/tips/convert_mediawiki_to_ikiwiki.mdwn @@ -1,4 +1,159 @@ -[[sabr]] explains how to [import MediaWiki content into -git](http://u32.net/Mediawiki_Conversion/index.html?updated), including -full edit hostory. The [[plugins/contrib/mediawiki]] plugin can then be -used by ikiwiki to build the wiki. +[[!toc levels=2]] + +Mediawiki is a dynamically-generated wiki which stores its data in a +relational database. Pages are marked up using MediaWiki's own markup syntax. It is +possible to import the contents of a Mediawiki site into an ikiwiki, +converting some of the Mediawiki conventions into Ikiwiki ones. + +The following instructions describe ways of obtaining the current version of +the wiki. We do not yet cover importing the history of edits. + +Another set of instructions and conversion tools (which imports the full history) +can be found at <http://github.com/mithro/media2iki> + +## Step 1: Getting a list of pages + +The first bit of information you require is a list of pages in the Mediawiki. +There are several different ways of obtaining these. + +### Parsing the output of `Special:Allpages` + +Mediawikis have a special page called `Special:Allpages` which lists all the +pages for a given namespace on the wiki. + +If you fetch the output of this page to a local file with something like + + wget -q -O tmpfile 'http://your-mediawiki/wiki/Special:Allpages' + +you can extract the list of page names using the following python script. 
Note +that this script is sensitive to the specific markup used on the page, so if +you have tweaked your mediawiki theme a lot from the original, you will need +to adjust this script too: + + import sys + from xml.dom.minidom import parse, parseString + + dom = parse(sys.argv[1]) + tables = dom.getElementsByTagName("table") + pagetable = tables[-1] + anchors = pagetable.getElementsByTagName("a") + for a in anchors: + print a.firstChild.toxml().\ + replace('&amp;','&').\ + replace('&lt;','<').\ + replace('&gt;','>') + +Also, if you have pages with titles that need to be encoded to be represented +in HTML, you may need to add further processing to the last line. + +Note that by default, `Special:Allpages` will only list pages in the main +namespace. You need to add a `&namespace=XX` argument to get pages in a +different namespace. (See below for the default list of namespaces.) + +Note that the page names obtained this way will not include any namespace +specific prefix: e.g. `Category:` will be stripped off. + +### Querying the database + +If you have access to the relational database in which your mediawiki data is +stored, it is possible to derive a list of page names from this. With mediawiki's +MySQL backend, the page table is, appropriately enough, called `page`: + + SELECT page_namespace, page_title FROM page; + +As with the previous method, you will need to do some filtering based on the +namespace. + +### namespaces + +The list of default namespaces in mediawiki is available from <http://www.mediawiki.org/wiki/Manual:Namespace#Built-in_namespaces>. Here are reproduced the ones you are most likely to encounter if you are running a small mediawiki install for your own purposes: + +[[!table data=""" +Index | Name | Example +0 | Main | Foo +1 | Talk | Talk:Foo +2 | User | User:Jon +3 | User talk | User_talk:Jon +6 | File | File:Barack_Obama_signature.svg +10 | Template | Template:Prettytable +14 | Category | Category:Pages_needing_review +"""]] + +## Step 2: fetching the page data + +Once you have a list of page names, you can fetch the data for each page. + +### Method 1: via HTTP and `action=raw` + +You need to create two derived strings from the page titles: the +destination path for the page and the source URL. Assuming `$pagename` +contains a pagename obtained above, and `$wiki` contains the URL to your +mediawiki's `index.php` file: + + src=`echo "$pagename" | tr ' ' _ | sed 's,&,&amp;,g'` + dest=`echo "$pagename" | tr ' ' _ | sed 's,&,__38__,g'` + + mkdir -p `dirname "$dest"` + wget -q "$wiki?title=$src&action=raw" -O "$dest" + +You may need to add more conversions here depending on the precise page titles +used in your wiki. + +If you are trying to fetch pages from a different namespace than the default, +you will need to prefix the page title with the relevant prefix, e.g. +`Category:` for category pages. You probably don't want to prefix it to the +output page, but you may want to vary the destination path (i.e. insert an +extra directory component corresponding to your ikiwiki's `tagbase`). + +### Method 2: via HTTP and `Special:Export` + +Mediawiki also has a special page `Special:Export` which can be used to obtain +the source of the page and other metadata such as the last contributor, or the +full history, etc. + +You need to send a `POST` request to the `Special:Export` page. See the source +of the page fetched via `GET` to determine the correct arguments. + +You will then need to write an XML parser to extract the data you need from +the result. 
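As a very rough sketch, using the same shell variables as in Method 1, the request might look something like the following. The parameter names (`pages`, `curonly`) are assumptions taken from the export form and may differ between MediaWiki versions, so do check the form source as described above:

    # assumed form fields: pages (title to export), curonly (current revision only)
    wget -q --post-data "pages=$src&curonly=1" \
        "$wiki?title=Special:Export" -O "$dest.xml"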
+ +### Method 3: via the database + +It is possible to extract the page data from the database with some +well-crafted queries. + +## Step 3: format conversion + +The next step is to convert Mediawiki conventions into Ikiwiki ones. + +### categories + +Mediawiki uses a special page name prefix to define "Categories", which +otherwise behave like ikiwiki tags. You can convert every Mediawiki category +into an ikiwiki tag name using a script such as + + import sys, re + pattern = r'\[\[Category:([^\]]+)\]\]' + + def manglecat(mo): + return '\[[!tag %s]]' % mo.group(1).strip().replace(' ','_') + + for line in sys.stdin.readlines(): + res = re.match(pattern, line) + if res: + sys.stdout.write(re.sub(pattern, manglecat, line)) + else: sys.stdout.write(line) + +## Step 4: Mediawiki plugin + +The [[plugins/contrib/mediawiki]] plugin can be used by ikiwiki to interpret +most of the Mediawiki syntax. + +## External links + +[[sabr]] used to explain how to [import MediaWiki content into +git](http://u32.net/Mediawiki_Conversion/index.html?updated), including full +edit history, but as of 2009/10/16 that site is not available. A copy of the +information found on this website is stored at <http://github.com/mithro/media2iki> + + diff --git a/doc/tips/convert_mediawiki_to_ikiwiki/discussion.mdwn b/doc/tips/convert_mediawiki_to_ikiwiki/discussion.mdwn index e2eb56d47..4a7163eae 100644 --- a/doc/tips/convert_mediawiki_to_ikiwiki/discussion.mdwn +++ b/doc/tips/convert_mediawiki_to_ikiwiki/discussion.mdwn @@ -1,3 +1,15 @@ +20100428 - I just wrote a simple ruby script which will connect to a mysql server and then recreate the pages and their revision histories with Grit. It also does one simple conversion of equals titles to pounds. Enjoy! + +<http://github.com/docunext/mediawiki2gitikiwiki> + +-- [[users/Albert]] + +---- + +I wrote a script that will download all the latest revisions of a mediawiki site. In short, it does a good part of the stuff required for the migration: it downloads the goods (ie. the latest version of every page, automatically) and commits the resulting structure. There's still a good few pieces missing for an actual complete conversion to ikiwiki, but it's a pretty good start. It only talks with mediawiki through HTTP, so no special access is necessary. The downside of that is that it will not attempt to download every revision for performance reasons. The code is here: http://anarcat.ath.cx/software/mediawikigitdump.git/ See header of the file for more details and todos. -- [[users/Anarcat]] 2010-10-15 + +---- + The u32 page is excellent, but I wonder if documenting the procedure here would be worthwhile. Who knows, the remote site might disappear. But also there are some variations on the approach that might be useful: @@ -13,9 +25,31 @@ Also, some detail on converting mediawiki transclusion to ikiwiki inlines... -- [[users/Jon]] +---- + > "Who knows, the remote site might disappear.". Right now, it appears to > have done just that. -- [[users/Jon]] +I have manage to recover most of the site using the Internet Archive. What +I was unable to retrieve I have rewritten. You can find a copy of the code +at <http://github.com/mithro/media2iki> + +> This is excellent news. However, I'm still keen on there being a +> comprehensive and up-to-date set of instructions on *this* site. 
I wouldn't +> suggest importing that material into ikiwiki like-for-like (not least for +> [[licensing|freesoftware]] reasons), but it's excellent to have it available +> for reference, especially since it (currently) is the only set of +> instructions that gives you the whole history. +> +> The `mediawiki.pm` that was at u32.net is licensed GPL-2. I'd like to see it +> cleaned up and added to IkiWiki proper (although I haven't requested this +> yet, I suspect the way it (ab)uses linkify would disqualify it at present). +> +> I've imported Scott's initial `mediawiki.pm` into a repository at +> <http://github.com/jmtd/mediawiki.pm> as a start. +> -- [[Jon]] + +---- The iki-fast-load ruby script from the u32 page is given below: @@ -612,3 +646,24 @@ Mediawiki.pm - A plugin which supports mediawiki format. } 1 + +---- + +Hello. Got ikiwiki running and I'm planning to convert my personal +Mediawiki wiki to ikiwiki so I can take offline copies around. If anyone +has an old copy of the instructions, or any advice on where to start I'd be +glad to hear it. Otherwise I'm just going to chronicle my journey on the +page.--[[users/Chadius]] + +> Today I saw that someone is working to import wikipedia into git. +> <http://www.gossamer-threads.com/lists/wiki/foundation/181163> +> Since wikipedia uses mediawiki, perhaps his importer will work +> on mediawiki in general. It seems to produce output that could be +> used by the [[plugins/contrib/mediawiki]] plugin, if the filenames +> were fixed to use the right extension. --[[Joey]] + +>> Here's another I found while browsing around starting from the link you gave Joey<br /> +>> <http://github.com/scy/levitation><br /> +>> As I don't run mediawiki anymore, but I still have my xz/gzip-compressed XML dumps, +>> it's certainly easier for me to do it this way; also a file or a set of files is easier to lug +>> around on some medium than a full mysqld or postgres master and relevant databases. diff --git a/doc/tips/dot_cgi.mdwn b/doc/tips/dot_cgi.mdwn index da55c1f1c..42a0aa7bf 100644 --- a/doc/tips/dot_cgi.mdwn +++ b/doc/tips/dot_cgi.mdwn @@ -26,6 +26,8 @@ configuration changes should work anywhere. Or, if you've put it in a `~/public_html`, edit `/etc/apache2/mods-available/userdir.conf`. + You may also want to install some dependencies to enable CGI in apache2 setup as: `libcgi-formbuilder-perl` and `libcgi-session-perl`. + * You may also want to enable the [[plugins/404]] plugin. To make apache use it, the apache config file will need a further modification to make it use ikiwiki's CGI as the apache 404 handler. diff --git a/doc/tips/dot_cgi/discussion.mdwn b/doc/tips/dot_cgi/discussion.mdwn index 124b9edff..a8854565c 100644 --- a/doc/tips/dot_cgi/discussion.mdwn +++ b/doc/tips/dot_cgi/discussion.mdwn @@ -34,3 +34,13 @@ there), and so I need to choose the more secure solution. --Ivan Z. >> The easiest way though is probably >> to add your ssh key to the special user's `.ssh/authorized_keys` >> and push that way. --[[Joey]] + +## apache2 - run from userdir +Followed instructions but couldn't get it right to run from user dir (running ubuntu jaunty), +Finally got it working once I've sym linked as follow (& restarted apache): +\# ln -s ../mods-available/userdir.load . +\# ln -s ../mods-available/userdir.conf . 
+\# pwd +/etc/apache2/mods-enabled + + diff --git a/doc/tips/follow_wikilinks_from_inside_vim.mdwn b/doc/tips/follow_wikilinks_from_inside_vim.mdwn new file mode 100644 index 000000000..015a4ecee --- /dev/null +++ b/doc/tips/follow_wikilinks_from_inside_vim.mdwn @@ -0,0 +1,47 @@ +The [ikiwiki-nav](http://www.vim.org/scripts/script.php?script_id=2968) plugin +for vim eases the editing of IkiWiki wikis, by letting you "follow" the +wikilinks on your file (page), by loading the file associated with a given +wikilink in vim. The plugin takes care of following the ikiwiki linking rules +to figure out which file a wikilink points to + +The plugin also includes commands (and mappings) to make the cursor jump to the +previous/next wikilink in the current file + +## Jumping to pages + +To open the file associated to a wikilink, place the cursor over its text, and +hit Enter (`<CR>`). This functionality is also available through the +`:IkiJumpToPage` command + +## Moving to next/previous wikilink in current file + +`Ctrl-j` will move the cursor to the next wikilink. `Ctrl-k` will move it to the +previous wikilink. This functionality is also available through the +`:IkiNextWikiLink` command. This command takes one argument, the direction to +move into + + * `:IkiNextWikiLink 0` will look forward for the wikilink + * `:IkiNextWikiLink 1` will look backwards for the wikilink + +## Installation + +Copy the `ikiwiki_nav.vim` file to your `.vim/ftplugin` directory. + +## Current issues: + + * The plugin only works for wikilinks contained in a single text line; + multiline wikilinks are not (yet) seen as such + +## Notes + +The official releases of the plugin are in the +[vim.org script page](http://www.vim.org/scripts/script.php?script_id=2968) + +The latest version of this script can be found in the following location + +<http://git.devnull.li/cgi-bin/gitweb.cgi?p=ikiwiki-nav.git;a=blob;f=ftplugin/ikiwiki_nav.vim;hb=HEAD> + +Any feedback you can provide is appreciated; the contact details can be found +inside the plugin + +[[!tag vim]] diff --git a/doc/tips/github.mdwn b/doc/tips/github.mdwn index 9bdf15751..d745bfcc5 100644 --- a/doc/tips/github.mdwn +++ b/doc/tips/github.mdwn @@ -5,7 +5,7 @@ site. Your laptop is used to generate and publish changes to it. This is possible because github now supports [github pages](http://github.com/blog/272-github-pages). -Note that github limits free accounts to 100 mb of git storage. It's +Note that github limits free accounts to 100 MB of git storage. It's unlikely that a small wiki or blog will outgrow this, but we are keeping two copies of the website in git (source and the compiled site), and all historical versions too. So it could happen. If it does, you can pay github diff --git a/doc/tips/howto_limit_to_admin_users.mdwn b/doc/tips/howto_limit_to_admin_users.mdwn new file mode 100644 index 000000000..4d579327a --- /dev/null +++ b/doc/tips/howto_limit_to_admin_users.mdwn @@ -0,0 +1,9 @@ +Enable [[plugins/lockedit]] in your setup file. 
+ +For example: + + add_plugins => [qw{goodstuff table rawhtml template embed typography sidebar img remove lockedit}], + +And to only allow admin users to edit the page, simply specify a pagespec for everything in the .setup: + + locked_pages => '*', diff --git a/doc/tips/htaccess_file.mdwn b/doc/tips/htaccess_file.mdwn new file mode 100644 index 000000000..6964cf24e --- /dev/null +++ b/doc/tips/htaccess_file.mdwn @@ -0,0 +1,27 @@ +If you try to include a `.htaccess` file in your wiki's source, in order to +configure the web server, you'll find that ikiwiki excludes it from +processing. In fact, ikiwiki excludes any file starting with a dot, as well +as a lot of other files, for good security reasons. + +You can tell ikiwiki not to exclude the .htaccess file by adding this to +your setup file: + + include => '^\.htaccess$', + +Caution! Before you do that, please think for a minute about who can edit +your wiki. Are attachment uploads enabled? Can users commit changes +directly to the version control system? Do you trust everyone who can +make a change to not do Bad Things with the htaccess file? Do you trust +everyone who *might* be able to make a change in the future? Note that a +determined attacker who can write to the htaccess file can probably get a +shell on your web server. + +If any of these questions have given you pause, I suggest you find a +different way to configure the web server. One way is to not put the +`.htaccess` file under ikiwiki's control, and just manually install it +in the destdir. --[[Joey]] + +[Apache's documentation](http://httpd.apache.org/docs/2.2/howto/htaccess.html) +says: +> In general, you should never use .htaccess files unless you don't have +> access to the main server configuration file. diff --git a/doc/tips/html5.mdwn b/doc/tips/html5.mdwn new file mode 100644 index 000000000..cb71c0887 --- /dev/null +++ b/doc/tips/html5.mdwn @@ -0,0 +1,27 @@ +First, if you just want to embed videos using the html5 `<video>` tag, +you can do that without switching anything else to html5. +However, if you want to fully enter the brave new world of html5, read on.. + +Currently, ikiwiki does not use html5 by default. There is a `html5` +setting that can be turned on, in your setup file. Rebuild with it set, and +lots of fancy new semantic tags will be used all over the place. + +You may need to adapt your CSS for html5. While all the class and id names +are the same, some of the `div` elements are changed to other things. +Ikiwiki's default CSS will work in both modes. + +The html5 support is still experimental, and may break in some browsers. +No care is taken to add backwards compatibility hacks for browsers that +are not html5 aware (like MSIE). If you want to include the javascript with +those hacks, you can edit `page.tmpl` to do so. +[Dive Into HTML5](http://diveintohtml5.org/) is a good reference for +current compatability issues and workarounds with html5. And a remotely-loadable +JS shiv for enabling HTML5 elements in IE is available through [html5shiv at Google Code](http://code.google.com/p/html5shiv/). + +--- + +Known ikiwiki-specific issues: + +* [[plugins/htmltidy]] uses `tidy`, which is not html5 aware, so if you + have that enabled, it will mangle it back to html4. +* [[plugins/toc]] does not understand the html5 outline algorithm. 
diff --git a/doc/tips/ikiwiki_on_Mac_OS_X_Snow_Leopard.mdwn b/doc/tips/ikiwiki_on_Mac_OS_X_Snow_Leopard.mdwn new file mode 100644 index 000000000..bb1db0cbb --- /dev/null +++ b/doc/tips/ikiwiki_on_Mac_OS_X_Snow_Leopard.mdwn @@ -0,0 +1,184 @@ +These are some notes on installing ikiwiki on Mac OS X Snow Leopard. I have a three year old machine with a lot of stuff on it so it took quite a while, YMMV. + +The best part of installing ikiwiki was learning how to use git. I never used source control before but its pretty slick. + + +## installing git: + +cd /opt/ikiwiki/install + +curl http://kernel.org/pub/software/scm/git/git-(latest version).tar.gz -O + +tar xzvf git-(latest version).tar.gz + +cd git-(latest version) + +./configure --prefix=/usr/local + +make prefix=/usr/local all + +sudo make install + + +git config --global user.name "firstname lastname" + +git config --global user.email "email here" + +git config --global color.ui "auto" + + +curl http://www.kernel.org/pub/software/scm/git/git-manpages-1.7.3.1.tar.gz | sudo tar -xzC /usr/local/share/man/ + + +## installing ikiwiki: +I had terrible trouble installing ikiwiki. It turned out I had accidentally installed Perl through ports. Uninstalling that made everything install nicely. +I got an error on msgfmt. Turns out this is a program in gettext. I installed that and it fixed the error. + +cd .. + +git clone git://git.ikiwiki.info/ + +cd git.ikiwiki.info/ + +perl Makefile.PL LIB=/Library/Perl/5.10.0 + +make + +sudo make install + +when you make ikiwiki it gives you a .git folder with the ikiwiki files. Stay out of this folder. You want to learn how to create a clone and make all your changes in the clone. When you push the changes ikiwiki will update. I moved a file in this folder by accident because I named my working file the same and I couldn't get into the setup page. I had apparently messed up my ikiwiki git repository. I did a pull into my clone, deleted the repository and webserver/ cgi folders and ran a new setup. Then I did a git clone and dragged all my old files into the new clone. Did the git dance and did git push. Then the angels sang. + + +## using git from inside a git folder: + +start with git clone, then learn to do the git dance like this. + +git pull + +make your changes to your clone + +git commit -a -m "message here" + +git push + + +When you can't get into the setup page or you get strange behavior after a setup update the Utilities > Console app is your friend. 
+ +## installing gitweb + +cd ../git-1.7.3.1/gitweb + +make GITWEB_PROJECTROOT="/opt/ikiwiki/" GITWEB_CSS="/gitweb.css" GITWEB_LOGO="/git-logo.png" GITWEB_FAVICON="/git-favicon.png" GITWEB_JS="/gitweb.js" + +cp gitweb.cgi /Library/WebServer/CGI-Executables/ + +cp /usr/local/share/gitweb/static/git-favicon.png /Library/WebServer/Documents/ + +cp /usr/local/share/gitweb/static/git-logo.png /Library/WebServer/Documents/ + +cp /usr/local/share/gitweb/static/gitweb.css /Library/WebServer/Documents/ + +cp /usr/local/share/gitweb/static/gitweb.js /Library/WebServer/Documents/ + + +sudo chmod 2755 /Library/WebServer/CGI-Executables/gitweb.cgi + +sudo chmod 2755 /Library/WebServer/Documents/git-favicon.png + +sudo chmod 2755 /Library/WebServer/Documents/git-logo.png + +sudo chmod 2755 /Library/WebServer/Documents/gitweb.css + +sudo chmod 2755 /Library/WebServer/Documents/gitweb.js + + +## installing xapian: + +download xapian and omega + +I needed pcre: sudo ports install pcre + +./configure + +make + +sudo make install + + +## installing omega: + +I had a build error do to libiconv undefined symbols. sudo port deactivate libiconv took care of it. After install I had trouble with ikiwiki so I did a sudo port install libiconv and ikiwiki came back. + +./configure + +make + +sudo make install + + +## installing Search::Xapian from CPAN + +for some reason this wouldn't install using CPAN console so I went to CPAN online and downloaded the source. + +perl Makefile.PL + +make + +make test + +sudo make install + +it installed without issue so I'm baffled why it didn't install from command line. + + + ## setup file + _!/usr/bin/perl + _ Ikiwiki setup automator. + + _ This setup file causes ikiwiki to create a wiki, check it into revision + _ control, generate a setup file for the new wiki, and set everything up. + + _ Just run: ikiwiki -setup /etc/ikiwiki/auto.setup + + _By default, it asks a few questions, and confines itself to the user's home + _directory. You can edit it to change what it asks questions about, or to + _modify the values to use site-specific settings. + require IkiWiki::Setup::Automator; + + our $wikiname="your wiki"; + our $wikiname_short="yourwiki"; + our $rcs="git"; + our $admin="your name"; + use Net::Domain q{hostfqdn}; + our $domain="your.domain"; + + IkiWiki::Setup::Automator->import( + wikiname => $wikiname, + adminuser => [$admin], + rcs => $rcs, + srcdir => "/opt/ikiwiki/$wikiname_short", + destdir => "/Library/WebServer/Documents/$wikiname_short", + repository => "/opt/ikiwiki/$wikiname_short.".($rcs eq "monotone" ? "mtn" : $rcs), + dumpsetup => "/opt/ikiwiki/$wikiname_short.setup", + url => "http://$domain/$wikiname_short", + cgiurl => "http://$domain/cgi-bin/$wikiname_short/ikiwiki.cgi", + cgi_wrapper => "/Library/WebServer/CGI-Executables/$wikiname_short/ikiwiki.cgi", + adminemail => "your\@email.com", + add_plugins => [qw{goodstuff websetup}], + disable_plugins => [qw{}], + libdir => "/opt/ikiwiki/.ikiwiki", + rss => 1, + atom => 1, + syslog => 1, + ) + + +## turning on search plugin: + +I turned on the plugin from the setup page in ikiwiki but it gave an error when I went to search. Error "Error: /usr/lib/cgi-bin/omega/omega failed: No such file or directory". +I did a "find / -name "omega" -print" and found the omega program in "/usr/local/lib/xapian-omega/bin/omega". + +Then I went into the 2wiki.setup file and replaced the bad path, updated and badda-boom badda-bing. 
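For reference, instead of hand-editing the path each time the setup is regenerated, the omega location can also be set in the setup file itself; assuming the search plugin's option is named `omega_cgi` (check your generated setup file), the line would look something like:

    # point the search plugin at the locally built omega
    omega_cgi => "/usr/local/lib/xapian-omega/bin/omega",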
+ + + diff --git a/doc/tips/ikiwiki_on_Mac_OS_X_Snow_Leopard/discussion.mdwn b/doc/tips/ikiwiki_on_Mac_OS_X_Snow_Leopard/discussion.mdwn new file mode 100644 index 000000000..ae3969879 --- /dev/null +++ b/doc/tips/ikiwiki_on_Mac_OS_X_Snow_Leopard/discussion.mdwn @@ -0,0 +1 @@ +If you want do a bunch of manual labor, this is good, but most people probably want to get ikiwiki via a package system. My Mac laptop's ikiwiki is installed from pkgsrc. --[[schmonz]] diff --git a/doc/tips/inside_dot_ikiwiki.mdwn b/doc/tips/inside_dot_ikiwiki.mdwn index b81ffae8d..a74d00f47 100644 --- a/doc/tips/inside_dot_ikiwiki.mdwn +++ b/doc/tips/inside_dot_ikiwiki.mdwn @@ -6,9 +6,10 @@ you need/want to. ## the index -`.ikiwiki/indexdb` contains a cache of information about pages, as well -as all persisitant state about pages. It used to be a (semi) human-readable -text file, but is not anymore. +`.ikiwiki/indexdb` contains a cache of information about pages. +This information can always be recalculated by rebuilding the wiki. +(So the file is safe to delete and need not be backed up.) +It used to be a (semi) human-readable text file, but is not anymore. To dump the contents of the file, enter a perl command like this. diff --git a/doc/tips/inside_dot_ikiwiki/discussion.mdwn b/doc/tips/inside_dot_ikiwiki/discussion.mdwn index 34d5b9252..69df369ec 100644 --- a/doc/tips/inside_dot_ikiwiki/discussion.mdwn +++ b/doc/tips/inside_dot_ikiwiki/discussion.mdwn @@ -6,14 +6,15 @@ My database appears corrupted: No idea how this happened. I've blown it away and recreated it but, for future reference, is there any less violent way to recover from this situation? I miss having the correct created and last edited times. --[[sabr]] > update: fixed ctimes and mtimes using [these instructions](http://u32.net/Mediawiki_Conversion/Git_Import/#Correct%20Creation%20and%20Last%20Edited%20time) --[[sabr]] -> That's overly complex. Just run `ikiwiki -setup your.setup -getctime`. +> That's overly complex. Just run `ikiwiki -setup your.setup -gettime`. > BTW, I'd be interested in examining such a corrupt storable file to try > to see what happened to it. --[[Joey]] ->> --getctime appears to only set the last edited date. It's not supposed to set the creation date, is it? The only place that info is stored is in the git repo. +>> --gettime appears to only set the last edited date. It's not supposed to set the creation date, is it? The only place that info is stored is in the git repo. >>> Pulling the page creation date out of the git history is exactly what ->>> --getctime does. --[[Joey]] +>>> --gettime does. (It used to be called --getctime, and only do that; now +>>> it also pulls out the last modified date). --[[Joey]] >> Alas, I seem to have lost the bad index file to periodic /tmp wiping; I'll send it to you if it happens again. --[[sabr]] diff --git a/doc/tips/integrated_issue_tracking_with_ikiwiki.mdwn b/doc/tips/integrated_issue_tracking_with_ikiwiki.mdwn index ea7835b33..0c871d6c0 100644 --- a/doc/tips/integrated_issue_tracking_with_ikiwiki.mdwn +++ b/doc/tips/integrated_issue_tracking_with_ikiwiki.mdwn @@ -155,7 +155,7 @@ be inlined into a given page. A few examples: * A typical list of all open bugs, with their full text, and a form to post new bugs. - \[[!inline pages="bugs/* and !link(done) and !*/Discussion" actions=yes postform=yes show=0]] + \[[!inline pages="bugs/* and !link(done) and !*/Discussion" actions=yes postform=yes show=0 rootpage="bugs"]] * Index of the 30 most recently fixed bugs. 
diff --git a/doc/tips/laptop_wiki_with_git.mdwn b/doc/tips/laptop_wiki_with_git.mdwn index 9758beb80..cfa565d1a 100644 --- a/doc/tips/laptop_wiki_with_git.mdwn +++ b/doc/tips/laptop_wiki_with_git.mdwn @@ -1,3 +1,5 @@ +[[!toc]] + Using ikiwiki with the [[rcs/git]] backend, some interesting things can be done with creating mirrors (or, really, branches) of a wiki. In this tip, I'll assume your wiki is located on a server, and you want to take a copy with @@ -8,6 +10,8 @@ version on the laptop, perhaps while offline. You can browse and edit the wiki using a local web server. When you're ready, you can manually push the changes to the main wiki on the server. +## simple clone approach + First, set up the wiki on the server, if it isn't already. Nothing special needs to be done here, just follow the regular instructions in [[setup]] for setting up ikiwiki with git. @@ -49,3 +53,19 @@ update the wiki, with a command such as `ikiwiki -setup wiki.setup -refresh`. If you'd like it to automatically update when changes are merged in, you can simply make a symlink `post-merge` hook pointing at the `post-update` hook ikiwiki created. + +## bare mirror approach + +As above, set up a normal ikiwiki on the server, with the usual bare repository. + +Next, `git clone --mirror server:/path/to/bare/repository` + +This will be used as the $REPOSITORY on the laptop. Then you can follow +the instructions in [[setup by hand|/setup/byhand]] as per a normal ikiwiki +installation. This means that you can clone from the local bare repository +as many times as you want (thus being able to have a repository which is +used by the ikiwiki CGI, and another which you can use for updating via +git). + +Use standard git commands, run in the laptop's bare git repository +to handle pulling from and pushing to the server. diff --git a/doc/tips/nearlyfreespeech.mdwn b/doc/tips/nearlyfreespeech.mdwn index 4b3b02eac..a3d1ec678 100644 --- a/doc/tips/nearlyfreespeech.mdwn +++ b/doc/tips/nearlyfreespeech.mdwn @@ -14,7 +14,8 @@ After you [get an account](https://www.nearlyfreespeech.net/about/start.php), create a site using their web interface. Mine is named `ikiwiki-test` and I used their DNS instead of getting my -own, resulting in <http://ikiwiki-test.nfshost.com/> +own, resulting in <http://ikiwiki-test.nfshost.com/>. (Not being kept up +anymore.) They gave me 2 cents free funding for signing up, which is enough to pay for 10 megabytes of bandwidth, or about a thousand typical page views, at diff --git a/doc/tips/nearlyfreespeech/discussion.mdwn b/doc/tips/nearlyfreespeech/discussion.mdwn index a003760b9..b76432566 100644 --- a/doc/tips/nearlyfreespeech/discussion.mdwn +++ b/doc/tips/nearlyfreespeech/discussion.mdwn @@ -9,3 +9,14 @@ BEGIN failed--compilation aborted at (eval 19) line 2. perl is 5.8.9 > This is fixed in 3.1415926. --[[Joey]] + + +Hi!<br /> +How can i upgrade my nearlyfreespeech installation of ikiwiki? To install it i have followed your instructions here.<br> +But now if I want to upgrade it to a newer version?<br /> +Thanks for your incredible work! + +> You can move `~/ikiwiki` out of the way and then re-download and install +> it ikiwiki. --[[Joey]] + +Thanks a lot Joey. :-) diff --git a/doc/tips/optimising_ikiwiki.mdwn b/doc/tips/optimising_ikiwiki.mdwn new file mode 100644 index 000000000..d66ee9343 --- /dev/null +++ b/doc/tips/optimising_ikiwiki.mdwn @@ -0,0 +1,188 @@ +Ikiwiki is a wiki compiler, which means that, unlike a traditional wiki, +all the work needed to display your wiki is done up front. 
Where you can +see it and get annoyed at it. In some ways, this is better than a wiki +where a page view means running a program to generate the page on the fly. + +But enough excuses. If ikiwiki is taking too long to build your wiki, +let's fix that. Read on for some common problems that can be avoided to +make ikiwiki run quick. + +[[!toc]] + +(And if none of that helps, file a [[bug|bugs]]. One other great thing about +ikiwiki being a wiki compiler is that it's easy to provide a test case when +it's slow, and get the problem fixed!) + +## rebuild vs refresh + +Are you building your wiki by running a command like this? + + ikiwiki -setup my.setup + +If so, you're always telling ikiwiki to rebuild the entire site, from +scratch. But, ikiwiki is smart, it can incrementally update a site, +building only things affected by the changes you make. You just have to let +it do so: + + ikiwiki -setup my.setup -refresh + +Ikiwiki automatically uses an incremental refresh like this when handing +a web edit, or when run from a [[rcs]] post-commit hook. (If you've +configured the hook in the usual way.) Most people who have run into this +problem got in the habit of running `ikiwiki -setup my.setup` by hand +when their wiki was small, and found it got slower as they added pages. + +## use the latest version + +If your version of ikiwiki is not [[!version]], try upgrading. New +optimisations are frequently added to ikiwiki, some of them yielding +*enormous* speed increases. + +## run ikiwiki in verbose mode + +Try changing a page, and run ikiwiki with `-v` so it will tell you +everything it does to deal with that changed page. Take note of +which other pages are rebuilt, and which parts of the build take a long +time. This can help you zero in on individual pages that contain some of +the expensive things listed below. + +## expensive inlines + +Do you have an archive page for your blog that shows all posts, +using an inline that looks like this? + + \[[!inline pages="blog/*" show=0]] + +Or maybe you have some tag pages for your blog that show all tagged posts, +something like this? + + \[[!inline pages="blog/* and tagged(foo)" show=0]] + +These are expensive, because they have to be updated whenever you modify a +matching page. And, if there are a lot of pages, it generates a large html +file, which is a lot of work. And also large RSS/Atom files, which is even +more work! + +To optimise the inline, consider enabling quick archive mode. Then the +inline will only need to be updated when new pages are added; no RSS +or Atom feeds will be built, and the generated html file will be much +smaller. + + \[[!inline pages="blog/*" show=0 archive=yes quick=yes]] + + \[[!inline pages="blog/* and link(tag)" show=0 archive=yes quick=yes]] + +Only downsides: This won't show titles set by the [[ikiwiki/directive/meta]] +directive. And there's no RSS feed for users to use -- but if this page +is only for the archives or tag for your blog, users should be subscribing +to the blog's main page's RSS feed instead. + +For the main blog page, the inline should only show the latest N posts, +which won't be a performance problem: + + \[[!inline pages="blog/*" show=30]] + +## expensive maps + +Do you have a sitemap type page, that uses a map directive like this? + + \[[!map pages="*" show=title]] + +This is expensive because it has to be updated whenever a page is modified. +The resulting html file might get big and expensive to generate as you +keep adding pages. + +First, consider removing the "show=title". 
Then the map will not show page +titles set by the [[ikiwiki/directive/meta]] directive -- but will also +only need to be generated when pages are added or removed, not for every +page change. + +Consider limiting the map to only show the toplevel pages of your site, +like this: + + \[[!map pages="* and !*/*" show=title]] + +Or, alternatively, to drop from the map parts of the site that accumulate +lots of pages, like individual blog posts: + + \[[!map pages="* and !blog/*" show=title]] + +## sidebar issues + +If you enable the [[plugins/sidebar]] plugin, be careful of what you put in +your sidebar. Any change that affects what is displayed by the sidebar +will require an update of *every* page in the wiki, since all pages include +the sidebar. + +Putting an expensive map or inline in the sidebar is the most common cause +of problems. At its worst, it can result in any change to any page in the +wiki requiring every page to be rebuilt. + +## avoid htmltidy + +A few plugins do neat stuff, but slowly. Such plugins are tagged +[[plugins/type/slow]]. + +The worst offender is possibly [[plugins/htmltidy]]. This runs an external +`tidy` program on each page that is built, which is necessarily slow. So don't +use it unless you really need it; consider using the faster +[[plugins/htmlbalance]] instead. + +## be careful of large linkmaps + +[[plugins/Linkmap]] generates a cool map of links between pages, but +it does it using the `graphviz` program. And any changes to links between +pages on the map require an update. So, avoid using this to map a large number +of pages with frequently changing links. For example, using it to map +all the pages on a traditional, highly WikiLinked wiki, is asking for things +to be slow. But using it to map a few related pages is probably fine. + +This site's own [[plugins/linkmap]] rarely slows it down, because it +only shows the index page, and the small set of pages that link to it. +That is accomplished as follows: + + \[[!linkmap pages="index or (backlink(index)"]] + +## overhead of the search plugin + +Be aware that the [[plugins/search]] plugin has to update the search index +whenever any page is changed. This can slow things down somewhat. + +## profiling + +If you have a repeatable change that ikiwiki takes a long time to build, +and none of the above help, the next thing to consider is profiling +ikiwiki. + +The best way to do it is: + +* Install [[!cpan Devel::NYTProf]] +* `PERL5OPT=-d:NYTProf` +* `export PER5OPT` +* Now run ikiwiki as usual, and it will generate a `nytprof.out` file. +* Run `nytprofhtml` to generate html files. +* Those can be examined to see what parts of ikiwiki are being slow. + +## scaling to large numbers of pages + +Finally, let's think about how huge number of pages can affect ikiwiki. + +* Every time it's run, ikiwiki has to scan your `srcdir` to find + new and changed pages. This is similar in speed to running the `find` + command. Obviously, more files will make it take longer. + +* Also, to see what pages match a [[ikiwiki/PageSpec]] like "blog/*", it has + to check if every page in the wiki matches. These checks are done quite + quickly, but still, lots more pages will make PageSpecs more expensive. + +* The backlinks calculation has to consider every link on every page + in the wiki. (In practice, most pages only link to at most a few dozen + other pages, so this is not a `O(N^2)`, but closer to `O(N)`.) 
+ +* Ikiwiki also reads and writes an `index` file, which contains information + about each page, and so if you have a lot of pages, this file gets large, + and more time is spent on it. For a wiki with 2000 pages, this file + will run about 500 kb. + +If your wiki will have 100 thousand files in it, you might start seeing +the above contribute to ikiwiki running slowly. diff --git a/doc/tips/parentlinks_style.mdwn b/doc/tips/parentlinks_style.mdwn index 5294e5452..f9dfa8f55 100644 --- a/doc/tips/parentlinks_style.mdwn +++ b/doc/tips/parentlinks_style.mdwn @@ -6,7 +6,7 @@ a subset of a page's parents. It also provides a few bonus possibilities, such as styling the parent links depending on their place in the path. -[[!toc ]] +[[!toc levels=2]] Content ======= @@ -77,10 +77,29 @@ following lines in `page.tmpl`: <a href="<TMPL_VAR NAME="URL">" class="height<TMPL_VAR NAME="HEIGHT">"> <TMPL_VAR NAME="PAGE"> </a> / + </TMPL_IF> </TMPL_LOOP> Then write the appropriate CSS bits for `a.height1`, etc. +Avoid showing title of toplevel index page +------------------------------------------ + +If you don't like having "index" appear on the top page of the wiki, +but you do want to see the name of the page otherwise, you can use a +special `HAS_PARENTLINKS` template variable that the plugin provides. +It is true for every page *except* the toplevel index. + +Here is an example of using it to hide the title of the toplevel index +page: + + <TMPL_LOOP NAME="PARENTLINKS"> + <a href="<TMPL_VAR NAME=URL>"><TMPL_VAR NAME=PAGE></a>/ + </TMPL_LOOP> + <TMPL_IF HAS_PARENTLINKS> + <TMPL_VAR TITLE> + </TMPL_IF> + Full-blown example ------------------ diff --git a/doc/tips/psgi.mdwn b/doc/tips/psgi.mdwn new file mode 100644 index 000000000..0d2eeefc8 --- /dev/null +++ b/doc/tips/psgi.mdwn @@ -0,0 +1,21 @@ +Here's the app.psgi file if you want to run ikiwiki with [PSGI](http://plackperl.org) instead of apache or other web servers: + + use Plack::App::CGIBin; + use Plack::Builder; + use Plack::App::File; + + builder { + mount '/ikiwiki.cgi' => Plack::App::CGIBin->new(file => './ikiwiki.cgi')->to_app; + enable "Plack::Middleware::Static", + path => sub { s!(^(?:/[^.]*)?/?$)!${1}/index.html! }, + root => '.'; + mount '/' => Plack::App::File->new(root => ".")->to_app; + }; + +Put it in your destdir and now your can run `plackup -p <port>`. + +Note that you should configure your `url` and `cgiurl` to point to the listening address of plackup. + +Also, the app.psgi residing in the destdir means that /app.psgi is accessible from the web server. + +Hopefully some day ikiwiki web ui will speak psgi natively. diff --git a/doc/tips/spam_and_softwaresites.mdwn b/doc/tips/spam_and_softwaresites.mdwn new file mode 100644 index 000000000..a07889e6b --- /dev/null +++ b/doc/tips/spam_and_softwaresites.mdwn @@ -0,0 +1,87 @@ +Any wiki with a form of web-editing enabled will have to deal with +spam. (See the [[plugins/blogspam]] plugin for one defensive tool you +can deploy). + +If: + + * you are using ikiwiki to manage the website for a [[examples/softwaresite]] + * you allow web-based commits, to let people correct documentation, or report + bugs, etc. + * the documentation is stored in the same revision control repository as your + software + +It is undesirable to have your software's VCS history tainted by spam and spam +clean-up commits. Here is one approach you can use to prevent this. This +example is for the [[git]] version control system, but the principles should +apply to others. 
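In outline, the approach looks like this (a sketch using the branch name
`doc`, which the sections below also use):

    # keep web-originated commits on their own branch
    $ git checkout -b doc
    # ...and point the wiki at it: gitmaster_branch => 'doc' in the setup file

    # periodically, once the doc branch is clean, merge it into master
    $ git checkout master
    $ git merge --ff doc

    # if spam lands on the doc branch, either revert it there...
    $ git checkout doc
    $ git revert HEAD
    # ...or, more drastically, rewrite its history to drop the spam commits
    $ git rebase --interactive master

The sections below walk through each of these steps and their caveats.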
+ +## Isolate web commits to a specific branch + +Create a separate branch to contain web-originated edits (named `doc` in this +example): + + $ git checkout -b doc + +Adjust your setup file accordingly: + + gitmaster_branch => 'doc', + +## merging good web commits into the master branch + +You will want to periodically merge legitimate web-based commits back into +your master branch. Ensure that there is no spam in the documentation +branch. If there is, see 'erase spam from the commit history', below, first. + +Once you are confident it's clean: + + # ensure you are on the master branch + $ git branch + doc + * master + $ git merge --ff doc + +## removing spam + +### short term + +In the short term, just revert the spammy commit. + +If the spammy commit was the top-most: + + $ git revert HEAD + +This will clean the spam out of the files, but it will leave both the spam +commit and the revert commit in the history. + +### erase spam from the commit history + +Git allows you to rewrite your commit history. We will take advantage of this +to eradicate spam from the history of the doc branch. + +This is a useful tool, but it is considered bad practise to rewrite the +history of public repositories. If your software's repository is public, you +should make it clear that the history of the `doc` branch in your repository +is unstable. + +Once you have been spammed, use `git rebase` to remove the spam commits from +the history. Assuming that your `doc` branch was split off from a branch +called `master`: + + # ensure you are on the doc branch + $ git branch + * doc + master + $ git rebase --interactive master + +In your editor session, you will see a series of lines for each commit made to +the `doc` branch since it was branched from `master` (or since the last merge +back into `master`). Delete the lines corresponding to spammy commits, then +save and exit your editor. + +Caveat: if there are no commits you want to keep (i.e. all the commits since +the last merge into master are either spam or spam reverts) then `git rebase` +will abort. Therefore, this approach only works if you have at least one +non-spam commit to the documentation since the last merge into `master`. For +this reason, it's best to wait until you have at least one +commit you want merged back into the main history before doing a rebase, +and until then, tackle spam with reverts. diff --git a/doc/tips/spam_and_softwaresites/discussion.mdwn b/doc/tips/spam_and_softwaresites/discussion.mdwn new file mode 100644 index 000000000..21f0a5d7e --- /dev/null +++ b/doc/tips/spam_and_softwaresites/discussion.mdwn @@ -0,0 +1,8 @@ +In the cleanup spam section: + +> Caveat: if there are no commits you want to keep (i.e. all the commits since the last merge into master are either spam or spam reverts) then git rebase will abort. + +Wouldn't it be enough then to use `git reset --hard` to the desired last good commit? + +regards, +iustin diff --git a/doc/tips/switching_to_usedirs.mdwn b/doc/tips/switching_to_usedirs.mdwn index 183ce00ac..92871439f 100644 --- a/doc/tips/switching_to_usedirs.mdwn +++ b/doc/tips/switching_to_usedirs.mdwn @@ -8,9 +8,7 @@ to usedirs, or edit your setup file and turn usedirs back off. or manually. * Since usedirs is enabled, ikiwiki will have created a bunch of new html files. Where before ikiwiki generated a `dest/foo.html`, now it will - generate `dest/foo/index.html`. But, the old html files will still be - present too. 
Remove them: - find dest -name \*.html -not -name index.html -exec rm {} \; + generate `dest/foo/index.html`. The old html files will be removed. * If you have a blog that is aggregated on a Planet or similar, all the items in the RSS or atom feed will seem like new posts, since their URLs have changed. See [[howto_avoid_flooding_aggregators]] for a workaround. diff --git a/doc/tips/untrusted_git_push.mdwn b/doc/tips/untrusted_git_push.mdwn index 3573a0ddf..948a55063 100644 --- a/doc/tips/untrusted_git_push.mdwn +++ b/doc/tips/untrusted_git_push.mdwn @@ -68,7 +68,7 @@ untrusted changes. It should *not* include the user that ikiwiki normally runs as. Once you're done modifying the setup file, don't forget to run -`ikiwiki -setup --refresh --wrappers` on it. +`ikiwiki --setup ikiwiki.setup --refresh --wrappers` on it. ## git setup @@ -112,11 +112,3 @@ abort the push before refs are updated. However, the changeset will still be present in your repository, wasting space. Since nothing refers to it, it will be expired eventually. You can speed up the expiry by running `git prune`. - -When aborting a push, ikiwiki displays an error message about why it didn't -accept it. If using git over ssh, the user will see this error message, -which is probably useful to them. But `git-daemon` is buggy, and hides this -message from the user. This can make it hard for users to figure out why -their push was rejected. (If this happens to you, look at "'git log --stat -origin/master..`" and think about whether your changes would be accepted -over the web interface.) diff --git a/doc/tips/vim_and_ikiwiki.mdwn b/doc/tips/vim_and_ikiwiki.mdwn new file mode 100644 index 000000000..e4136aa5d --- /dev/null +++ b/doc/tips/vim_and_ikiwiki.mdwn @@ -0,0 +1,28 @@ +# Vim and ikiwiki + +## Syntax highlighting + +[ikiwiki-syntax](http://www.vim.org/scripts/script.php?script_id=3156) is a vim +syntax highlighting file for ikiwiki [[ikiwiki/markdown]] files. It highlights +directives and wikilinks. It only supports prefixed directives, i.e., +\[[!directive foo=bar baz]], not the old format with spaces. + +------ + +The previous syntax definition for ikiwiki links is at [[vim_syntax_highlighting/ikiwiki.vim]]; however, +it seems to not be [[maintained +anymore|forum/navigation_of_wiki_pages_on_local_filesystem_with_vim#syn-maintenance]], +and it has some [[issues|forum/ikiwiki_vim_syntaxfile]]. + +## Page creation and navigation + +The [ikiwiki-nav](http://www.vim.org/scripts/script.php?script_id=2968) package +is a vim plugin that enables you to do the following from inside vim: + + * Jumping to the file corresponding to the wikilink under the cursor. + * Creating the file corresponding to the wikilink under the cursor (including + directories if necessary.) + * Jumping to the previous/next wikilink in the current file. + * Autocomplete link names. + +Download it from [here](http://www.vim.org/scripts/script.php?script_id=2968) diff --git a/doc/tips/vim_syntax_highlighting.mdwn b/doc/tips/vim_syntax_highlighting.mdwn index 172b763c3..8f2fdc1f0 100644 --- a/doc/tips/vim_syntax_highlighting.mdwn +++ b/doc/tips/vim_syntax_highlighting.mdwn @@ -1,4 +1,20 @@ -[[ikiwiki.vim]] is a vim syntax highlighting file for ikiwiki -[[ikiwiki/markdown]] files. +This page is deprecated. See [[tips/vim_and_ikiwiki]] for the most up to date +content -Installation instructions are at the top of the file. 
+-------- + +[ikiwiki-syntax](http://www.vim.org/scripts/script.php?script_id=3156) is a vim +syntax highlighting file for ikiwiki [[ikiwiki/markdown]] files. It highlights +directives and wikilinks. It only supports prefixed directives, i.e., +\[[!directive foo=bar baz]], not the old format with spaces. + +See also: [[follow_wikilinks_from_inside_vim]] + +------ + +The previous syntax definition for ikiwiki links is at [[ikiwiki.vim]]; however, +it seems to not be [[maintained +anymore|forum/navigation_of_wiki_pages_on_local_filesystem_with_vim#syn-maintenance]], +and it has some [[issues|forum/ikiwiki_vim_syntaxfile]]. + +[[!tag vim]] diff --git a/doc/tips/yaml_setup_files.mdwn b/doc/tips/yaml_setup_files.mdwn new file mode 100644 index 000000000..e8ab4f144 --- /dev/null +++ b/doc/tips/yaml_setup_files.mdwn @@ -0,0 +1,12 @@ +Here's how to convert your existing standard format ikiwiki setup file into +the new YAML format recently added to ikiwiki. + +1. First, make sure you have the [[!cpan YAML]] perl module installed. + (Run: `apt-get install libyaml-perl`) +2. Run: `ikiwiki -setup my.setup -dumpsetup my.setup --set setuptype=Yaml` + +The format of the YAML setup file should be fairly self-explanatory. + +(To convert the other way, use "setuptype=Standard" instead.) + +--[[Joey]] diff --git a/doc/todo/ACL.mdwn b/doc/todo/ACL.mdwn index e9fb2717f..dd9793233 100644 --- a/doc/todo/ACL.mdwn +++ b/doc/todo/ACL.mdwn @@ -21,6 +21,11 @@ something, that I think is very valuable. >>>> Which would rule out openid, or other fun forms of auth. And routing all access >>>> through the CGI sort of defeats the purpose of ikiwiki. --[[Ethan]] +>>>>> I think what Joey is suggesting is to use apache ACLs in conjunction +>>>>> with basic HTTP auth to control read access, and ikiwiki can use the +>>>>> information via the httpauth plugin for other ACLs (write, admin). But +>>>>> yes, that would rule out non-httpauth mechanisms. -- [[Jon]] + Also see [[!debbug 443346]]. > Just a few quick thoughts about this: @@ -69,3 +74,25 @@ Here is how I see it: <pre> \[[!acl user=* page=/subsite/* acl=/subsite/acl.mdwn]] </pre> + +Any idea when this is going to be finished? If you want, I am happy to beta test. + +> It's already done, though that is sorta hidden in the above. :-) +> Example of use to only allow two users to edit the tipjar page: +> locked_pages => 'tipjar and !(user(joey) or user(bob))', +> --[[Joey]] + +> > Thank you for the hint but I am being still confused (read: dense)... What I am trying to do is this: + +> > * No anonymous access. +> > * Logged in users can edit and create pages. +> > * Users can set who can edit their pages. +> > * Some pages are only viewable by admins. + +> > Is it possible? If so how?... + +>>> I don't believe this is currently possible. What is missing is the concept +>>> of page 'ownership'. -- [[Jon]] + +>>>> GAH! That is really a shame... Any chance of adding that? No, I do not really expect it to be added, after all my requirements are pushing the boundary of what a wikiwiki + should be. Nonetheless, thanks for your help! diff --git a/doc/todo/Add_HTML_support_to_po_plugin.mdwn b/doc/todo/Add_HTML_support_to_po_plugin.mdwn new file mode 100644 index 000000000..ec29e4f61 --- /dev/null +++ b/doc/todo/Add_HTML_support_to_po_plugin.mdwn @@ -0,0 +1,7 @@ +The HTML page type should be fully supported by the PO plugin: po4a's +HTML support is able to extract translatable strings and to disregard +the rest. + +This is implemented in my po branch, please review. 
--[[intrigeri]] + +[[!tag patch]] diff --git a/doc/todo/Add_label_to_search_form_input_field.mdwn b/doc/todo/Add_label_to_search_form_input_field.mdwn index e4e83428c..514108fba 100644 --- a/doc/todo/Add_label_to_search_form_input_field.mdwn +++ b/doc/todo/Add_label_to_search_form_input_field.mdwn @@ -47,4 +47,10 @@ The patch below adds a label for the field to improve usability: > to get it to appear higher up is to put it first, or to use Evil absolute > positioning. (CSS sucks.) --[[Joey]] -[[!tag done wishlist]] +> Update: html5 allows just adding `placeholder="Search"` to the input +> element. already works in eg, chromium. However, ikiwiki does not use +> html5 yet. --[[Joey]] + +>> [[Done]], placeholder added, in html5 mode only. + +[[!tag wishlist bugs/html5_support]] diff --git a/doc/todo/Add_nicer_math_formatting.mdwn b/doc/todo/Add_nicer_math_formatting.mdwn new file mode 100644 index 000000000..3a5e94a14 --- /dev/null +++ b/doc/todo/Add_nicer_math_formatting.mdwn @@ -0,0 +1,26 @@ +It would be nice to add nicer math formatting. I currently use the +[[plugins/teximg]] plugin, but I wonder if +[jsMath](http://www.math.union.edu/~dpvc/jsMath/) wouldn't be a better option. + +[[Will]] + +> I've looked at jsmath (which is nicely packaged in Debian), and +> I agree that this is nicer than TeX images. That text-mode browsers +> get to see LaTeX as a fallback is actually a nice feature (better +> than nothing, right? :) That browsers w/o javascript will not be able to +> see the math either is probably ok. +> +> A plugin would probably be a pretty trivial thing to write. +> It just needs to include the javascript files, +> and slap a `<div class="math"> avound the user's code`, then +> call `jsMath.Process(document);` at the end of the page. +> +> My only concern is security: Has jsMath's parser been written +> to be safe when processing untrusted input? Could a user abuse the +> parser to cause it to emit/run arbitrary javascript code? +> I've posted a question about this to its forum: --[[Joey]] +> <https://sourceforge.net/projects/jsmath/forums/forum/592273/topic/3831574> + +I think [mathjax](http://www.mathjax.org/) would be the best option. This is the math rendering engine used in mathoverflow. + +[[!tag wishlist]] diff --git a/doc/todo/CSS_classes_for_links.mdwn b/doc/todo/CSS_classes_for_links.mdwn index 38db87724..29ed3770e 100644 --- a/doc/todo/CSS_classes_for_links.mdwn +++ b/doc/todo/CSS_classes_for_links.mdwn @@ -101,3 +101,38 @@ I find CSS3 support still spotty... Here are some notes on how to do this in Ik >>> >>> `htmllink` can never be used to generate an external link. So, >>> patching it seems the best approach. --[[Joey]] + +>>>> I had a quick look to this issue. Internal links are generated at +>>>> 11 places in the Perl code and would need to be patched (this +>>>> number could be lowered a bit if a htmllink-like function existed +>>>> for CGI urls; such a function would use `cgiurl`, and be used in +>>>> most places where `cgiurl` is currently called by plugins). +>>>> +>>>> Also, more than 30 `<a>` links appear in templates, most of those +>>>> being internal links. +>>>> +>>>> Sure, patching those few dozen places is trivial. On the other +>>>> hand, I'm wondering how doable it would be to make sure, on the +>>>> long run, any generated internal link has the right CSS class +>>>> applied. One would need to write tests running against the code +>>>> with all plugins enabled, all templates put to work, in order to +>>>> ensure consistency is maintained. 
--[[intrigeri]] + +----- +If you're going to be patching htmllink anyway, might I suggest something more flexible, like being able to configure the link format? +(Yes, PmWiki allows this, that's where I got the idea) +That is, rather than having "<a href=". blah . blah ... +one could use a sprintf with a default format which could be configured in the setup file. + +For example: + + $format = ($config{createlink_format} + ? $config{createlink_format} + : '<span class=\"createlink\"><a href="%s" rel="nofollow">?</a>%s</span>'); + return sprintf($format, + cgiurl(do => "create", page => lc($link), from => $lpage), + $linktext); + +I admit, I've been wanting something like this for a long time, because I dislike the existing createlink format... + +--[[KathrynAndersen]] diff --git a/doc/todo/Extensible_inlining.mdwn b/doc/todo/Extensible_inlining.mdwn new file mode 100644 index 000000000..994ed0759 --- /dev/null +++ b/doc/todo/Extensible_inlining.mdwn @@ -0,0 +1,263 @@ +Here's an idea with [[patch]] for extending inline in two directions: + +1. Permit the content-fetching function to return undef to skip a page. The limiting of @list to a set size is performed after that filtering. +2. Permit other directive plugins to pass a function to generate content via an inliner_ parameter. The current patch doesn't try to remove that key from the parameters, so hilarity might ensue if someone is too clever. I suppose I should fix that... My *intent* is that other, custom directives can add inliner_. + +The diff looks large because the first requires switching some loops. + +I'm using this along with a custom BibTeX formatter (one item per file) to generate larger pages and tiny listings. I still need to hammer the templates for that, but I think that's possible without further patches. + +(Setting up a git branch for a single plugin is a pain, but I can if necessary. I also could separate this into some sequence rather than all at once, but I won't have time for a week or two.) + +-- [[JasonRiedy]] + +<pre><code> +--- /home/ejr/src/git.ikiwiki.info/IkiWiki/Plugin/inline.pm 2011-03-05 14:18:30.261293808 -0500 ++++ inline.pm 2011-03-06 21:44:18.887903638 -0500 +@@ -185,6 +185,7 @@ + } + + my @list; ++ my $num = 0; + + if (exists $params{pagenames}) { + foreach my $p (qw(sort pages)) { +@@ -213,23 +214,121 @@ + if ($params{feedshow} && $num < $params{feedshow} && $num > 0) { + $num=$params{feedshow}; + } +- if ($params{skip} && $num) { +- $num+=$params{skip}; +- } + + @list = pagespec_match_list($params{page}, $params{pages}, + deptype => deptype($quick ? "presence" : "content"), + filter => sub { $_[0] eq $params{page} }, + sort => exists $params{sort} ? $params{sort} : "age", + reverse => yesno($params{reverse}), +- ($num ? (num => $num) : ()), + ); + } + + if (exists $params{skip}) { + @list=@list[$params{skip} .. $#list]; + } ++ ++ if ($params{show} && $params{show} > $num) { ++ $num = $params{show} ++ } ++ ++ my $ret=""; ++ my @displist; ++ if ($feedonly) { ++ @displist = @list; ++ } else { ++ my $template; ++ if (! 
$raw) { ++ # cannot use wiki pages as templates; template not sanitized due to ++ # format hook hack ++ eval { ++ $template=template_depends($params{template}.".tmpl", $params{page}, ++ blind_cache => 1); ++ }; ++ if ($@) { ++ error sprintf(gettext("failed to process template %s"), $params{template}.".tmpl").": $@"; ++ } ++ } ++ my $needcontent=$raw || (!($archive && $quick) && $template->query(name => 'content')); ++ ++ foreach my $page (@list) { ++ last if ($num && scalar @displist >= $num); ++ my $file = $pagesources{$page}; ++ my $type = pagetype($file); ++ if (! $raw) { ++ # Get the content before populating the ++ # template, since getting the content uses ++ # the same template if inlines are nested. ++ if ($needcontent) { ++ my $content; ++ if (exists $params{inliner_} && defined $params{inliner_}) { ++ $content = &{$params{inliner_}}($page, $template, %params); ++ } else { ++ $content=get_inline_content($page, $params{destpage}); ++ } ++ next if !defined $content; ++ $template->param(content => $content); ++ push @displist, $page; ++ } ++ $template->param(pageurl => urlto($page, $params{destpage})); ++ $template->param(inlinepage => $page); ++ $template->param(title => pagetitle(basename($page))); ++ $template->param(ctime => displaytime($pagectime{$page}, $params{timeformat}, 1)); ++ $template->param(mtime => displaytime($pagemtime{$page}, $params{timeformat})); ++ $template->param(first => 1) if $page eq $list[0]; ++ $template->param(last => 1) if ($num && scalar @displist == $num); ++ $template->param(html5 => $config{html5}); + ++ if ($actions) { ++ my $file = $pagesources{$page}; ++ my $type = pagetype($file); ++ if ($config{discussion}) { ++ if ($page !~ /.*\/\Q$config{discussionpage}\E$/i && ++ (length $config{cgiurl} || ++ exists $pagesources{$page."/".lc($config{discussionpage})})) { ++ $template->param(have_actions => 1); ++ $template->param(discussionlink => ++ htmllink($page, ++ $params{destpage}, ++ $config{discussionpage}, ++ noimageinline => 1, ++ forcesubpage => 1)); ++ } ++ } ++ if (length $config{cgiurl} && ++ defined $type && ++ IkiWiki->can("cgi_editpage")) { ++ $template->param(have_actions => 1); ++ $template->param(editurl => cgiurl(do => "edit", page => $page)); ++ ++ } ++ } ++ ++ run_hooks(pagetemplate => sub { ++ shift->(page => $page, destpage => $params{destpage}, ++ template => $template,); ++ }); ++ ++ $ret.=$template->output; ++ $template->clear_params; ++ } ++ else { ++ if (defined $type) { ++ $ret.="\n". ++ linkify($page, $params{destpage}, ++ preprocess($page, $params{destpage}, ++ filter($page, $params{destpage}, ++ readfile(srcfile($file))))); ++ } ++ else { ++ $ret.="\n". ++ readfile(srcfile($file)); ++ } ++ push @displist, $page; ++ } ++ } ++ } ++ @list = @displist; ++ + my @feedlist; + if ($feeds) { + if (exists $params{feedshow} && +@@ -241,10 +340,6 @@ + } + } + +- if ($params{show} && @list > $params{show}) { +- @list=@list[0..$params{show} - 1]; +- } +- + if ($feeds && exists $params{feedpages}) { + @feedlist = pagespec_match_list( + $params{page}, "($params{pages}) and ($params{feedpages})", +@@ -302,8 +397,6 @@ + } + } + +- my $ret=""; +- + if (length $config{cgiurl} && ! $params{preview} && (exists $params{rootpage} || + (exists $params{postform} && yesno($params{postform}))) && + IkiWiki->can("cgi_editpage")) { +@@ -355,91 +448,7 @@ + } + $ret.=$linktemplate->output; + } +- +- if (! $feedonly) { +- my $template; +- if (! 
$raw) { +- # cannot use wiki pages as templates; template not sanitized due to +- # format hook hack +- eval { +- $template=template_depends($params{template}.".tmpl", $params{page}, +- blind_cache => 1); +- }; +- if ($@) { +- error sprintf(gettext("failed to process template %s"), $params{template}.".tmpl").": $@"; +- } +- } +- my $needcontent=$raw || (!($archive && $quick) && $template->query(name => 'content')); +- +- foreach my $page (@list) { +- my $file = $pagesources{$page}; +- my $type = pagetype($file); +- if (! $raw) { +- if ($needcontent) { +- # Get the content before populating the +- # template, since getting the content uses +- # the same template if inlines are nested. +- my $content=get_inline_content($page, $params{destpage}); +- $template->param(content => $content); +- } +- $template->param(pageurl => urlto($page, $params{destpage})); +- $template->param(inlinepage => $page); +- $template->param(title => pagetitle(basename($page))); +- $template->param(ctime => displaytime($pagectime{$page}, $params{timeformat}, 1)); +- $template->param(mtime => displaytime($pagemtime{$page}, $params{timeformat})); +- $template->param(first => 1) if $page eq $list[0]; +- $template->param(last => 1) if $page eq $list[$#list]; +- $template->param(html5 => $config{html5}); +- +- if ($actions) { +- my $file = $pagesources{$page}; +- my $type = pagetype($file); +- if ($config{discussion}) { +- if ($page !~ /.*\/\Q$config{discussionpage}\E$/i && +- (length $config{cgiurl} || +- exists $pagesources{$page."/".lc($config{discussionpage})})) { +- $template->param(have_actions => 1); +- $template->param(discussionlink => +- htmllink($page, +- $params{destpage}, +- $config{discussionpage}, +- noimageinline => 1, +- forcesubpage => 1)); +- } +- } +- if (length $config{cgiurl} && +- defined $type && +- IkiWiki->can("cgi_editpage")) { +- $template->param(have_actions => 1); +- $template->param(editurl => cgiurl(do => "edit", page => $page)); + +- } +- } +- +- run_hooks(pagetemplate => sub { +- shift->(page => $page, destpage => $params{destpage}, +- template => $template,); +- }); +- +- $ret.=$template->output; +- $template->clear_params; +- } +- else { +- if (defined $type) { +- $ret.="\n". +- linkify($page, $params{destpage}, +- preprocess($page, $params{destpage}, +- filter($page, $params{destpage}, +- readfile(srcfile($file))))); +- } +- else { +- $ret.="\n". +- readfile(srcfile($file)); +- } +- } +- } +- } +- + if ($feeds && ($emptyfeeds || @feedlist)) { + if ($rss) { + my $rssp=$feedbase."rss".$feednum; +</code></pre> diff --git a/doc/todo/Fix_selflink_in_po_plugin.mdwn b/doc/todo/Fix_selflink_in_po_plugin.mdwn new file mode 100644 index 000000000..b276c075d --- /dev/null +++ b/doc/todo/Fix_selflink_in_po_plugin.mdwn @@ -0,0 +1,21 @@ +Using the po plugin, a link to /bla is present in the sidebar. +When viewing /bla in the default language, this link is detected as +a selflink. When viewing a translation of /bla, it +isn't. --[[intrigeri]] + +Fixed in my po branch. --[[intrigeri]] + +[[!tag patch done]] + +> bump? + +>> I know I've looked at 88c6e2891593fd508701d728602515e47284180c +>> before, and something about it just seemed wrong. Maybe it's +>> the triviality of the sub, which it would seem to be easy to +>> decide to refactor back into its one caller (which would reintroduce the +>> bug). --[[Joey]] + +>>> Well, I can hear and understand this. 
Apart of adding a comment to +>>> the sub, explaining the rationale (which is now done in my po +>>> branch), I don't know what I can do to make it not seem wrong. +>>> --[[intrigeri]] diff --git a/doc/todo/Google_Analytics_support.mdwn b/doc/todo/Google_Analytics_support.mdwn new file mode 100644 index 000000000..8bbb1c69b --- /dev/null +++ b/doc/todo/Google_Analytics_support.mdwn @@ -0,0 +1,31 @@ +[[!template id=gitbranch branch=GiuseppeBilotta/google-analytics +author="[[GiuseppeBilotta]]"]] + +I've extended the google plugin to add support for Google Analytics. +This is done in two steps: + +* a `google_sitesearch` config option is introduced, to allow disabling + sitesearch even when the `google` plugin is loaded +* a `google_analytics_account` config option is introduced. When it's + defined, its value is assumed to be a Google Analytics account ID + and the corresponding JavaScript code is automatically inserted in all + documents. The way this is done is shamelessy stolen from the flattr + plugin + +> Putting this in the google plugin does not seem to be a good approach. +> That this "functionality" is offered by the same company as google search +> is really of no consequence. + +Well, my idea was to put all Google-related functionality (in the sense +of support for any service provided by Google) into the google plugin. +The alternative would have been to have one separate plugin per feature, +but that doesn't sound particularly nice to me. I can split it in a +separate plugin if you believe it's cleaner that way + +> Also, can't this be easily accomplished by editing page.tmpl? --[[Joey]] + +Yes, and so would flattr. But precisely because this kind of code would require +editing page.tmpl, doing it the manual way carries the burden of keeping it in +sync across Ikiwiki updates (I'm sure I don't need to mention the number of +help requests that essentially boil down to "oops, I was using custom templates +and hadn't updated them"). diff --git a/doc/todo/Google_Sitemap_protocol.mdwn b/doc/todo/Google_Sitemap_protocol.mdwn index 057a88b72..ea8ee7f03 100644 --- a/doc/todo/Google_Sitemap_protocol.mdwn +++ b/doc/todo/Google_Sitemap_protocol.mdwn @@ -34,6 +34,9 @@ for an example. You will probably need to strip out the metadata variables I >>>[xtermin.us rather than localhost](http://xtermin.us/git/?p=website.git;a=blob;f=plugins/googlesitemap.pm) is 404 now. >>> -- weakish + +Although it is not able to read the meta-data from files, using google-sitemapgen [works well for me](http://bzed.de/posts/2010/06/creating_a_google_sitemap_for_ikiwiki/) to create a sitemap for my ikiwiki installation. -- [[bzed|BerndZeimetz]] + There is a [sitemap XML standard](http://www.sitemaps.org/protocol.php) that ikiwiki needs to generate for. # Google Webmaster tools and RSS @@ -45,3 +48,13 @@ On [Google Webmaster tools](https://www.google.com/webmasters/tools) you can sub [Google should grok feeds as sitemaps.](http://www.google.com/support/webmasters/bin/answer.py?answer=34654) Or rather [[plugins/inline]] should be improved to support the [sitemap protocol](http://sitemaps.org/protocol.php) natively. 
-- [[Hendry]] + + +Took me a minute to figure this out so I figured I'd share the steps I took: + +* Added rss=>1 and allowrss=>1 to my setup file +* Created a new page where the RSS would be created with this content, replacing "first_page" with the page in my wiki with the earliest date: + +<pre> +\[[!inline pages="* and !*/Discussion and created_after(first_page)" archive="yes" rss="yes" ]] +</pre> diff --git a/doc/todo/Improving_the_efficiency_of_match__95__glob.mdwn b/doc/todo/Improving_the_efficiency_of_match__95__glob.mdwn new file mode 100644 index 000000000..4e1df3381 --- /dev/null +++ b/doc/todo/Improving_the_efficiency_of_match__95__glob.mdwn @@ -0,0 +1,228 @@ +[[!template id=gitbranch branch=smcv/ready/glob-cache + author="[[KathrynAndersen]], [[smcv]]"]] +[[!tag patch]] + +I've been profiling my IkiWiki to try to improve speed (with many pages makes speed even more important) and I've written a patch to improve the speed of match_glob. This matcher is a good one to improve the speed of, because it gets called so many times. + +Here's my patch - please consider it! -- [[KathrynAndersen]] + +> It seems to me as though changing `glob2re` to return qr/$re/, and calling +> `memoize(glob2re)` next to the other memoize calls, would be a less +> verbose way to do this? --[[smcv]] + +>> I think so, yeah. Anyway, do you have any benchmark results handy, +>> Kathryn? --[[Joey]] + +>>> See below. +>>> Also, would it make more sense for glob2re to return qr/^$re$/i rather than qr/$re/? Everything that uses glob2re seems to use + $foo =~ /^$re$/i +>>> rather than /$re/ so I think that would make sense. +>>> -- [[KathrynAndersen]] + +>>>> Git branch `smcv/ka-glob-cache` has Kathryn's patch. Git +>>>> branch `smcv/memoize-glob2re` does as I suggested, which +>>>> is less verbose than Kathryn's patch but also not as +>>>> fast; I'm not sure why, tbh. --[[smcv]] + +>>>>> I think it's because my patch focuses on match_glob while the memoize patch focuses on `glob2re`, and `glob2re` is called in `filecheck`, `meta` and `po` as well as in `match_glob` and `match_user`; thus the memoized `glob2re` is dealing with a bigger set of globs to look up, and thus could be just that little bit slower. -- [[KathrynAndersen]] + +>>>>>> What may be going on is that glob2re is already a fairly fast +>>>>>> function, so the overhead of memoizing it with the very generic +>>>>>> `_memoizer` (see its source) swamps the memoization gain. Note +>>>>>> that the few functions memoized with the Memoizer before were much +>>>>>> more expensive, so that little overhead was acceptable then. +>>>>>> +>>>>>> It also may be that Kathryn's patch is slightly faster due to using +>>>>>> the construct `$foo =~ $regexp` rather than `$foo =~ /$regexp/` +>>>>>> (probably avoids a copy or something like that internally) -- +>>>>>> this despite checking both `exists` and `defined` on the hash, which +>>>>>> should be reundant AFAICS. +>>>>>> +>>>>>> My guess is that the best of both worlds would be to move +>>>>>> the byhand memoization to glob2re and have it return a compiled +>>>>>> `/^/i` regexp that can be used without further modifiction in most +>>>>>> cases. --[[Joey]] + +>>>>>>> Done, see `smcv/ready/glob-cache` and `smcv/glob-cache-too-far`. 
+>>>>>>> +>>>>>>> Kathryn's patch is a significant improvement; my first patch on top of +>>>>>>> that is a trivial cleanup that speeds it up a little, and the next two +>>>>>>> patches (using precompiled regexes) have surprisingly little effect +>>>>>>> (they don't slow it down either though, so either omit them or merge +>>>>>>> them, whichever). Detailed benchmark results below. +>>>>>>> +>>>>>>> Moving the memoization to `glob2re` actually seems to slow things down +>>>>>>> again - I suspect the docwiki has few enough mentions of `user()` etc. +>>>>>>> that caching them is a waste of time, but perhaps it's not the most +>>>>>>> representative. +>>>>>>> --[[smcv]] + +[[done]] --[[Joey]] + +-------------------------------------------------------------- + +[[!toggle id="smcv-benchmark" text="current benchmarks"]] + +[[!toggleable id="smcv-benchmark" text=""" +master at time of branch: + + time elapsed (wall): 29.6348 + time running program: 24.9212 (84.09%) + time profiling (est.): 4.7136 (15.91%) + number of calls: 1360181 + number of exceptions: 13 + + %Time Sec. #calls sec/call F name + 13.24 3.2986 3408 0.000968 Text::Balanced::_match_tagged + 10.94 2.7253 79514 0.000034 IkiWiki::PageSpec::match_glob + 3.19 0.7952 59454 0.000013 <anon>:IkiWiki/Plugin/inline.pm:223 + +`Improve the speed of match_glob`: + + time elapsed (wall): 27.9755 + time running program: 23.5293 (84.11%) + time profiling (est.): 4.4461 (15.89%) + number of calls: 1280875 + number of exceptions: 13 + + %Time Sec. #calls sec/call F name + 14.56 3.4257 3408 0.001005 Text::Balanced::_match_tagged + 7.82 1.8403 79514 0.000023 IkiWiki::PageSpec::match_glob + 3.27 0.7698 59454 0.000013 <anon>:IkiWiki/Plugin/inline.pm:223 + +`match_glob: streamline glob cache slightly`: + + time elapsed (wall): 27.5753 + time running program: 23.1714 (84.03%) + time profiling (est.): 4.4039 (15.97%) + number of calls: 1280875 + number of exceptions: 13 + + %Time Sec. #calls sec/call F name + 14.09 3.2637 3408 0.000958 Text::Balanced::_match_tagged + 7.74 1.7926 79514 0.000023 IkiWiki::PageSpec::match_glob + 3.30 0.7646 59454 0.000013 <anon>:IkiWiki/Plugin/inline.pm:223 + +`glob2re: return a precompiled, anchored case-insensitiv...`: + + time elapsed (wall): 27.5656 + time running program: 23.1464 (83.97%) + time profiling (est.): 4.4192 (16.03%) + number of calls: 1282189 + number of exceptions: 13 + + %Time Sec. #calls sec/call F name + 14.21 3.2891 3408 0.000965 Text::Balanced::_match_tagged + 7.72 1.7872 79514 0.000022 IkiWiki::PageSpec::match_glob + 3.32 0.7678 59454 0.000013 <anon>:IkiWiki/Plugin/inline.pm:223 + +`make use of precompiled regex objects`: + + time elapsed (wall): 27.5357 + time running program: 23.1289 (84.00%) + time profiling (est.): 4.4068 (16.00%) + number of calls: 1281981 + number of exceptions: 13 + + %Time Sec. #calls sec/call F name + 14.17 3.2776 3408 0.000962 Text::Balanced::_match_tagged + 7.70 1.7814 79514 0.000022 IkiWiki::PageSpec::match_glob + 3.35 0.7756 59454 0.000013 <anon>:IkiWiki/Plugin/inline.pm:223 + +`move memoization from match_glob to glob2re`: + + time elapsed (wall): 28.7677 + time running program: 23.9473 (83.24%) + time profiling (est.): 4.8205 (16.76%) + number of calls: 1360181 + number of exceptions: 13 + + %Time Sec. 
#calls sec/call F name + 13.98 3.3469 3408 0.000982 Text::Balanced::_match_tagged + 8.85 2.1194 79514 0.000027 IkiWiki::PageSpec::match_glob + 3.24 0.7750 59454 0.000013 <anon>:IkiWiki/Plugin/inline.pm:223 + +--[[smcv]] +"""]] + +-------------------------------------------------------------- + +[[!toggle id="ka-benchmarks" text="Kathryn's benchmarks"]] + +[[!toggleable id="ka-benchmarks" text=""" +Benchmarks done with Devel::Profile on the same testbed IkiWiki setup. I'm just showing the start of the profile output, since that's what's relevant. + +Before: +<pre> +time elapsed (wall): 27.4173 +time running program: 22.5909 (82.40%) +time profiling (est.): 4.8264 (17.60%) +number of calls: 1314729 +number of exceptions: 65 + +%Time Sec. #calls sec/call F name +11.05 2.4969 62333 0.000040 IkiWiki::PageSpec::match_glob + 4.10 0.9261 679 0.001364 Text::Balanced::_match_tagged + 2.72 0.6139 59812 0.000010 IkiWiki::SuccessReason::merge_influences +</pre> + +After: +<pre> +time elapsed (wall): 26.1843 +time running program: 21.5673 (82.37%) +time profiling (est.): 4.6170 (17.63%) +number of calls: 1252433 +number of exceptions: 65 + +%Time Sec. #calls sec/call F name + 7.66 1.6521 62333 0.000027 IkiWiki::PageSpec::match_glob + 4.33 0.9336 679 0.001375 Text::Balanced::_match_tagged + 2.81 0.6057 59812 0.000010 IkiWiki::SuccessReason::merge_influences +</pre> + +Note that the seconds per call for match_glob in the "after" case has gone down by about a third. + +K.A. +"""]] + +-------------------------------------------------------------- + +[[!toggle id="ka-patch" text="Kathryn's original patch"]] + +[[!toggleable id="ka-patch" text=""" + +<pre> +diff --git a/IkiWiki.pm b/IkiWiki.pm +index 08a3d78..c187b98 100644 +--- a/IkiWiki.pm ++++ b/IkiWiki.pm +@@ -2482,6 +2482,8 @@ sub derel ($$) { + return $path; + } + ++my %glob_cache; ++ + sub match_glob ($$;@) { + my $page=shift; + my $glob=shift; +@@ -2489,8 +2491,15 @@ sub match_glob ($$;@) { + + $glob=derel($glob, $params{location}); + +- my $regexp=IkiWiki::glob2re($glob); +- if ($page=~/^$regexp$/i) { ++ # Instead of converting the glob to a regex every time, ++ # cache the compiled regex to save time. ++ if (!exists $glob_cache{$glob} ++ or !defined $glob_cache{$glob}) ++ { ++ my $re=IkiWiki::glob2re($glob); ++ $glob_cache{$glob} = qr/^$re$/i; ++ } ++ if ($page =~ $glob_cache{$glob}) { + if (! IkiWiki::isinternal($page) || $params{internal}) { + return IkiWiki::SuccessReason->new("$glob matches $page"); + } +</pre> +"""]] +-------------------------------------------------------------- diff --git a/doc/todo/Mailing_list.mdwn b/doc/todo/Mailing_list.mdwn index b6a207420..67cbbb00b 100644 --- a/doc/todo/Mailing_list.mdwn +++ b/doc/todo/Mailing_list.mdwn @@ -18,3 +18,19 @@ Does this sound okay? > todo/bugs/forum feeds, or to some other feed they create on their user page. > And there's work on making the discussion pages more structured, on > accepting comments sent via mail, etc. --[[Joey]] + +>>I was going to make the very same request, so I'm glad to know I'm not the only one who felt the need for it. + +>>I can see your reasoning, though I don't think ikiwiki has reached the level (yet) of facilitating discussion as well as a mailing list does. +>>You've already pointed out the need for (a) more structured discussion pages, (b) comments sent via mail, but I'm not sure whether that will be enough. 
This is because the nature of a wiki means that discussions are scattered all over the site, as people discuss in discussion pages about the given topic - and so they should. The consequence of this, however, is that one has a choice (in regard to RSS feeds) of having too much or too little. Too little, if one only feeds on news/todo/bugs/forum, since one misses out on discussions elsewhere. Too much, because the only other option appears to be subscribing to recentchanges, which will give one *everything*, whether it is relevant or not. +>>Unfortunately, I'm not really sure what the best solution is for this problem. + +>> For those who might be interested, I've added the following RSS feeds to <http://www.dreamwidth.org>: +*ikiwiki_bugs_feed, +ikiwiki_forum_feed, +ikiwiki_news_feed, +ikiwiki_recent_feed, +ikiwiki_todo_feed, +ikiwiki_wishlist_feed* + +>>--[[KathrynAndersen]] diff --git a/doc/todo/More_flexible_po-plugin_for_translation.mdwn b/doc/todo/More_flexible_po-plugin_for_translation.mdwn new file mode 100644 index 000000000..3399f7834 --- /dev/null +++ b/doc/todo/More_flexible_po-plugin_for_translation.mdwn @@ -0,0 +1,5 @@ +I have a website with multi-language content, where some content is only in English, some in German, and some is available in both languages. + +The po-module currently has only one master-language, with slave languages, and a PageSpec should be considered. + +It would be nice to flag the content which should have a translation on a file-by-file basis (with some inline directive?) which could contain the information of the master-language for that file and the desired target-languages. diff --git a/doc/todo/Multiple_categorization_namespaces.mdwn b/doc/todo/Multiple_categorization_namespaces.mdwn new file mode 100644 index 000000000..3e9f8feaa --- /dev/null +++ b/doc/todo/Multiple_categorization_namespaces.mdwn @@ -0,0 +1,103 @@ +I came across this when working on converting my old blog into an ikiwiki, but I think it could be of more general use. + +The background: I have a (currently suspended, waiting to be converted) blog on the [il Cannocchiale](http://www.ilcannocchiale.it) hosting platform. Aside from the usual metatadata (title, author), il Cannocchiale also provides tags and two additional categorization namespaces: a blog-specific user-defind "column" (Rubrica) and a platform-wide "category" (Categoria). The latter is used to group and label a couple of platform-wide lists of latest posts, the former may be used in many different ways (e.g. multi-author blogs could have one column per author or so, or as a form of 'macro-tagging'). Columns are also a little more sophisticated than classical tags because you can assign them a subtitle too. + +When I started working on the conversion, my first idea was to convert Rubriche to subdirectories of an ikiwiki blog. However, this left me with a few annoying things: when rebuilding links from the import, I had to (programmatically) dive into each subdirectory to see where each post was; this would also be problematic for future posting, too. It also meant that moving a post from a Rubrica to the other would break all links (unless ikiwiki has a way to fix this automagically). And I wasn't too keen on the fact that the Rubrica would come up in the URL of the post. And finally, of course, I couldn't use this to preserve the Categoria metadata. 
+ +Another solution I thought about was to use special deeper tags for the Rubrica and Categoria (like: `\[[!tag "Rubrica/Some name"]]`), but this is horrible, clumsy, and makes special treatment of these tags a PITN (for example you wouldn't want the Rubrica to be displayed together with the other tags, and you would want it displayed somewhere else like next to the title of the post). This solution however looks to me as the proper path, as long as tags could support totally separate namespaces. I have a tentative implementation of this `tagtype` feature at [my git clone of ikiwiki](http://git.oblomov.eu/ikiwiki). + +The feature is currently implemented as follows: a `tagtypes` config options takes an array of strings: the tag types to be defined _aside from the usual tags_. Each tag type automatically provides a new directive which sets up tags that different from standard tags by having a different tagbase (the same as the tagtype) and link type (again, the same as the tagtype) (a TODO item for this would to make the directive, tagbase and link type customizable). For example, for my imported blog I would define + + tagtypes => [qw{Categoria Rubrica}] + +and then in the blog posts I would have stuff like + + \[[!Categoria "LAVORO/Vita da impiegato"]] + \[[!Rubrica "Il mio mondo"]] + \[[!meta title="Blah blah"]] + \[[!meta author="oblomov"]] + + The body of the article + + \[[!tag a bunch of tags]] + +and the tags would appear at the bottom of the post, the Rubrica next to the title, etc. All of this information would end up as categories in the feeds (although I would like to rework that code to make use of namespaces, terms and labels in a different way). + +> Note [[plugins/contrib/report/discussion]]. To quote myself from the latter page: +> *I find tags as they currently exist to be too limiting. I prefer something that can be used for Faceted Tagging http://en.wikipedia.org/wiki/Faceted_classification; that is, things like Author:Fred Nurk, Genre:Historical, Rating:Good, and so on. Of course, that doesn't mean that each tag is limited to only one value, either; just to take the above examples, something might have more than one author, or have multiple genres (such as Historical + Romance).* + +> So you aren't the only one who wants to do more with tags, but I don't think that adding a new directive for each tag type is the way to go; I think it would be simpler to just have one directive, and take advantage of the new [[matching different kinds of links]] functionality, and enhance the tag directive. +> Perhaps something like this: + + \[[!tag categorica="LAVORO/Vita da impiegato" rubrica="Il mio mondo"]] + +> Part of my thinking in this is to also combine tags with [[plugins/contrib/field]], so that the tags for a page could be queried and displayed; that way, one could put them wherever you wanted on the page, using any of [[plugins/contrib/getfield]], [[plugins/contrib/ftemplate]], or [[plugins/contrib/report]]. +> --[[KathrynAndersen]] + +>> A very generic metadata framework could cover all possible usages of fields, tags, and related metadata, but keeping its _user interface_ generic would only make it hard to use. Note that this is not an objection to the idea of collapsing the fields and tags functionality (at quick glance, I cannot see a real difference between single-valued custom tagtypes and fields, but see below), but more about the syntax. 
+ +>> I had thought about the `\[[!tag type1=value1 type2=value2]]` syntax myself, but ultimately decided against it for a number of reasons, most importantly the fact that (1) it's harder to type, (2) it's harder to spot errors in the tag types (so for example if one misspelled `categoria` as `categorica`, he might not notice it as quickly as seeing the un-parsed `\[[!categorica ]]` directive in the output html) and (3) it encourages collapsing possibly unrelated metadata together (for example, I would never consider putting the categoria information together with the rubrica one; of course with your syntax it's perfectly possible to keep them separate as well). + +>> Point (2) may be considered a downside as well as an upside, depending on perspective, of course. And it would be possible to have a set of predefined tag types to match against, like in my tagtype directive approach but with your syntax. + +>>> You seem to have answered your own objections already. -- K.A. + +>>Point (3) is of course entirely in the hands of the user, but that's exactly what syntax should be about. There is nothing functionally wrong with e.g. `\[[!meta tag=sometag author=someauthor title=sometitle rubrica=somecolumn]]`, but I honestly find it horrible. + +>>> So, really, point 3 comes down to differing aesthetics. -- K.A. + +>> A solution could be to allow both syntaxes, getting to have for example `\[[!sometagtype "blah"]]` as a shortcut for `\[[!tag sometagtype="blah"]]` (or, in the more general case, `\[[!somefieldname "blah"]]` as a shortcut for `\[[!meta fieldname="blah"]]`). + +>> I would like to point out however that there are some functional differences between categorization metadata vs other metadata that might suggest to keep fields and (my extended) tags separate. For examples, in feeds you'd want all categorization metadata to fall in one place, with some appropriate manipulation (which I still have to implement, by the way), while things like author or title would go to the corresponding feed item properties. Although it all would be possible with appropriate report or template juggling, having such default metadata handled natively looks like a bonus to me. + +>>> Whereas I prefer being able to control such things with templates, because it gives more flexibility AND control. - K.A. + +>>>> Flexibility and control is good for tuning and power-usage, but sensible defaults are a must for a platform to be usable out of the box without much intervention. Moreover, there's a possible problem with what kind of data must be passed over to templates. + +Aside from the name of the plugin (and thus of the main directive), which could be `tag`, `meta`, `field` or whatever (maybe extending `meta` would be the most sensible choice), the features we want are + +1. allow multiple values per type/attribute/field/whatever (fields currently only allows one) + * Agreed about multiple values; I've been considering whether I should add that to `field`. -- K.A. +2. allow both hidden and visible references (a la tag vs taglink) + * Hidden and visible references; that's fair enough too. My approach with `ymlfront` and `getfield` is that the YAML code is hidden, and the display is done with `getfield`, but there's no reason not to use additional approaches. -- K.A. +3. allow each type/attribute/field to be exposed under multiple queries (e.g. tags and categories; this is mostly important for backwards compatibility, not sure if it might have other uses too) + * I'm not sure what you mean here. -- K.A. 
+ * Typical example is tags: they are accessible both as `tags` and as `categories`, although the way they are presented changes a little -- G.B. +4. allow arbitrary types/attributes/fields/whatever (even 'undefined' ones) + * Are you saying that these must be typed, or are you saying that they can be user-defined? -- K.A. + * I am saying that the user should be able to define (e.g. in the config) some set of types/fields/attributes/whatever, following the specification illustrated below, but also be able to use something like `\[[!meta somefield="somevalue"]]` where `somefield` was never defined before. In this case `somefield` will have some default values for the properties described in the spec below. -- G.B. + +Each type/attribute/field/whatever (predefined, user-defined, arbitrary) would thus have the following parameters: + +* `directive` : the name of the directive that can be used to set the value as a hidden reference; we can discuss whether, for pre- or user-defined types, it being undef means no directive or a default directive matching the attribute name would be defined. + * I still want there to be able to be enough flexibility in the concept to enable plugins such as `yamlfront`, which sets the data using YAML format, rather than using directives. -- K.A. + * The possibility to use a directive does not preclude other ways of defining the field values. IOW, even if the directive `somefield` is defined, the user would still be able to use the syntax `\[[!meta somefield="somevalue"]]`, or any other syntax (such as YAML). -- G.B. +* `linkdirective` : the name of the directive that can be used for a visible reference; no such directive would be defined by default +* `linktype` : link type for (hidden and visible) references + * Is this the equivalent to "field name"? -- K.A. + * This would be such by default, but it could be set to something different. [[Typed links|matching_different_kinds_of_links]] is a very recent ikiwiki feature. -- G.B. +* `linkbase` : akin to the tagbase parameter + * Is this a field-name -> directory mapping? -- K.A. + * yes, with each directory having one page per value. It might not make sense for all fields, of course -- G.B. + * (nods) I've been working on something similar with my unreleased `tagger` module. In that, by default, the field-name maps to the closest wiki-page of the same name. Thus, if one had the field "genre=poetry" on the page fiction/stories/mary/lamb, then that would map to fiction/genre/poetry if fiction/genre existed. --K.A. + * that's the idea. In your case you could have the linkbase of genre be fiction/genre, and it would be created if it was missing. -- G.B. +* `queries` : list of template queries this type/attribute/field/whatever is exposed to + * I'm not sure what you mean here. -- K.A. + * as mentioned before, some fields may be made accessible through different template queries, in different form. This is the case already for tags, that also come up in the `categories` query (used by Atom and RSS feeds). -- G.B. + * Ah, do you mean that the input value is the same, but the output format is different? Like the difference between TMPL_VAR NAME="FOO" and TMPL_VAR NAME="raw_FOO"; one is htmlized, and the other is not. -- K.A. + * Actually this is about the same information appearing in different queries (e.g. NAME="FOO" and NAME="BAR"). Example: say that I defined a "Rubrica" field. 
I would want both tags and categories to appear in the `categories` template query, but only tags would appear in the `tags` query, and only Rubrica values would appear in `rubrica` queries. The issue of different output formats was presented in the next paragraph instead. -- G.B. + +Where this approach is limiting is on the kind of data that is passed to (template) queries. The value of the metadata fields might need some massaging (e.g. compare how tags are passed to tags queries vs categories queries, or also see what is done with the fields in the current `meta` plugin). I have problems picturing an easy way to make this possible user-side (i.e. via templates and not in Perl modules). Suggestions welcome. + +One possibility could be to have the `queries` configuration allow a hash mapping query names to functions that would transform the data. Lacking that possibility, we might have to leave some predefined fields to have custom Perl-side treatment and leave custom fields to be untransformable. + +----- + +I've now updated the [[plugins/contrib/field]] plugin to have: + +* arrays (multi-valued fields) +* the "linkbase" option as mentioned above (called field_tags), where the linktype is the field name. + +I've also updated [[plugins/contrib/ftemplate]] and [[plugins/contrib/report]] to be able to use multi-valued fields, and [[plugins/contrib/ymlfront]] to correctly return multi-valued fields when they are requested. + +--[[KathrynAndersen]] diff --git a/doc/todo/OpenSearch.mdwn b/doc/todo/OpenSearch.mdwn index e63ded688..c35da54e1 100644 --- a/doc/todo/OpenSearch.mdwn +++ b/doc/todo/OpenSearch.mdwn @@ -15,4 +15,24 @@ contain the wiki title from `ikiwiki.setup`. --[[JoshTriplett]] +> I support adding this. I think all that is needed, beyond the simple task +> of adding the link header, is to make the search plugin write out +> the xml file, probably based on a template. +> +> One problem is that the +> [specification](http://www.opensearch.org/Specifications/OpenSearch/1.1#OpenSearch_description_document) +> for the XML file contains a number of silly limits to field lengths. +> For example, it wants a "ShortName" that identifies the search engine, +> to be 16 characters or less. The Description is limited to 1024, +> the LongName to 48. This limits what existing config settings can be +> reused for those. +> +> Another semi-problem is that the specification says: +> +>> OpenSearch description documents should include at least one Query element of role="example" that is expected to return search results. Search clients may use this example query to validate that the search engine is working properly. +> +> How should ikiwiki know what example query will return actual results? +> (How would a client know if an HTML page contains results or not, anyway?) +> Silliness. Ignore this? --[[Joey]] + [[wishlist]] diff --git a/doc/todo/Option_to_make_title_an_h1__63__.mdwn b/doc/todo/Option_to_make_title_an_h1__63__.mdwn index f4023d6dd..8345cd010 100644 --- a/doc/todo/Option_to_make_title_an_h1__63__.mdwn +++ b/doc/todo/Option_to_make_title_an_h1__63__.mdwn @@ -11,4 +11,4 @@ Currently, the page title (either the name of the page or the title specified wi > latter, making `#` (only when on the first line) set the page title, removing it from > the page body.
--[[JasonBlevins]], October 22, 2008 - [h1title]: http://code.jblevins.org/ikiwiki/plugins.git/plain/h1title.pm + [h1title]: http://jblevins.org/git/ikiwiki/plugins.git/plain/h1title.pm diff --git a/doc/todo/Resolve_native_reStructuredText_links_to_ikiwiki_pages.mdwn b/doc/todo/Resolve_native_reStructuredText_links_to_ikiwiki_pages.mdwn index ca7b282fa..6e0f32fd5 100644 --- a/doc/todo/Resolve_native_reStructuredText_links_to_ikiwiki_pages.mdwn +++ b/doc/todo/Resolve_native_reStructuredText_links_to_ikiwiki_pages.mdwn @@ -322,3 +322,12 @@ The page is rST-parsed once in 'scan' and once in 'htmlize' (the first to genera >> However, I think that if the cache does not work for a big load, it should >> not work at all; small loads are small so they don't matter. --ulrik +----- + +Another possiblity is using empty url for wikilinks (gitit uses this approach), for example: + + `SomePage <>`_ + +Since it uses *empty* url, I would like to call it *proposal 0* :-) --[weakish] + +[weakish]: http://weakish.pigro.net diff --git a/doc/todo/Separate_OpenIDs_and_usernames.mdwn b/doc/todo/Separate_OpenIDs_and_usernames.mdwn index 2cd52e8c4..a4940220a 100644 --- a/doc/todo/Separate_OpenIDs_and_usernames.mdwn +++ b/doc/todo/Separate_OpenIDs_and_usernames.mdwn @@ -6,6 +6,48 @@ I see this being implemented in one of two possible ways. The easiest seems like A slightly more complex next step would be to request sreg from the provider and, if provided, automatically set the identity's username and email address from the provided persona. If username login to accounts with blank passwords is disabled, then you have the best of both worlds. Passwordless signin, human-friendly attribution, automatic setting of preferences. +> Given that openids are a global user identifier, that can look as pretty +> as the user cares to make it look via delegation, I am not a fan of +> having a site-local identifier that layered on top of that. Perhaps +> partly because every site that I have seen that does that has openid +> implemented as a badly-done wart on the side of their regular login +> system. +> +> The openid plugin now attempts to get an email and a username, and stores +> them in the session database for later use (ie, when the user edits a +> page). +> +> I am considering displaying the userid or fullname, if available, +> instead of the munged openid url in recentchanges and comments. +> It would be nice for those nasty [[google_openids|forum/google_openid_broken?]]. +> But, I first have to find a way to encode the name in the VCS commit log, +> while still keeping the openid of the committer in there too. +> Perhaps something like this (for git): --[[Joey]] +> +> Author: Joey Hess <http://joey.kitenet.net/@web> +> +> Only problem with the above is that the openid will still be displayed +> by CIA. Other option is this, which solves that, but at the expense of +> having to munge the username to fit inside the email address, +> and generally seems backwards: --[[Joey]] +> +> Author: http://joey.kitenet.net/ <Joey_Hess@web> +> +> So, what needs to be done: +> +> * Change `rcs_commit` and `rcs_commit_staged` to take a session object, +> instead of just a userid. (For back-compat, if the parameter is +> not an object, it's a userid.) Bump ikiwiki plugin interface version. +> (done) +> * Modify all RCS plugins to include the session username somewhere +> in the commit, and parse it back out in `rcs_recentchanges`. 
+> (done for git only so far) +> * Modify recentchanges plugin to display the username instead of the +> `openiduser`. +> (done) +> * Modify comment plugin to put the session username in the comment +> template instead of the `openiduser`. (done) + Unfortunately I don't speak Perl, so hopefully someone thinks these suggestions are good enough to code up. I've hacked on openid code in Ruby before, so hopefully these changes aren't all that difficult to implement. Even if you don't get any data via sreg, you're no worse off than where you are now, so I don't think there'd need to be much in the way of error/sanity-checking of returned data. If it's null or not available then no big deal, typing in a username is no sweat. -[[!tag wishlist]] +[[!tag wishlist done]] diff --git a/doc/todo/Set_templates_for_whole_sections_of_the_site.mdwn b/doc/todo/Set_templates_for_whole_sections_of_the_site.mdwn index d0c09796f..b130f4ec5 100644 --- a/doc/todo/Set_templates_for_whole_sections_of_the_site.mdwn +++ b/doc/todo/Set_templates_for_whole_sections_of_the_site.mdwn @@ -29,3 +29,5 @@ I've written a new plugin, sectiontemplate, available in the `page_tmpl` branch >>> >>> I do still think combining this with pagetemplate would be good. >>> --[[Joey]] + +>>>> This is exactly what I was looking for and it took me a while to find it. I very much support the idea to provide this as a regular plugin, be it merged with pagetemplate or stand-alone. Thank you for your work and code! --BenTo diff --git a/doc/todo/Support_XML-RPC-based_blogging.mdwn b/doc/todo/Support_XML-RPC-based_blogging.mdwn index f9685be73..6a0593b17 100644 --- a/doc/todo/Support_XML-RPC-based_blogging.mdwn +++ b/doc/todo/Support_XML-RPC-based_blogging.mdwn @@ -9,6 +9,9 @@ blog names would work. --[[JoshTriplett]] >> I'd love to see support for this and would be happy to contribute towards a bounty (say US$100) :-). [PmWiki](http://www.pmwiki.org/) has a plugin which [implements this](http://www.pmwiki.org/wiki/Cookbook/XMLRPC) in a way which seems fairly sensible as an end user. --[[AdamShand]] +>>> Bump. This would be a nice feature, and with the talent on this project I'm sure it could be done safely, too. + + [[!tag soc]] [[!tag wishlist]] diff --git a/doc/todo/Tags_list_in_page_footer_uses_basename.mdwn b/doc/todo/Tags_list_in_page_footer_uses_basename.mdwn index e2221bb84..603e82b20 100644 --- a/doc/todo/Tags_list_in_page_footer_uses_basename.mdwn +++ b/doc/todo/Tags_list_in_page_footer_uses_basename.mdwn @@ -6,3 +6,6 @@ I think the tag list should always contain the full path to the tag, with the ta > What if tagbase is not used? I know this would clutter up the display of > my tags on several wikis, including this one. --[[Joey]] + +>> Since Giuseppe's patches to fix [[bugs/tag_behavior_changes_introduced_by_typed_link_feature]], +>> the tag list has what Josh requested, but only if a tagbase is used. [[done]] --[[smcv]] diff --git a/doc/todo/__42__forward__42__ing_functionality_for_the_meta_plugin.mdwn b/doc/todo/__42__forward__42__ing_functionality_for_the_meta_plugin.mdwn index 61b19d302..b3804d652 100644 --- a/doc/todo/__42__forward__42__ing_functionality_for_the_meta_plugin.mdwn +++ b/doc/todo/__42__forward__42__ing_functionality_for_the_meta_plugin.mdwn @@ -4,7 +4,7 @@ to the [[`meta`_plugin|plugins/meta]]. > [[done]], with some changes --[[Joey]] Find the most recent version at -<http://www.schwinge.homeip.net/~thomas/tmp/meta_forward.patch>. +<http://schwinge.homeip.net/~thomas/tmp/meta_forward.patch>. 
I can't use `scrub(...)`, as that will strip out the forwarding HTML command. How to deal with that? diff --git a/doc/todo/abbreviation.mdwn b/doc/todo/abbreviation.mdwn index d24166710..f2880091c 100644 --- a/doc/todo/abbreviation.mdwn +++ b/doc/todo/abbreviation.mdwn @@ -2,4 +2,6 @@ We might want some kind of abbreviation and acronym plugin. --[[JoshTriplett]] * Not sure if this is what you mean, but I'd love a way to make words which match existing page names automatically become links (e.g. if there is a page called "MySQL" then any time the word MySQL is mentioned it should become a link to that page). -- [[AdamShand]] + * The python-markdown-extras package has support for [abbreviations](http://www.freewisdom.org/projects/python-markdown/Abbreviations), with the syntax that you just use the abbreviation in text (e.g. HTML) and then define the abbreviations at the end (like "footnote-style" links). For consistency, it might be good to use the same syntax, which apparently derives from [PHP-markdown-extra](http://michelf.com/projects/php-markdown/extra/#abbr). + [[wishlist]] diff --git a/doc/todo/adjust_commit_message_for_rename__44___remove.mdwn b/doc/todo/adjust_commit_message_for_rename__44___remove.mdwn new file mode 100644 index 000000000..3d0d1aff4 --- /dev/null +++ b/doc/todo/adjust_commit_message_for_rename__44___remove.mdwn @@ -0,0 +1,5 @@ +When you rename or remove pages using the relevant plugins, a commit message is generated automatically by the plugin. + +It would be nice to provide a text field in the remove/rename form, pre-populated with the automatic message, so that a user may customize or append to the message (modulo VCS support). + +-- [[Jon]] diff --git a/doc/todo/alias_directive.mdwn b/doc/todo/alias_directive.mdwn new file mode 100644 index 000000000..71a2efc76 --- /dev/null +++ b/doc/todo/alias_directive.mdwn @@ -0,0 +1,72 @@ +An alias directive could work like an inverse redirect, but in a more +maintainable way. Currently, a page might have several redirects leading to it, +without an easy way of enumerating them. Therefore, the following directive is +suggested for addition (possibly by means of a plugin): + +> The `alias` and `aliastext` directives implicitly create +> redirect pages to the page they are used on. If two or more pages claim a +> non-existing page to be an alias, a disambiguation page will automatically be +> generated. If an existing page is claimed as an alias, it will be prefixed +> with a note that its topic is also an alias for other pages. +> +> All aliases to a page are automatically listed below the backlink and tag +> lists at the bottom of a page by default. This can be configured globally by +> setting the `alias_list` configuration option to `false`, or set explicitly +> per alias by specifying `list=true` or `list=false`. +> +> Similar to the `taglink` directive, `aliastext` produces the alias name as +> well as registering it.
+ +> +> ## Usage example +> +> `Greece.mdwn`: +> +> > Greece, also known as \[[!aliastext Hellas]] and officially the +> > \[[!aliastext "Hellenic Republic"]], is a … +> > +> > <!-- there are so many people who misspell this, let's create a redirect --> +> > \[[!alias Grece list=false]] +> +> This page by itself will redirect from the "Hellas", "Hellenic Republic" and +> "Grece" pages as if they all contained just: +> +> > \[[!meta redir="Greece"]] +> +> If, on the other hand, `Hellas Planitia` also claims `[[!alias Hellas]]`, the +> Hellas page will look like this: +> +> > **Hellas** is an alias for the following pages: +> > +> > * \[[Greece]] +> > * \[[Hellas Planitia]] + +The proposed plugin/directive could be extended, e.g. by also including +old-style redirects in the alias list, but that might introduce unwanted +coupling with the meta directive. + +----------------- + +On second thought, implementing this might have similarities with +[[todo/auto-create tag pages according to a template]] -- the auto-created +pages would, if the way of the alias directive is followed, not create physical +files, but would instead be created just when someone edits them. + +If multiple plugins do such a trick, they would have to fight over who comes +first. If, for example, we have a setup where not-yet-created tag pages are +automatically generated as "\[[!inline pages="link(<TMPL_VAR TAG>)" +archive="yes"]]" and aliases are enabled, and a non-tag page grabs a tag as an +alias (so as to redirect all taglinks of the tag to itself), there are two +possibilities: + +* The autotag plugin comes first: + * autotag sees the missing tag and creates its "\[[!inline" stuff + * alias sees that there is already content and adds its prefix +* The alias plugin comes first (this is the preferred way): + * alias sees the empty page, sees it is not contested by other alias + directives and creates its "\[[!meta" redirect + * autotag sees there is already content and doesn't do anything + +That issue could be handled with a "priority number" on the hook, with plugins +with a lower number being called first. + +[[!tag wishlist]] diff --git a/doc/todo/allow_displaying_number_of_comments.mdwn b/doc/todo/allow_displaying_number_of_comments.mdwn new file mode 100644 index 000000000..02d55fc9b --- /dev/null +++ b/doc/todo/allow_displaying_number_of_comments.mdwn @@ -0,0 +1,30 @@ +My `numcomments` Git branch adds a `NUMCOMMENTS` `TMPL_VAR`, which is +useful to add to the `forumpage.tmpl` template to emulate (the nice +bits of) a more usual webforum. + +Please review... and pull :) + +-- [[intrigeri]] + +> How is having this variable for showing a count of the comments +> better (or more forum-ish) than the COMMENTSLINK variable which +> includes a count and a link to the comments, and is already displayed +> in inlinepage.tmpl? +> +> `num_comments` will never return undef. +> +> I see no need to add a second pagetemplate hook. +> The existing one can be added to. Probably inside its `if ($shown)` +> block. +> +> It may also be a good idea to either combine the calls to `num_comments` +> used for this and for the commentslink, +> or to memoize it. I'm thinking generally memoizing it may be a good idea +> since the comments for a page will typically be counted twice when it's +> inlined. +> --[[Joey]] + +[[patch]] + +>> Well, the COMMENTSLINK variable fits my needs. Sorry for +>> the disturbance.
[[done]] --[[intrigeri]] diff --git a/doc/todo/allow_plugins_to_add_sorting_methods.mdwn b/doc/todo/allow_plugins_to_add_sorting_methods.mdwn new file mode 100644 index 000000000..b523cd19f --- /dev/null +++ b/doc/todo/allow_plugins_to_add_sorting_methods.mdwn @@ -0,0 +1,304 @@ +[[!tag patch]] + +The available [[ikiwiki/pagespec/sorting]] methods are currently hard-coded in +IkiWiki.pm, making it difficult to add any extra sorting mechanisms. I've +prepared a branch which adds 'sort' as a hook type and uses it to implement a +new `meta_title` sort type. + +Someone could use this hook to make `\[[!inline sort=title]]` prefer the meta +title over the page name, but for compatibility, I'm not going to (I do wonder +whether it would be worth making sort=name an alias for the current sort=title, +and changing the meaning of sort=title in 4.0, though). + +> What compatability concerns, exactly, are there that prevent making that +> change now? --[[Joey]] + +*[sort-hooks branch now withdrawn in favour of sort-package --s]* + +I briefly tried to turn *all* the current sort types into hook functions, and +have some of them pre-registered, but decided that probably wasn't a good idea. +That earlier version of the branch is also available for comparison: + +*[also withdrawn in favour of sort-package --s]* + +>> I wonder if IkiWiki would benefit from the concept of a "sortspec", like a [[ikiwiki/PageSpec]] but dedicated to sorting lists of pages rather than defining lists of pages? Rather than defining a sort-hook, define a SortSpec class, and enable people to add their own sort methods as functions defined inside that class, similarly to the way they can add their own pagespec definitions. --[[KathrynAndersen]] + +>>> [[!template id=gitbranch branch=smcv/ready/sort-package author="[[Simon_McVittie|smcv]]"]] +>>> I'd be inclined to think that's overkill, but it wasn't very hard to +>>> implement, and in a way is more elegant. I set it up so sort mechanisms +>>> share the `IkiWiki::PageSpec` package, but with a `cmp_` prefix. Gitweb: +>>> <http://git.pseudorandom.co.uk/smcv/ikiwiki.git?a=shortlog;h=refs/heads/sort-package> + +>>>> I agree it seems more elegant, so I have focused on it. +>>>> +>>>> I don't know about reusing `IkiWiki::PageSpec` for this. +>>>> --[[Joey]] + +>>>>> Fair enough, `IkiWiki::SortSpec::cmp_foo` would be just +>>>>> as easy, or `IkiWiki::Sorting::cmp_foo` if you don't like +>>>>> introducing "sort spec" in the API. I took a cue from +>>>>> [[ikiwiki/pagespec/sorting]] being a subpage of +>>>>> [[ikiwiki/pagespec]], and decided that yes, sorting is +>>>>> a bit like a pagespec :-) Which name would you prefer? --s + +>>>>>> `SortSpec` --[[Joey]] + +>>>>>>> [[Done]]. --s + +>>>> I would be inclined to drop the `check_` stuff. --[[Joey]] + +>>>>> It basically exists to support `title_natural`, to avoid +>>>>> firing up the whole import mechanism on every `cmp` +>>>>> (although I suppose that could just be a call to a +>>>>> memoized helper function). It also lets sort specs that +>>>>> *must* have a parameter, like +>>>>> [[field|plugins/contrib/field/discussion]], fail early +>>>>> (again, not so valuable). +>>>>> +>>>>>> AFAIK, `use foo` has very low overhead when the module is already +>>>>>> loaded. There could be some evalation overhead in `eval q{use foo}`, +>>>>>> if so it would be worth addressing across the whole codebase. +>>>>>> --[[Joey]] +>>>>>> +>>>>>>> check_cmp_foo now dropped. 
--s +>>>>> +>>>>> The former function could be achieved at a small +>>>>> compatibility cost by putting `title_natural` in a new +>>>>> `sortnatural` plugin (that fails to load if you don't +>>>>> have `title_natural`), if you'd prefer - that's what would +>>>>> have happened if `title_natural` was written after this +>>>>> code had been merged, I suspect. Would you prefer this? --s + +>>>>>> Yes! (Assuming it does not make sense to support +>>>>>> natural order sort of other keys than the title, at least..) +>>>>>> --[[Joey]] + +>>>>>>> Done. I added some NEWS.Debian for it, too. --s + +>>>> Wouldn't it make sense to have `meta(title)` instead +>>>> of `meta_title`? --[[Joey]] + +>>>>> Yes, you're right. I added parameters to support `field`, +>>>>> and didn't think about making `meta` use them too. +>>>>> However, `title` does need a special case to make it +>>>>> default to the basename instead of the empty string. +>>>>> +>>>>> Another special case for `title` is to use `titlesort` +>>>>> first (the name `titlesort` is derived from Ogg/FLAC +>>>>> tags, which can have `titlesort` and `artistsort`). +>>>>> I could easily extend that to other metas, though; +>>>>> in fact, for e.g. book lists it would be nice for +>>>>> `field(bookauthor)` to behave similarly, so you can +>>>>> display "Douglas Adams" but sort by "Adams, Douglas". +>>>>> +>>>>> `meta_title` is also meant to be a prototype of how +>>>>> `sort=title` could behave in 4.0 or something - sorting +>>>>> by page name (which usually sorts in approximately the +>>>>> same place as the meta-title, but occasionally not), while +>>>>> displaying meta-titles, does look quite odd. --s + +>>>>>> Agreed. --[[Joey]] + +>>>>>>> I've implemented meta(title). meta(author) also has the +>>>>>>> `sortas` special case; meta(updated) and meta(date) +>>>>>>> should also work how you'd expect them to (but they're +>>>>>>> earliest-first, unlike age). --s + +>>>> As I read the regexp in `cmpspec_translate`, the "command" +>>>> is required to have params. They should be optional, +>>>> to match the documentation and because most sort methods +>>>> do not need parameters. --[[Joey]] + +>>>>> No, `$2` is either `\w+\([^\)]*\)` or `[^\s]+` (with the +>>>>> latter causing an error later if it doesn't also match `\w+`). +>>>>> This branch doesn't add any parameterized sort methods, +>>>>> in fact, although I did provide one on +>>>>> [[field's_discussion_page|plugins/contrib/report/discussion]]. --s + +>>>> I wonder if it would make sense to add some combining keywords, so +>>>> a sortspec reads like `sort="age then ascending title"` +>>>> In a way, this reduces the amount of syntax that needs to be learned. +>>>> I like the "then" (and it could allow other operations than +>>>> simple combination, if any others make sense). Not so sure about the +>>>> "ascending", which could be "reverse" instead, but "descending age" and +>>>> "ascending age" both seem useful to be able to explicitly specify. +>>>> --[[Joey]] + +>>>>> Perhaps. I do like the simplicity of [[KathrynAndersen]]'s syntax +>>>>> from [[plugins/contrib/report]] (which I copied verbatim, except for +>>>>> turning sort-by-`field` into a parameterized spec). +>>>>> +>>>>> If we're getting into English-like (or at least SQL-like) queries, +>>>>> it might make sense to change the signature of the hook function +>>>>> so it's a function to return a key, e.g. +>>>>> `sub key_age { return -%pagemtime{$_[0]) }`. 
Then we could sort like +>>>>> this: +>>>>> +>>>>> field(artistsort) or field(artist) or constant(Various Artists) then meta(titlesort) or meta(title) or title +>>>>> +>>>>> with "or" binding more closely than "then". Does this seem valuable? +>>>>> I think the implementation would be somewhat more difficult. and +>>>>> it's probably getting too complicated to be worthwhile, though? +>>>>> (The keys that actually benefit from this could just +>>>>> have smarter cmp functions, I think.) +>>>>> +>>>>> If the hooks return keys rather than cmp results, then we could even +>>>>> have "lowercase" as an adjective used like "ascending"... maybe. +>>>>> However, there are two types of adjective here: "lowercase" +>>>>> really applies to the keys, whereas "ascending" applies to the "cmp" +>>>>> result. Again, I think this is getting too complex, and could just +>>>>> be solved with smarter cmp functions. +>>>>> +>>>>>> I agree. (Also, I think returning keys may make it harder to write +>>>>>> smarter cmp functions.) --[[Joey]] +>>>>> +>>>>> Unfortunately, `sort="ascending mtime"` actually sorts by *descending* +>>>>> timestamp (but`sort=age` is fine, because `age` could be defined as +>>>>> now minus `ctime`). `sort=freshness` isn't right either, because +>>>>> "sort by freshness" seems as though it ought to mean freshest first, +>>>>> but "sort by ascending freshness" means put the least fresh first. If +>>>>> we have ascending and descending keywords which are optional, I don't +>>>>> think we really want different sort types to have different default +>>>>> directions - it seems clearer to have `ascending` always be a no-op, +>>>>> and `descending` always negate. +>>>>> +>>>>>> I think you've convinced me that ascending/descending impose too +>>>>>> much semantics on it, so "-" is better. --[[Joey]] + +>>>>>>> I've kept the semantics from `report` as-is, then: +>>>>>>> e.g. `sort="age -title"`. --s + +>>>>> Perhaps we could borrow from `meta updated` and use `update_age`? +>>>>> `updateage` would perhaps be a more normal IkiWiki style - but that +>>>>> makes me think that updateage is a quantity analagous to tonnage or +>>>>> voltage, with more or less recently updated pages being said to have +>>>>> more or less updateage. I don't know whether that's good or bad :-) +>>>>> +>>>>> I'm sure there's a much better word, but I can't see it. Do you have +>>>>> a better idea? --s + +[Regarding the `meta title=foo sort=bar` special case] + +> I feel it sould be clearer to call that "sortas", since "sort=" is used +> to specify a sort method in other directives. --[[Joey]] +>> Done. --[[smcv]] + +## speed + +I notice the implementation does not use the magic `$a` and `$b` globals. +That nasty perl optimisation is still worthwhile: + + perl -e 'use warnings; use strict; use Benchmark; sub a { $a <=> $b } sub b ($$) { $_[0] <=> $_[1] }; my @list=reverse(1..9999); timethese(10000, {a => sub {my @f=sort a @list}, b => sub {my @f=sort b @list}, c => => sub {my @f=sort { b($a,$b) } @list}})' + Benchmark: timing 10000 iterations of a, b, c... + a: 80 wallclock secs (76.74 usr + 0.05 sys = 76.79 CPU) @ 130.23/s (n=10000) + b: 112 wallclock secs (106.14 usr + 0.20 sys = 106.34 CPU) @ 94.04/s (n=10000) + c: 330 wallclock secs (320.25 usr + 0.17 sys = 320.42 CPU) @ 31.21/s (n=10000) + +Unfortunatly, I think that c is closest to the new implementation. +--[[Joey]] + +> Unfortunately, `$a` isn't always `$main::a` - it's `$Package::a` where +> `Package` is the call site of the sort call. 
This was a showstopper when +> `sort` was a hook implemented in many packages, but now that it's a +> `SortSpec`, I may be able to fix this by putting a `sort` wrapper in the +> `SortSpec` namespace, so it's like this: +> +> sub sort ($@) +> { +> my $cmp = shift; +> return sort $cmp @_; +> } +> +> which would mean that the comparison used `$IkiWiki::SortSpec::a`. +> --s + +>> I've now done this. On a wiki with many [[plugins/contrib/album]]s +>> (a full rebuild takes half an hour!), I tested a refresh after +>> `touch tags/*.mdwn` (my tag pages contain inlines of the form +>> `tagged(foo)` sorted by date, so they exercise sorting). +>> I also tried removing sorting from `pagespec_match_list` +>> altogether, as an upper bound for how fast we can possibly make it. +>> +>> * `master` at branch point: 63.72user 0.29system +>> * `master` at branch point: 63.91user 0.37system +>> * my branch, with `@_`: 65.28user 0.29system +>> * my branch, with `@_`: 65.21user 0.28system +>> * my branch, with `$a`: 64.09user 0.28system +>> * my branch, with `$a`: 63.83user 0.36system +>> * not sorted at all: 58.99user 0.29system +>> * not sorted at all: 58.92user 0.29system +>> +>> --s + +> I do notice that `pagespec_match_list` performs the sort before the +> filter by pagespec. Is this a deliberate design choice, or +> coincidence? I can see that when `limit` is used, this could be +> used to only run the pagespec match function until `limit` pages +> have been selected, but the cost is that every page in the wiki +> is sorted. Or, it might be useful to do the filtering first, then +> sort the sub-list thus produced, then finally apply the limit? --s + +>> Yes, it was deliberate, pagespec matching can be expensive enough that +>> needing to sort a lot of pages seems likely to be less work. (I don't +>> remember what benchmarking was done though.) --[[Joey]] + +>>> We discussed this on IRC and Joey pointed out that this also affects +>>> dependency calculation, so I'm not going to get into this now... --s + +Joey pointed out on IRC that the `titlesort` feature duplicates all the +meta titles. I did that in order to sort by the unescaped version, but +I've now changed the branch to only store that if it makes a difference. +--s + +## Documentation from sort-package branch + +### advanced sort orders (conditionally added to [[ikiwiki/pagespec/sorting]]) + +* `title_natural` - Orders by title, but numbers in the title are treated + as such, ("1 2 9 10 20" instead of "1 10 2 20 9") +* `meta(title)` - Order according to the `\[[!meta title="foo" sortas="bar"]]` + or `\[[!meta title="foo"]]` [[ikiwiki/directive]], or the page name if no + full title was set. `meta(author)`, `meta(date)`, `meta(updated)`, etc. + also work. + +### Multiple sort orders (added to [[ikiwiki/pagespec/sorting]]) + +In addition, you can combine several sort orders and/or reverse the order of +sorting, with a string like `age -title` (which would sort by age, then by +title in reverse order if two pages have the same age). 
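For instance, an archive page could combine sort orders in an inline directive like this (the pagespec is only illustrative):

    \[[!inline pages="blog/*" sort="age -title" show=10]]

And, anticipating the "Sorting plugins" section below, a minimal sketch of a custom sort method — `cmp_pathdepth` is a made-up example, not something this branch provides:

    package IkiWiki::SortSpec;

    # order pages by how deeply they are nested in the wiki
    sub cmp_pathdepth {
        return ($a =~ tr{/}{}) <=> ($b =~ tr{/}{});
    }

which could then be requested with `sort="pathdepth title"`.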
+ +### meta sortas parameter (added to [[ikiwiki/directive/meta]]) + +[in title] + +An optional `sort` parameter will be used preferentially when +[[ikiwiki/pagespec/sorting]] by `meta(title)`: + + \[[!meta title="The Beatles" sort="Beatles, The"]] + + \[[!meta title="David Bowie" sort="Bowie, David"]] + +[in author] + + An optional `sortas` parameter will be used preferentially when + [[ikiwiki/pagespec/sorting]] by `meta(author)`: + + \[[!meta author="Joey Hess" sortas="Hess, Joey"]] + +### Sorting plugins (added to [[plugins/write]]) + +Similarly, it's possible to write plugins that add new functions as +[[ikiwiki/pagespec/sorting]] methods. To achieve this, add a function to +the IkiWiki::SortSpec package named `cmp_foo`, which will be used when sorting +by `foo` or `foo(...)` is requested. + +The names of pages to be compared are in the global variables `$a` and `$b` +in the IkiWiki::SortSpec package. The function should return the same thing +as Perl's `cmp` and `<=>` operators: negative if `$a` is less than `$b`, +positive if `$a` is greater, or zero if they are considered equal. It may +also raise an error using `error`, for instance if it needs a parameter but +one isn't provided. + +The function will also be passed one or more parameters. The first is +`undef` if invoked as `foo`, or the parameter `"bar"` if invoked as `foo(bar)`; +it may also be passed additional, named parameters. diff --git a/doc/todo/allow_site-wide_meta_definitions.mdwn b/doc/todo/allow_site-wide_meta_definitions.mdwn index 70ccc2b68..82670250e 100644 --- a/doc/todo/allow_site-wide_meta_definitions.mdwn +++ b/doc/todo/allow_site-wide_meta_definitions.mdwn @@ -5,8 +5,159 @@ I'd like to define [[plugins/meta]] values to apply across all pages site-wide unless the pages define their own: default values for meta definitions essentially. -Here's a patch to achieve this (also in the "defaultmeta" branch of -my github ikiwiki fork): + <snip old patch, see below for latest> + +-- [[Jon]] + +> This doesn't support multiple-argument meta directives like +> `link=x rel=y`, or meta directives with special side-effects like +> `updated`. +> +> The first could be solved (if you care) by a syntax like this: +> +> meta_defaults => [ +> { copyright => "© me" }, +> { link => "about:blank", rel => "silly", }, +> ] +> +> The second could perhaps be solved by invoking `meta::preprocess` from within +> `scan` (which might be a simplification anyway), although this is complicated +> by the fact that some (but not all!) meta headers are idempotent. +> +> --[[smcv]] + +>> Thanks for your comment. I've revised the patch to use the config syntax +>> you suggest. I need to perform some more testing to make sure I've +>> addressed the issues you highlight. +>> +>> I had to patch part of IkiWiki core, the merge routine in Setup, because +>> the use of `possibly_foolish_untaint` was causing the hashrefs at the deep +>> end of the data structure to be converted into strings. The specific change +>> I've made may not be acceptable, though -- I'd appreciate someone providing +>> some feedback on that hunk! + +>>> Well, re that hunk, taint checking is currently disabled, but +>>> if the perl bug that disallows it is fixed and it is turned back on, +>>> the hash values will remain tainted, which will probably lead to +>>> problems. +>>> +>>> I'm also leery of using such a complex data structure in config. +>>> The websetup plugin would be hard pressed to provide a UI for such a +>>> data structure. 
(It lacks even UI for a single hash ref yet, let alone +>>> a list.) +>>> +>>> Also, it seems sorta wrong to have two so very different syntaxes to +>>> represent the same meta data. A user without a lot of experience will +>>> be hard pressed to map from a directive to this in the setup file. +>>> +>>> All of which leads me to think the setup file could just contain +>>> a text that could hold meta directives. Which generalizes really to +>>> a text that contains any directives, and is, perhaps appended to the +>>> top of every page. Which nearly generalizes to the sidebar plugin, +>>> or perhaps something more general than that... +>>> +>>> However, excessive generalization is the root of all evil, so +>>> I'm not necessarily saying that's a good idea. Indeed, my memory +>>> concerns below invalidate this idea pretty well. --[[Joey]] + + diff --git a/IkiWiki/Plugin/meta.pm b/IkiWiki/Plugin/meta.pm + index 6fe9cda..2f8c098 100644 + --- a/IkiWiki/Plugin/meta.pm + +++ b/IkiWiki/Plugin/meta.pm + @@ -13,6 +13,8 @@ sub import { + hook(type => "needsbuild", id => "meta", call => \&needsbuild); + hook(type => "preprocess", id => "meta", call => \&preprocess, scan => 1); + hook(type => "pagetemplate", id => "meta", call => \&pagetemplate); + + hook(type => "scan", id => "meta", call => \&scan) + + if $config{"meta_defaults"}; + } + + sub getsetup () { + @@ -305,6 +307,15 @@ sub match { + } + } + + +sub scan() { + + my %params = @_; + + my $page = $params{page}; + + foreach my $default (@{$config{"meta_defaults"}}) { + + preprocess(%$default, page => $page, + + destpage => $page, preview => 0); + + } + +} + + + package IkiWiki::PageSpec; + + sub match_title ($$;@) { + diff --git a/IkiWiki/Setup.pm b/IkiWiki/Setup.pm + index 8a25ecc..e4d50c9 100644 + --- a/IkiWiki/Setup.pm + +++ b/IkiWiki/Setup.pm + @@ -51,7 +51,13 @@ sub merge ($) { + $config{$c}=$setup{$c}; + } + else { + - $config{$c}=[map { IkiWiki::possibly_foolish_untaint($_) } @{$setup{$c}}] + + $config{$c}=[map { + + if(ref $_ eq 'HASH') { + + $_ + + } else { + + IkiWiki::possibly_foolish_untaint($_) + + } + + } @{$setup{$c}}]; + } + } + elsif (ref $setup{$c} eq 'HASH') { + diff --git a/doc/ikiwiki/directive/meta.mdwn b/doc/ikiwiki/directive/meta.mdwn + index 000f461..8d34ee4 100644 + --- a/doc/ikiwiki/directive/meta.mdwn + +++ b/doc/ikiwiki/directive/meta.mdwn + @@ -12,6 +12,16 @@ also specifies some additional sub-parameters. + The field values are treated as HTML entity-escaped text, so you can include + a quote in the text by writing `"` and so on. + + +You can also define site-wide defaults for meta values by including them + +in your setup file. The key used is `meta_defaults` and the value is a list + +of hashes, one per meta directive. e.g.: + + + + meta_defaults = [ + + { copyright => "Copyright 2007 by Joey Hess" }, + + { license => "GPL v2+" }, + + { link => "somepage", rel => "site entrypoint", }, + + ], + + + Supported fields: + + * title + +-- [[Jon]] + +>> Ok, I've had a bit of a think about this. There are currently 15 supported +>> meta fields. Of these: title, licence, copyright, author, authorurl, +>> and robots might make sense to define globally and override on a per-page +>> basis. +>> +>> Less so, description (due to its impact on map); openid (why would +>> someone want more than one URI to act as an openid endpoint to the same +>> place?); updated. I can almost see why someone might want to set a global +>> updated value. Almost. 
+>> +>> Not useful are permalink, date, stylesheet (you already have a global +>> stylesheet), link, redir, and guid. +>> +>> In other words, the limitations of my first patch that [[smcv]] outlined +>> are only relevant to defined fields that you wouldn't want to specify a +>> global default for anyway. +>> +>>> I generally agree with this. It is *possible* that meta would have a new +>>> field added, that takes parameters and make sense to use globally. +>>> --[[Joey]] +>> +>> Due to this, and the added complexity of the second patch (having to adjust +>> `IkiWiki/Setup.pm`), I think the first patch makes more sense. I've thus +>> reverted to it here. +>> +>> Is this merge-worthy? diff --git a/IkiWiki/Plugin/meta.pm b/IkiWiki/Plugin/meta.pm index b229592..3132257 100644 @@ -56,19 +207,40 @@ my github ikiwiki fork): -- [[Jon]] -> This doesn't support multiple-argument meta directives like -> `link=x rel=y`, or meta directives with special side-effects like -> `updated`. -> -> The first could be solved (if you care) by a syntax like this: -> -> meta_defaults => [ -> { copyright => "© me" }, -> { link => "about:blank", rel => "silly", }, -> ] -> -> The second could perhaps be solved by invoking `meta::preprocess` from within -> `scan` (which might be a simplification anyway), although this is complicated -> by the fact that some (but not all!) meta headers are idempotent. -> -> --[[smcv]] +>>> Merry Christmas/festive season/happy new year folks. I've been away from +>>> ikiwiki for the break, and now I've returned to watching recentchanges. +>>> Hopefully I'll be back in the mix soon, too. In the meantime, Joey, have +>>> you had a chance to look at this yet? -- [[Jon]] + +>>>> Ping :) Hi. [[Joey]], would you consider this patch for the next +>>>> ikiwiki release? -- [[Jon]] + +>>> For this to work with websetup and --dumpsetup, it needs to define the +>>> `meta_*` settings in the getsetup function. +>>>> +>>>> I think this will be problematic with the current implementation of this +>>>> patch. The datatype here is an array of hash references, with each hash +>>>> having a variable (and arbitrary) number of key/value pairs. I can't +>>>> think of an intuitive way of implementing a way of editing such a +>>>> datatype in the web interface, let alone registering the option in +>>>> getsetup. +>>>> +>>>> Perhaps a limited set of defined meta values could be exposed via +>>>> websetup (the obvious ones: author, copyright, license, etc.) -- [[Jon]] +>>> +>>> I also have some concerns about both these patches, since both throw +>>> a lot of redundant data at meta, which then stores it in a very redundant +>>> way. Specifically, meta populates a per-page `%metaheaders` hash +>>> as well as storing per-page metadata in `%pagestate`. So, if you have +>>> a wiki with 10 thousand pages, and you add a 1k site-wide license text, +>>> that will bloat the memory usage of ikiwiki by in excess of 2 +>>> megabytes. It will also cause ikiwiki to write a similar amount more data +>>> to its state file which has to be loaded back in each +>>> run. +>>> +>>> Seems that this could be managed much more efficiently by having +>>> meta special-case the site-wide settings, not store them in these +>>> per-page data structures, and just make them be used if no per-page +>>> metadata of the given type is present. --[[Joey]] +>>>> +>>>> that should be easy enough to do. I will work on a patch. 
-- [[Jon]] diff --git a/doc/todo/anon_push_of_comments.mdwn b/doc/todo/anon_push_of_comments.mdwn new file mode 100644 index 000000000..b472ea13f --- /dev/null +++ b/doc/todo/anon_push_of_comments.mdwn @@ -0,0 +1,14 @@ +It should be possible to use anonymous git push to post comments +(created, say, by a ikiwiki-comment program). Currently, that is not +allowed, because users cannot edit, or create internal page files. +But, comments in allowed locations are an exception to that rule, and +that exception should be communicated somehow to `IkiWiki::Receive`. +--[[Joey]] + +> Complications include: +> +> * Hard to see a way to prevent users from committing a comment that +> claims to be written by someone else. +> * `checkcontent` hooks need to be run, but can't accept a comment +> for later moderation, since it's coming in as part of a commit. +> Best they could do is reject the commit. diff --git a/doc/todo/auto-create_tag_pages_according_to_a_template.mdwn b/doc/todo/auto-create_tag_pages_according_to_a_template.mdwn index f1d33114f..7b65eba2e 100644 --- a/doc/todo/auto-create_tag_pages_according_to_a_template.mdwn +++ b/doc/todo/auto-create_tag_pages_according_to_a_template.mdwn @@ -4,7 +4,7 @@ Tags are mainly specific to the object to which they’re stuck. However, I ofte Also see: <http://madduck.net/blog/2008.01.06:new-blog/> and <http://users.itk.ppke.hu/~cstamas/code/ikiwiki/autocreatetagpage/> -[[!tag wishlist plugins/tag patch]] +[[!tag wishlist plugins/tag patch patch/core]] I would love to see this as well. -- dato @@ -15,88 +15,9 @@ A new setting is used to enable or disable auto-create tag pages, `tag_autocreat The new tag file is created during the preprocess phase. The new tag file is then complied during the change phase. -_tag.pm from version 3.01_ - - - --- tag.pm 2009-02-06 10:26:03.000000000 -0700 - +++ tag_new.pm 2009-02-06 12:17:19.000000000 -0700 - @@ -14,6 +14,7 @@ - hook(type => "preprocess", id => "tag", call => \&preprocess_tag, scan => 1); - hook(type => "preprocess", id => "taglink", call => \&preprocess_taglink, scan => 1); - hook(type => "pagetemplate", id => "tag", call => \&pagetemplate); - + hook(type => "change", id => "tag", call => \&change); - } - - sub getopt () { - @@ -36,6 +37,36 @@ - safe => 1, - rebuild => 1, - }, - + tag_autocreate => { - + type => "boolean", - + example => 0, - + description => "Auto-create the new tag pages, uses autotagpage.tmpl ", - + safe => 1, - + rebulid => 1, - + }, - +} - + - +my $autocreated_page = 0; - + - +sub gen_tag_page($) { - + my $tag=shift; - + - + my $tag_file=$tag.'.'.$config{default_pageext}; - + return if (-f $config{srcdir}.$tag_file); - + - + my $template=template("autotagpage.tmpl"); - + $template->param(tag => $tag); - + writefile($tag_file, $config{srcdir}, $template->output); - + $autocreated_page = 1; - + - + if ($config{rcs}) { - + IkiWiki::disable_commit_hook(); - + IkiWiki::rcs_add($tag_file); - + IkiWiki::rcs_commit_staged( - + gettext("Automatic tag page generation"), - + undef, undef); - + IkiWiki::enable_commit_hook(); - + } - } - - sub tagpage ($) { - @@ -47,6 +78,10 @@ - $tag=~y#/#/#s; # squash dups - } - - + if (defined $config{tag_autocreate} && $config{tag_autocreate} ) { - + gen_tag_page($tag); - + } - + - return $tag; - } - - @@ -125,4 +160,18 @@ - } - } - - +sub change(@) { - + return unless($autocreated_page); - + $autocreated_page = 0; - + - + # This refresh/saveindex is to complie the autocreated tag pages - + IkiWiki::refresh(); - + IkiWiki::saveindex(); - + - + # This 
refresh/saveindex is to fix the Tags link - + # With out this additional refresh/saveindex the tag link displays ?tag - + IkiWiki::refresh(); - + IkiWiki::saveindex(); - +} - + +*see git history of this page if you want the patch --[[smcv]]* - -This uses a [[template|wikitemplates]] called `autotagpage.tmpl`, here is my template file: +This uses a [[template|templates]] called `autotagpage.tmpl`, here is my template file: \[[!inline pages="link(<TMPL_VAR TAG>)" archive="yes"]] @@ -123,3 +44,227 @@ On the second extra pass, it doesn't notice that it has to update the "?"-link. } is not satisfied for the newly created tag page. I shall put debug msgs into Render.pm to find out better how it works. --Ivan Z. + +--- + +I've made another attempt at fixing this + +The current progress can be found at my [git repository][gitweb] on branch +`autotag`: + + git://git.liegesta.at/git/ikiwiki + +[gitweb]: http://git.liegesta.at/?p=ikiwiki.git;a=shortlog;h=refs/heads/autotag (gitweb for branch autotag) + +It's not entirely finished yet, but already quite usable. Testing and comments +on code quality, implementation details, as well as other patches would be +appreciated. + +Here's what it does right now: + +* enabled by setting `tag_autocreate=1` in the configuration. +* Tag pages will be created in `tagbase` from the template `autotag.tmpl`. +* Will correctly render all links, and dependencies. Well, AFAIK. +* When a tag page is deleted it will automatically recreated from template. (I +consider this a feature, not a bug) +* Requires a rebuild on first use. +* Adds a function `add_autofile()` to the plugin API, to do all this. + +Todo/Bugs: + +* Will still create a page even if there's a page other than `$tag` under +`tagbase` satisfying the tag link. (details? --[[Joey]]) +* Call from `IkiWiki.pm` to `Render.pm`, which adds a module dependency in the +wrong direction. (fixed --[[Joey]] ) +* Add files to RCS. +* Unit tests. +* Proper documentation. (fixed (mostly) --[[Joey]]) + +--[[David_Riebenbauer]] + +> Starting review of this. Some of your commits are to very delicate, +> optimised, and security-sensitive ground, so I have to look at them very +> carefully. --[[Joey]] + +>> First of, sorry that it took me so damn long to answer. I didn't lose +>> interest but it took a while for me to find the time and motivation +>> to address you suggestions. --[[David_Riebenbauer]] + +> * In the refactoring in [f3abeac919c4736429bd3362af6edf51ede8e7fe][], +> you introduced at least 2 bugs, one a possible security hole. +> Now one part of the code tests `if ($file)` and the other +> caller tests `if ($f)`. These two tests both tested `if (! defined $f)` +> before. Notice that the variable needs to be the untainted variable +> for both. Also notice that `if ($f)` fails if `$f` contains `0`, +> which is a very common perl gotcha. +> * Your refactored code changes `-l $_ || -d _` to `-l $file || -d $file`. +> The latter makes one more stat system call; note the use of a +> bare `_` in the first to make perl reuse the stat buffer. +> * (As a matter of style, could you put a space after the commas in your +> perl?) + +>> The first two points should be addressed in +>> [da5d29f95f6e693e8c14be1b896cf25cf4fdb3c0][]. And sure, I can add the +>> spaces. --[[David_Riebenbauer]] + +> I'd like to cherry-pick the above commit, once it's in shape, before +> looking at the rest in detail. So just a few other things that stood out. +> +> * Commit [4af4d26582f0c2b915d7102fb4a604b176385748][] seems unnecessary. 
+> `srcfile($file, 1)` already is documented to return undef if the +> file does not exist. (But without the second parameter, it throws +> an error.) + +>> You're right. I must have been some confused by some other promplem I +>> introduced then. Reverted. --[[David_Riebenbauer]] + +> * Commit [f58f3e1bec41ccf9316f37b014ce0b373c8e49e1][] adds a line +> that is intented by a space, not a tab. + +>> Sorry, That one was reverted anyway. --[[David_Riebenbauer]] + +> * Commit [f58f3e1bec41ccf9316f37b014ce0b373c8e49e1][] says that auto-added +> files will be recreated if the user deletes them. That seems bad. +> `autoindex` goes to some trouble to not recreate deleted files. + +>> I reverted the commit and addressed the issue in +>> [a358d74bef51dae31332ff27e897fe04834571e6][] and +>> [981400177d68a279f485727be3f013e68f0bf691][]. + --[[David_Riebenbauer]] + +>>> This doesn't seem to have all the refinements that autoindex has: +>>> +>>> * `autoindex` attaches the record of deletions to the `index` page, which +>>> is (nearly) guaranteed to exist; this one attaches the record of +>>> deletions to the deleted page's page state. Won't that tend to result +>>> in losing the record along with the deleted page? + +>>>> This is probably on of the harder things to do, 'cause there are (most of the +>>>> time) several pages that are responsible for the creation of a single tag page. +>>>> Of course I could attach the info to all of them. + +>>>> With current behaviour I think the information in `%pagestate` is kept around +>>>> regardless whether the corresponding page exists or not. +>>>> --[[David_Riebenbauer]] + +>>>>> Sorry, I'll try to be clearer: `autoindex` hard-codes that the index page +>>>>> of the entire wiki is the one responsible for storing the page state. That +>>>>> page isn't responsible for the creation of the tag page, it's just an +>>>>> arbitrary page that's (more or less) guaranteed to exist. --[[smcv]] + +>>>>> I don't like that [[plugins/autoindex]] has to do that, +>>>>> but `%pagestate` values are only stored for pages that exist, +>>>>> so it was necessary. (Another way to look at this is that +>>>>> `%pagestate` is not the ideal data structure.) --[[Joey]] + +>>>>>> Aha! Having looked at [[plugins/write]] again, it turns out that what this +>>>>>> feature should really use is `%wikistate`, I think? :-) --[[smcv]] + +>>>>>>> Ah, indeed, that came after I wrote autoindex. I've fixed autoindex to +>>>>>>> use it. --[[Joey]] + +>>>>> Ok, now I know what you mean. --[[David_Riebenbauer]] + +>>> * `autoindex` forgets that a page was deleted when that page is +>>> re-created + +>>>> Yes, I forgot about that and that is a bug. I'll fix that. +>>>> --[[David_Riebenbauer]] + +>>>>> In my branch, it keeps a list of autofiles that were created, +>>>>> not deleted. And I think that turns out to be necessary, really. +>>>>> However, I see no way to clean out that list on deletion and +>>>>> manual recreation -- it still needs to remember it was once an autofile, +>>>>> in order to avoid recreating it if it's deleted yet again. --[[Joey]] + +>>>>>> Are these really the semantics we want? 
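(To illustrate the `%wikistate` suggestion above: bookkeeping like this could live in per-plugin, wiki-wide state rather than in the deleted page's `%pagestate`. The key layout below is only an assumption, not what any of the branches actually does.)

    # remember that this file was once auto-created, even after it is
    # deleted, so it is not blindly recreated on the next refresh;
    # %wikistate persists independently of any particular page
    $wikistate{tag}{autofile}{$file} = 1;

    # later, when deciding whether to auto-create it again:
    return if exists $wikistate{tag}{autofile}{$file};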
It seems strange to me +>>>>>> that this: +>>>>>> +>>>>>> * tag a page as foo +>>>>>> * tags/foo automatically appears +>>>>>> * delete tags/foo +>>>>>> * create tags/foo manually +>>>>>> * delete tags/foo again +>>>>>> * tags/foo isn't automatically created +>>>>>> +>>>>>> isn't the same as this: +>>>>>> +>>>>>> * create tags/foo +>>>>>> * delete tags/foo +>>>>>> * tag a page as foo +>>>>>> * tags/foo automatically appears +>>>>>> +>>>>>> or even this: +>>>>>> +>>>>>> * create tags/foo +>>>>>> * tag a page as foo +>>>>>> * delete tags/foo +>>>>>> * tags/foo automatically appears (?) +>>>>>> +>>>>>> --[[smcv]] + +>>>>>>> I agree that the last of these is not desired. It could be avoided +>>>>>>> by extending the list of autofiles to include those that were not +>>>>>>> created due to the file/page already existing. +>>>>>>> +>>>>>>> Hmm, that would fix the previous scenario too. --[[Joey]] + +>>> * `autoindex` forgets that a page was deleted when it's no longer needed +>>> anyway (this may be harder for `autotag`?) + +>>>> I don't think so. AFAIK ikiwiki can detect whether there are taglinks to a page +>>>> anyway, so it should be quite easy. I'll try to implement that too. +>>>> --[[David_Riebenbauer]] + +>>> It'd probably be an interesting test of the core change to port +>>> `autoindex` to use it? (Adding the file to the RCS would be +>>> necessary to get parity with `autoindex`.) --[[smcv]] + +>>>> Good suggestion. Adding the files to RCS is on my todo list anyway. +>>>> --[[David_Riebenbauer]] + +>>>>> I think it may be better to allow the `add_autofile` caller +>>>>> to specify if it is added to RCS. In my branch, it can do +>>>>> so by just making the callback it registers call `rcs_add`; +>>>>> and I have tag do this. Other plugins might want autofiles +>>>>> that do not get checked in, conceivably. +>>>>> --[[Joey]] + +> Regarding the call from `IkiWiki.pm` to `Render.pm`, wouldn't this be +> quite easy to solve by moving `verify_src_file` to IkiWiki.pm? --[[smcv]] + +>> True. I'll do that. --[[David_Riebenbauer]] +>> Fixed in my branch --[[Joey]] + +[[!template id=gitbranch branch=origin/autotag author="[[Joey]]"]] +I've pushed an autotag branch of my own, which refactors +things a bit and fixes bugs around deletion/recreation. +I've tested it fairly thouroughly. 
--[[Joey]] + +[f3abeac919c4736429bd3362af6edf51ede8e7fe]: http://git.liegesta.at/?p=ikiwiki.git;a=commitdiff;h=f3abeac919c4736429bd3362af6edf51ede8e7fe (commitdiff for f3abeac919c4736429bd3362af6edf51ede8e7fe) +[4af4d26582f0c2b915d7102fb4a604b176385748]: http://git.liegesta.at/?p=ikiwiki.git;a=commitdiff;h=4af4d26582f0c2b915d7102fb4a604b176385748 (commitdiff for 4af4d26582f0c2b915d7102fb4a604b176385748) +[f58f3e1bec41ccf9316f37b014ce0b373c8e49e1]: http://git.liegesta.at/?p=ikiwiki.git;a=commitdiff;h=f58f3e1bec41ccf9316f37b014ce0b373c8e49e1 (commitdiff for f58f3e1bec41ccf9316f37b014ce0b373c8e49e1) +[da5d29f95f6e693e8c14be1b896cf25cf4fdb3c0]: http://git.liegesta.at/?p=ikiwiki.git;a=commitdiff;h=da5d29f95f6e693e8c14be1b896cf25cf4fdb3c0 (commitdiff for da5d29f95f6e693e8c14be1b896cf25cf4fdb3c0) +[a358d74bef51dae31332ff27e897fe04834571e6]: http://git.liegesta.at/?p=ikiwiki.git;a=commitdiff;h=a358d74bef51dae31332ff27e897fe04834571e6 (commitdiff for a358d74bef51dae31332ff27e897fe04834571e6) +[981400177d68a279f485727be3f013e68f0bf691]: http://git.liegesta.at/?p=ikiwiki.git;a=commitdiff;h=981400177d68a279f485727be3f013e68f0bf691 (commitdiff for 981400177d68a279f485727be3f013e68f0bf691) + +------------------- + +Even if this is already marked as done, I'd like to suggest an alternative +solution: + +Instead of creating a file that gets checked in into the RCS, the source files +could be left out and the output files be written as long as there is no +physical source file (think of a virtual underlay). Something similar would be +required to implement [[todo/alias directive]], which couldn't be easily done +by writing to the RCS as the page's contents can change depending on which +other pages claim it as an alias. --[[chrysn]] + +I agree with [[chrysn]]. In fact, is there any good reason that the core tag +plugin doesn't do this? The current usability is horrible, to the point that +I have gone 2.5 years with Ikiwiki and haven't yet started using tags. +--[Eric](http://wiki.pdxhub.org/people/eric) + +> See [[todo/transient_pages]] for progress on this. --[[smcv]] + +[[!tag done]] diff --git a/doc/todo/auto_getctime_on_fresh_build.mdwn b/doc/todo/auto_getctime_on_fresh_build.mdwn new file mode 100644 index 000000000..760c56fa1 --- /dev/null +++ b/doc/todo/auto_getctime_on_fresh_build.mdwn @@ -0,0 +1,13 @@ +[[!tag wishlist]] + +It might be a good idea to enable --gettime when `.ikiwiki` does not +exist. This way a new checkout of a `srcdir` would automatically get +ctimes right. (Running --gettime whenever a rebuild is done would be too +slow.) --[[Joey]] + +Could this be too annoying in some cases, eg, checking out a large wiki +that needs to get set up right away? --[[Joey]] + +> Not for git with the new, optimised --getctime. For other VCS.. well, +> pity they're not as fast as git ;), but it is a one-time expense... +> [[done]] --[[Joey]] diff --git a/doc/todo/auto_publish_expire.mdwn b/doc/todo/auto_publish_expire.mdwn new file mode 100644 index 000000000..7a5a17517 --- /dev/null +++ b/doc/todo/auto_publish_expire.mdwn @@ -0,0 +1,33 @@ +It could be nice to mark some page such that: + +* the page is automatically published on some date (i.e. build, linked, syndicated, inlined/mapped, etc.) +* the page is automatically unpublished at some other date (i.e. removed) + +I know that ikiwiki is a wiki compiler so that something has to refresh the wiki periodically to enforce the rules (a cronjob for instance). It seems to me that the calendar plugin rely on something similar. 
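For illustration only (not part of the original proposal, and the setup-file path is just an assumption), such a periodic refresh could be a simple cron entry that re-runs ikiwiki with `--refresh`, so that date-based publish/expire rules get re-evaluated:

    # example crontab entry: refresh the wiki hourly so that pages whose
    # publish or expire dates have passed are picked up on the next run
    0 * * * *   ikiwiki --setup /home/wiki/ikiwiki.setup --refresh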
+ +The date for publishing and expiring could be set be using some new directives; an alternative could be to expand the [[plugin/meta]] plugin with [<span/>[!meta date="auto publish date"]] and [<span/>[!meta expires="auto expire date"]]. + +--[[JeanPrivat]] + +> This is a duplicate, and expansion, of +> [[todo/tagging_with_a_publication_date]]. +> There, I suggest using a branch to develop +> prepublication versions of a site, and merge from it +> when the thing is published. +> +> Another approach I've seen used is to keep such pages in a pending/ +> directory, and move them via cron job when their publication time comes. +> But that requires some familiarity with, and access to, cron. +> +> On [[todo/tagging_with_a_publication_date]], I also suggested using meta +> date to set a page's date into the future, +> and adding a pagespec that matches only pages with dates in the past, +> which would allow filtering out the unpublished ones. +> Sounds like you are thinking along these lines, but possibly using +> something other than the page's creation or modification date to do it. +> +> I do think the general problem with that approach is that you have to be +> careful to prevent the unpublished pages from leaking out in any +> inlines, maps, etc. --[[Joey]] + +[[!tag wishlist]] diff --git a/doc/todo/auto_rebuild_on_template_change.mdwn b/doc/todo/auto_rebuild_on_template_change.mdwn new file mode 100644 index 000000000..ea990b877 --- /dev/null +++ b/doc/todo/auto_rebuild_on_template_change.mdwn @@ -0,0 +1,78 @@ +If `page.tmpl` is changed, it would be nice if ikiwiki automatically +noticed, and rebuilt all pages. If `inlinepage.tmpl` is changed, a rebuild +of all pages using it in an inline would be stellar. + +This would allow setting: + + templatedir => "$srcdir/templates", + +.. and then the [[templates]] are managed like other wiki files; and +like other wiki files, a change to them automatically updates dependent +pages. + +Originally, it made good sense not to have the templatedir inside the wiki. +Those templates can be used to bypass the htmlscrubber, and you don't want +just anyone to edit them. But the same can be said of `style.css` and +`ikiwiki.js`, which *are* in the wiki. We rely on `allowed_attachments` +being set to secure those to prevent users uploading replacements. And we +assume that users who can directly (non-anon) commit *can* edit them, and +that's ok. + +So, perhaps the easiest way to solve this [[wishlist]] would be to +make templatedir *default* to "$srcdir/templates/, and make ikiwiki +register dependencies on `page.tmpl`, `inlinepage.tmpl`, etc, as they're +used. Although, having every page declare an explicit dep on `page.tmpl` +is perhaps a bit much; might be better to implement a special case for that +one. Also, having the templates be copied to `destdir` is not desirable. +In a sense, these template would be like internal pages, except not wiki +pages, but raw files. + +The risk is that a site might have `allowed_attachments` set to +`templates/*` or `*.tmpl` something like that. I think such a configuration +is the *only* risk, and it's unlikely enough that a NEWS warning should +suffice. + +(This would also help to clear up the tricky disctinction between +wikitemplates and in-wiki templates.) + +Note also that when using templates from "$srcdir/templates/", `no_includes` +needs to be set. Currently this is done by the two plugins that use +such templates, while includes are allowed in `templatedir`. + +Have started working on this. 
+[[!template id=gitbranch branch=origin/templatemove author="[[Joey]]"]] + +> But would this require that templates be parseable as wiki pages? Because that would be a nuisance. --[[KathrynAndersen]] + +>> It would be better for them not to be rendered separately at all. +>> --[[Joey]] + +>>> I don't follow you. --[[KathrynAndersen]] + +>>>> If they don't render to output files, they clearly don't +>>>> need to be treated as wiki pages. (They need to be treated +>>>> as raw files anyway, because you don't want random users editing them +>>>> in the online editor.) --[[Joey]] + +>>>>> Just to be clear, the raw files would not be copied across to the output +>>>>> directory? -- [[Jon]] + +>>>>>> Without modifying ikiwiki, they'd be copied to the output directory as +>>>>>> (e.g.) http://ikiwiki.info/templates/inlinepage.tmpl; to not copy them, +>>>>>> it'd either be necessary to make them be internal pages +>>>>>> (templates/inlinepage._tmpl) or special-case them in some other way. +>>>>>> --[[smcv]] + +>>>>>>> In my branch, I left in support for the templatedir, and also +>>>>>>> /usr/share/ikiwiki/templates. So, users do not have to put their +>>>>>>> custom templates in templates/ in the wiki. If they do, +>>>>>>> the templates are copied to the destdir like other non-wiki page files +>>>>>>> are. The templates are not wiki pages, except those used by a few +>>>>>>> things like the [[plugins/template]] plugin. +>>>>>>> +>>>>>>> That seems acceptable, since users probably don't need to modify +>>>>>>> many templates, so the clutter is small. (Especially when +>>>>>>> compared to the other clutter the basewiki always puts in destdir.) +>>>>>>> This could be revisted later. --[[Joey]] + +[[done]] diff --git a/doc/todo/autoindex_should_use_add__95__autofile.mdwn b/doc/todo/autoindex_should_use_add__95__autofile.mdwn new file mode 100644 index 000000000..f3fb24c16 --- /dev/null +++ b/doc/todo/autoindex_should_use_add__95__autofile.mdwn @@ -0,0 +1,120 @@ +`add_autofile` is a generic version of [[plugins/autoindex]]'s code, +so the latter should probably use the former. --[[smcv]] + +> [[merged|done]] --[[Joey]] + +---- + +[[!template id=gitbranch branch=smcv/ready/autoindex-autofile author="[[smcv]]"]] + +I'm having trouble fixing this: + + # FIXME: some of this is probably redundant with add_autofile now, and + # the rest should perhaps be added to the autofile machinery + +By "a generic version of" above, it seems I mean "almost, but not +quite, entirely unlike". + +> As long as it's not Tea. ;) --[[Joey]] + +I tried digging through the git history for the +reasoning behind the autofile and autoindex implementations, but now I'm +mostly confused. + +## autofile + +The autofile machinery records a list of every file that has ever been proposed +as an autofile: for instance, the tag plugin has a list of every tag that +has ever been named in a \[[!tag]] or \[[!taglink]], even if no file was +actually needed (e.g. because it already existed). Checks for files that +already exist (or whatever) are deferred until after this list has been +updated, and files in this list are never auto-created again unless the wiki +is rebuilt. 
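To make the machinery above concrete, here is a rough sketch of how a plugin might call the `add_autofile()` API being discussed; the plugin name, page content, and RCS handling are simplified assumptions, not the final implementation:

    package IkiWiki::Plugin::autotagdemo;    # hypothetical plugin name
    use warnings;
    use strict;
    use IkiWiki 3.00;

    sub register_autotag_page ($) {
    	my $tagpage = shift;                 # e.g. "tags/foo"
    	my $tagfile = newpagefile($tagpage, $config{default_pageext});

    	# Propose the file; it is recorded in the autofile list whether or
    	# not it actually gets created, so it will not be auto-created
    	# again after a user deletes it (unless the wiki is rebuilt).
    	add_autofile($tagfile, "autotagdemo", sub {
    		writefile($tagfile, $config{srcdir},
    			"\[[!template id=autotag tagname=\"$tagpage\"]]\n");
    		IkiWiki::rcs_add($tagfile) if $config{rcs};
    	});
    }

    1;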
+ +This avoids re-creating the tag `create-del` in this situation, which is +the third one that I noted on +[[todo/auto-create tag pages according to a template]]: + +* create tags/create-del manually +* tag a page as create-del +* delete tags/create-del + +and also avoids re-creating `auto-del` in this similar situation (which I +think is probably the most important one to get right): + +* tag a page as auto-del, which is created automatically +* delete tags/auto-del + +I think both of these are desirable. + +However, this infrastructure also results in the tag page not being +re-created in either of these situations (the first and second that I noted +on the other page): + +* tag a page as auto-del-create-del, which is created automatically +* delete tags/auto-del-create-del +* create tags/auto-del-create-del manually +* delete tags/auto-del-create-del again + +or + +* create tags/create-del-auto +* delete tags/create-del-auto +* tag a page as create-del-auto + +I'm less sure that these shouldn't create the tag page: we deleted the +manually-created version, but that doesn't necessarily mean we don't want +*something* to exist. + +> That could be argued, but it's a very DWIM thing. Probably best to keep +> the behavior simple and predictable, so one only needs to remember that +> when a page is deleted, nothing will ever re-create it behind ones back. +> --[[Joey]] + +>> Fair enough, I'll make autoindex do that. --s + +## autoindex + +The autoindex machinery records a more complex set. Items are added to the +set when they are deleted, but would otherwise have been added as an autoindex +(don't exist, do have children (by which I mean subpages or attachments), +and are a directory in the srcdir). They're removed if this particular run +wouldn't have added them as an autoindex (they exist, or don't have children). + +Here's what happens in situations mirroring those above. + +The "create-del" case still doesn't create the page: + +* create create-del manually +* create create-del/child +* delete create-del +* it's added to `%deleted` and not re-created + +Neither does the "auto-del" case: + +* create auto-del/child, resulting in auto-del being created automatically +* delete auto-del +* it's added to `%deleted` and not re-created + +However, unlike the generic autofile infrastructure, `autoindex` forgets +that it shouldn't re-create the deleted page in the latter two situations: + +* create auto-del-create-del/child, resulting in auto-del-create-del being + created automatically +* delete auto-del-create-del; it's added to `%deleted` and not re-created +* create auto-del-create-del manually; it's removed from `%deleted` +* delete auto-del-create-del again (it's re-created) + +and + +* create create-del-auto +* delete create-del-auto; it's not added to `%deleted` because there's no + child that would cause it to exist +* create create-del-auto/child + +> I doubt there is any good reason for this behavior. These are probably +> bugs. --[[Joey]] + +>> OK, I believe my updated branch gives `autoindex` the same behaviour +>> as auto-creation of tags. The `auto-del-create-del` and +>> `create-del-auto` use cases work the same as for tags on my demo wiki. --s diff --git a/doc/todo/avatar.mdwn b/doc/todo/avatar.mdwn index b8aa2327f..7fa3762da 100644 --- a/doc/todo/avatar.mdwn +++ b/doc/todo/avatar.mdwn @@ -1,38 +1,22 @@ [[!tag wishlist]] It would be nice if ikiwiki, particularly [[plugins/comments]] -supported user avatar icons. I was considering adding a directive for this, -as designed below. 
+(but also, ideally, recentchanges) supported user avatar icons. -However, there is no *good* service for mapping openids to avatars -- -openavatar has many issues, including not supporting delegated openids, and -after trying it, I don't trust it to push users toward. -Perhaps instead ikiwiki could get the email address from the openid -provider, though I think the perl openid modules don't support the openid -2.x feature that allows that. +> Update: Done for comments, but not for anything else, and the directive +> below would be a nice addition. --[[Joey]] -At the moment, working on this doesn't feel like a good use of my time. ---[[Joey]] - -Hmm.. unless is just always used a single provider (gravatar) and hashed -the openid. Then wavatars could be used to get a unique avatar per openid -at least. --[[Joey]] - ----- - -The directive displays a small avatar image for a user. Pass it the -email address, openid, or wiki username of the user. +Idea is to add a directive that displays a small avatar image for a user. +Pass it a user's the email address, openid, username, or the md5 hash +of their email address: \[[!avatar user@example.com]] \[[!avatar http://joey.kitenet.net/]] \[[!avatar user]] + \[[!avatar hash]] -The avatars are provided by various sites. For email addresses, it uses a -[gravatar](http://gravatar.com/). For openid, -[openavatar](http://www.openvatar.com/) is used. For a wiki username, the -user's email address is looked up and the gravatar for that user is -displayed. (Of course, the user has to have filled in their email address -on their Preferences page for that to work.) +These directives can then be hand-inserted onto pages, or more likely, +included in eg, a comment post via a template. An optional second parameter can be included, containing additional options to pass in the diff --git a/doc/todo/avoid_attachement_ui_if_upload_not_allowed.mdwn b/doc/todo/avoid_attachement_ui_if_upload_not_allowed.mdwn new file mode 100644 index 000000000..487915850 --- /dev/null +++ b/doc/todo/avoid_attachement_ui_if_upload_not_allowed.mdwn @@ -0,0 +1,25 @@ +Any way to make it so an edit page doesn't offer the attachment capability +unless it matches a specific user, is an admin, and/or is an allowed page? +(For now, I have it on all pages, and then it prohibits after I submit +based on the allowed_attachments.) + +> To do that, ikiwiki would have to try to match the `allowed_attachments` +> pagespec against a sort of dummy upload to the current page. Then if it +> failed, assume all real uploads would fail. Now consider a pagespec like +> "user(joey) and mimetype(audio/mpeg)" -- it'd be hard to make a dummy +> upload to test this pagespec against. +> +> So, there would need to be some sort of test mode, where terms like +> `mimetype()` always succeed. But then consider a pagespec like +> "user(joey) and !mimetype(video/mpeg)" -- if mimetype succeeds, this +> fails. +> +> So, maybe we can instead just filter out all the pagespec terms aside +> from `user()`, `ip()`, and `admin()`. Transforming that into just +> "user(joey)", which would succeed in the test. +> +> That'd work, I guess. Pulling a pagespec apart, filtering out terms, and +> putting it back together is nontrivial, but doable. +> +> Other approach would be to have a separate pagespec that explicitly +> controlls what pages to show the attachment UI on. 
--[[Joey]] diff --git a/doc/todo/beef_up_sidebar_to_allow_for_multiple_sidebars.mdwn b/doc/todo/beef_up_sidebar_to_allow_for_multiple_sidebars.mdwn index fb942a495..fdaa09f26 100644 --- a/doc/todo/beef_up_sidebar_to_allow_for_multiple_sidebars.mdwn +++ b/doc/todo/beef_up_sidebar_to_allow_for_multiple_sidebars.mdwn @@ -13,5 +13,113 @@ those contents instead. > In mine I just copied sidebar out and made some extra "sidebars", but they go elsewhere. Ugly hack, but it works. --[[simonraven]] +>> Here a simple [[patch]] for multiple sidebars. Not too fancy but better than having multiple copies of the sidebar plugin. --[[jeanprivat]] + +>>> I made a [[git]] branch for it [[!template id=gitbranch branch="privat/multiple_sidebars" author="[[jeanprivat]]"]] --[[jeanprivat]] + +>>>> Ping for [[Joey]]. Do you have any comment? I could improve it if there is things you do not like. I prefer to have such a feature integrated upstream. --[[JeanPrivat]] + +>>>>> The code is fine. +>>>>> +>>>>> I did think about having it examine +>>>>> the `page.tmpl` for parameters with names like `FOO_SIDEBAR` +>>>>> and automatically enable page `foo` as a sidebar in that case, +>>>>> instead of using the setup file to enable. But I'm not sure about +>>>>> that idea.. +>>>>> +>>>>> The full compliment of sidebars would be a header, a footer, +>>>>> a left, and a right sidebar. It would make sense to go ahead +>>>>> and add the parameters to `page.tmpl` so enabling each just works, +>>>>> and add whatever basic CSS makes sense. Although I don't know +>>>>> if I want to try to get a 3 column CSS going, so perhaps leave the +>>>>> left sidebar out of that. + +------------------- + +<pre> +--- /usr/share/perl5/IkiWiki/Plugin/sidebar.pm 2010-02-11 22:53:17.000000000 -0500 ++++ plugins/IkiWiki/Plugin/sidebar.pm 2010-02-27 09:54:12.524412391 -0500 +@@ -19,12 +19,20 @@ + safe => 1, + rebuild => 1, + }, ++ active_sidebars => { ++ type => "string", ++ example => qw(sidebar banner footer), ++ description => "Which sidebars must be activated and processed.", ++ safe => 1, ++ rebuild => 1 ++ }, + } + +-sub sidebar_content ($) { ++sub sidebar_content ($$) { + my $page=shift; ++ my $sidebar=shift; + +- my $sidebar_page=bestlink($page, "sidebar") || return; ++ my $sidebar_page=bestlink($page, $sidebar) || return; + my $sidebar_file=$pagesources{$sidebar_page} || return; + my $sidebar_type=pagetype($sidebar_file); + +@@ -49,11 +57,17 @@ + + my $page=$params{page}; + my $template=$params{template}; +- +- if ($template->query(name => "sidebar")) { +- my $content=sidebar_content($page); +- if (defined $content && length $content) { +- $template->param(sidebar => $content); ++ ++ my @sidebars; ++ if (defined $config{active_sidebars} && length $config{active_sidebars}) { @sidebars = @{$config{active_sidebars}}; } ++ else { @sidebars = qw(sidebar); } ++ ++ foreach my $sidebar (@sidebars) { ++ if ($template->query(name => $sidebar)) { ++ my $content=sidebar_content($page, $sidebar); ++ if (defined $content && length $content) { ++ $template->param($sidebar => $content); ++ } + } + } + } +</pre> + +---------------------------------------- +## Further thoughts about this + +(since the indentation level was getting rather high.) + +What about using pagespecs in the config to map pages and sidebar pages together? 
Something like this: + +<pre> + sidebar_pagespec => { + "foo/*" => 'sidebars/foo_sidebar', + "bar/* and !bar/*/*' => 'bar/bar_top_sidebar', + "* and !foo/* and !bar/*" => 'sidebars/general_sidebar', + }, +</pre> + +One could do something similar for *pageheader*, *pagefooter* and *rightbar* if desired. + +Another thing which I find compelling - but probably because I am using [[plugins/contrib/field]] - is to be able to treat the included page as if it were *part* of the page it was included into, rather than as an included page. I mean things like \[[!if ...]] would test against the page name of the page it's included into rather than the name of the sidebar/header/footer page. It's even more powerful if one combines this with field/getfield/ftemplate/report, since one could make "generic" headers and footers that could apply to a whole set of pages. + +Header example: +<pre> +#{{$title}} +\[[!ftemplate id="nice_data_table"]] +</pre> + +Footer example: +<pre> +------------ +\[[!report template="footer_trail" trail="trailpage" here_only=1]] +</pre> + +(Yes, I am already doing something like this on my own site. It's like the PmWiki concept of GroupHeader/GroupFooter) + +-- [[KathrynAndersen]] [[!tag wishlist]] diff --git a/doc/todo/beef_up_signin_page.mdwn b/doc/todo/beef_up_signin_page.mdwn new file mode 100644 index 000000000..ee322b663 --- /dev/null +++ b/doc/todo/beef_up_signin_page.mdwn @@ -0,0 +1,17 @@ +ikiwiki's signin page is too sparse for people who don't live in the Web 2.0. + +We occasionally have GNU Hurd web pages contributors wonder what they have to +do on that page. They don't know / the page doesn't explain what an *account +provider* is, and that cross-indentification (using an existing OpenID account) +is possible to begin with. And, if they don't have such an OpenID account, +it's not easily understandable that the *other* option is for creating a local +site-only account (like in the old days). + +--[[tschwinge]] + +> I agree that this would be good. It could be done by leaving +> the compact widget at the top and adding some verbose explanations +> and/or further forms below. +> +> All it takes is editing `templates/openid-selector.tmpl`, +> so I welcome suggestions. --[[Joey]] diff --git a/doc/todo/capitalize_title.mdwn b/doc/todo/capitalize_title.mdwn new file mode 100644 index 000000000..3e8366dd3 --- /dev/null +++ b/doc/todo/capitalize_title.mdwn @@ -0,0 +1,31 @@ +Here I propose an option (with a [[patch]]) to capitalize the first letter (ucfirst) of default titles : filenames and urls can be lowercase but title are displayed with a capital first character (filename = "foo.mdwn", pagetitle = "Foo"). Note that \[[!meta title]] are unaffected (no automatic capitalization). Comments please :) --[[JeanPrivat]] +<pre><code> +diff --git a/IkiWiki.pm b/IkiWiki.pm +index 6da2819..fd36ec4 100644 +--- a/IkiWiki.pm ++++ b/IkiWiki.pm +@@ -281,6 +281,13 @@ sub getsetup () { + safe => 0, + rebuild => 1, + }, ++ capitalize => { ++ type => "boolean", ++ default => undef, ++ description => "capitalize the first letter of page titles", ++ safe => 1, ++ rebuild => 1, ++ }, + userdir => { + type => "string", + default => "", +@@ -989,6 +996,10 @@ sub pagetitle ($;$) { + $page=~s/(__(\d+)__|_)/$1 eq '_' ? 
' ' : "&#$2;"/eg; + } + ++ if ($config{capitalize}) { ++ $page = ucfirst $page; ++ } ++ + return $page; + } +</code></pre> diff --git a/doc/todo/cas_authentication.mdwn b/doc/todo/cas_authentication.mdwn index 8bf7042df..ed8010518 100644 --- a/doc/todo/cas_authentication.mdwn +++ b/doc/todo/cas_authentication.mdwn @@ -21,6 +21,19 @@ follows) ? > license statement at the top. I have a few questions that I'll insert > inline with the patch below. --[[Joey]] +>> I have made some corrections to this patch (my cas plugin) in order to use +>> IkiWiki 3.00 interface and take your comments into account. It should work +>> fine now. +>> +>> You can pull it from my git repo at +>> http://git.boulgour.com/bbb/ikiwiki.git/ and maybe add it to your main +>> repo. +>> +>> I will add GNU GPL copyright license statement as soon as I get some free +>> time. +>> +>> --[[/users/bbb]] + ------------------------------------------------------------------------------ diff --git a/IkiWiki/Plugin/cas.pm b/IkiWiki/Plugin/cas.pm new file mode 100644 diff --git a/doc/todo/cdate_and_mdate_available_for_templates.mdwn b/doc/todo/cdate_and_mdate_available_for_templates.mdwn new file mode 100644 index 000000000..70d8fc8c9 --- /dev/null +++ b/doc/todo/cdate_and_mdate_available_for_templates.mdwn @@ -0,0 +1,15 @@ +[[!tag wishlist]] + +`CDATE_3339`, `CDATE_822`, `MDATE_3339` and `MDATE_822` template variables would be useful for evey page, at least for my templates with Dublin Core metadata. + +I tried to pick the relevant lines of the [[inline|plugins/inline]] plugin and hack it into a custom plugin, but it failed miserably because of my obvious lack of perl litteracy... + +Anyway, I'm sure this is almost nothing... + +* `sub date_822 ($) {}` +* `sub date_3339 ($) {}` +* and something like `$template->param('cdate_822' => date_822($IkiWiki::pagectime{$page}));` + +Anyone can fill the missing lines? + +-- [[nil]] diff --git a/doc/todo/comment_moderation_feed.mdwn b/doc/todo/comment_moderation_feed.mdwn new file mode 100644 index 000000000..267706b1b --- /dev/null +++ b/doc/todo/comment_moderation_feed.mdwn @@ -0,0 +1,9 @@ +There should be a way to generate a feed that is updated whenever a new +comment needs moderation. Otherwise, it can be hard to remember to check +sites, which may rarely get comments. + +The feed should not include the comment subject or body, but could mention +the author. It would be especially handy if it was generated statically. +One way would be to generate internal pages corresponding to each comment +that needs moderation; then the feed could be constructed via a usual +inline. diff --git a/doc/todo/configurable_markdown_path.mdwn b/doc/todo/configurable_markdown_path.mdwn new file mode 100644 index 000000000..63fa2dcbd --- /dev/null +++ b/doc/todo/configurable_markdown_path.mdwn @@ -0,0 +1,64 @@ +[[!template id=gitbranch branch=wtk/mdwn author="[[wtk]]"]] + +summary +======= + +Make it easy to configure the Markdown implementation used by the +[[plugins/mdwn]] plugin. With this patch, you can set the path to an +external Markdown executable in your ikiwiki config file. If you do +not set a path, the plugin will use the usual config options to +determine which Perl module to use. + +> This adds a configuration in which a new process has to be worked +> for every single page rendered. Actually, it doesn't only add +> such a configuration, it makes it be done by *default*. +> +> Markdown is ikiwiki's default, standard renderer. A configuration +> that makes it slow will make ikiwiki look bad. 
+> +> I would not recommend using Gruber's perl markdown. It is old, terminally +> buggy, and unmaintained. --[[Joey]] [[!tag reviewed]] + +---- + +I wasn't trying to make an external markdown the default, I was trying +to make the currently hardcoded `/usr/bin/markdown` configurable. It +should only use an external process if `markdown_path` is set, which +it is not by default. Consider the following tests from clean checkouts: + +Current ikiwiki trunk: + + $ PERL5LIB="." time ikiwiki --setup docwiki.setup + ... + 38.73user 0.62system 1:20.90elapsed 48%CPU (0avgtext+0avgdata 103040maxresident)k + 0inputs+6472outputs (0major+19448minor)pagefaults 0swaps + +My mdwn branch: + + $ PERL5LIB="." time ikiwiki --setup docwiki.setup + ... + Markdown: Text::Markdown::markdown() + ... + 39.17user 0.73system 1:21.77elapsed 48%CPU (0avgtext+0avgdata 103072maxresident)k + 0inputs+6472outputs (0major+19537minor)pagefaults 0swaps + +My mdwn branch with `markdown_path => "/usr/bin/markdown"` added in +`docwiki.setup` (on my system, `/usr/bin/markdown` is a command-line +wrapper for `Text::Markdown::markdown`). + + $ PERL5LIB="." time ikiwiki --setup docwiki.setup + ... + Markdown: /usr/bin/markdown + ... + 175.35user 18.99system 6:38.19elapsed 48%CPU (0avgtext+0avgdata 92320maxresident)k + 0inputs+17608outputs (0major+2189080minor)pagefaults 0swaps + +So my patch doesn't make ikiwiki slow unless the user explicitly +requests an extenral markdown, which they would presumably only do to +work around bugs in their system's Perl implementation. + -- [[wtk]] + +> I was wrong about it being enabled by default, but I still don't like +> the idea of a configuration that makes ikiwiki slow on mdwn files, +> even if it is a nonstandard configuration. How hard can it be to install +> the Text::Markdown library? --[[Joey]] diff --git a/doc/todo/configurable_tidy_command_for_htmltidy.mdwn b/doc/todo/configurable_tidy_command_for_htmltidy.mdwn new file mode 100644 index 000000000..e317184b5 --- /dev/null +++ b/doc/todo/configurable_tidy_command_for_htmltidy.mdwn @@ -0,0 +1,8 @@ +[[!tag patch patch]] + +I was trying to get htmltidy to [play nicely with MathML][play]. Unfortunately, I couldn't construct a command line that I was happy with, but along the way I altered htmltidy to allow a configurable command line. This seemed like a generally useful thing, so I've published my [patch][] as a Git branch. + +[play]: http://lists.w3.org/Archives/Public/html-tidy/2006JanMar/0052.html +[patch]: http://www.physics.drexel.edu/~wking/code/git/git.php?p=ikiwiki.git&a=commitdiff&h=408ee89fd7c1dc70510385a7cf263a05862dda97&hb=e65ce4f0937eaf622846c02a9d39fa7aebe4af12 + +> Thanks, [[done]] --[[Joey]] diff --git a/doc/todo/configurable_timezones.mdwn b/doc/todo/configurable_timezones.mdwn index f8b1dbbab..36f2e9dbb 100644 --- a/doc/todo/configurable_timezones.mdwn +++ b/doc/todo/configurable_timezones.mdwn @@ -4,7 +4,4 @@ This is nice for shared hosting, and other situation where the user doesn't have > [[done]] via the ENV setting in the setup file. --[[Joey]] - -Example (ikiwiki.setup): - - ENV => { TZ => "Europe/Sofia" } +>> Now via a timezone setting that is web configurable. --[[Joey]] diff --git a/doc/todo/conflict_free_comment_merges.mdwn b/doc/todo/conflict_free_comment_merges.mdwn new file mode 100644 index 000000000..e84400c17 --- /dev/null +++ b/doc/todo/conflict_free_comment_merges.mdwn @@ -0,0 +1,23 @@ +Currently, new comments are named with an incrementing ID (comment_N). 
So +if a wiki has multiple disconnected servers, and comments are made to the +same page on both, merging is guaranteed to result in conflicts. + +I propose avoiding such merge problems by naming a comment with a sha1sum +of its (full) content. Keep the incrementing ID too, so there is an +-ordering. And so duplicate comments are allowed..) +So, "comment_N_SHA1". + +Note: The comment body will need to use meta title in the case where no +title is specified, to retain the current behavior of the default title +being "comment N". + +What do you think [[smcv]]? --[[Joey]] + +> I had to use md5sums, as sha1sum perl module may not be available and I +> didn't want to drag it in. But I think that's ok; this doesn't need to be +> cryptographically secure and even the chances of being able to +> purposefully cause a md5 collision and thus an undesired merge conflict +> are quite low since it modifies the input text and adds a date stamp to +> it. +> +> Anyway, I think it's good, [[done]] --[[Joey]] diff --git a/doc/todo/countdown_directive.mdwn b/doc/todo/countdown_directive.mdwn new file mode 100644 index 000000000..61c36204c --- /dev/null +++ b/doc/todo/countdown_directive.mdwn @@ -0,0 +1,5 @@ +I'd love to have a countdown directive, which would take a timestamp to count down to and generate a JavaScript timer in the page. + +Ideally I'd also like to either have parameters providing content to show before and after the time passes, or integration with existing conditional directives to do the same thing. + +[[!tag wishlist]] diff --git a/doc/todo/credentials_page.mdwn b/doc/todo/credentials_page.mdwn new file mode 100644 index 000000000..6b90af144 --- /dev/null +++ b/doc/todo/credentials_page.mdwn @@ -0,0 +1,33 @@ +pushing [[this|todo/httpauth feature parity with passwordauth]] and [[this|todo/htpasswd mirror of the userdb]] further (although rather in the [[wishlist]] priority): would it make sense for users to have a `$USER/credentials` page that is by default locked to the user and admins, where the user can state one or more of the below? + +* OpenID +* ssh public key (would require an additional mechanism for writing this to a `authorized_keys` file with appropriate environment variables or prefix that makes sure the commit is checked against the right user and that the user names agree) +* gpg public key (once there is a mechanism that relies on gpg for authentication)) +* https certificate hash (don't know details; afair the creation of such certificates is typically initiated server-side) +* password hash (this is generally considered a valuable secret; is this still true with good hashes and proper salting?) + +such a page could have a form as described in [[todo/structured page data]] and could even serve as a way of managing users. --[[chrysn]] + +> I was just thinking about something along these lines myself. The +> idea, if I understand correctly, is to allow users to have multiple +> login options all leading to the same identity. This would allow a +> user to login for example via either their Google account or their +> WordPress account, while still being identified as the same user. + +> However, I'm not sure this should be a static page (I guess you +> mean `$USER/credentials`, I don't think ‘creditentials’ actually +> exists). Something entirely managed at the CGI level is probably +> better, as it also helps keeping the data in its place (such as ssh +> public keys in `authorized_keys` etc). 
+ +> -- GB + +>> having multiple login options leading to the same identity, and (more important to me) giving the user an easy way to review and edit them. i'm thinking a bit of foaf+ssl style "i am $USER and you can recognize me by my client certificate $CERTIFICATE" statements. +>> +>> the reason why i want this in a static place instead of cgi level is that it can be used, for example, for automatically creating htpasswd files for read-only (cgi-less) replicas of private wikis. furthermore, it all gets versioned and it can easily be seen where the data really is. the credentials have to be filed appropriately by plugins anyway, but that can happen as a part of the regular rebuild process. +>> +>> and yes, you're right about the word misusage; thanks for pointing it out and fixing it. +>> +>> --[[chrysn]] + +an issue to be considered: for ways of authentication that don't explicitly mention the user name (and that would be everything but password; especially OpenID), there has to be a way to prevent users from hijacking an admin's account. the user wouldn't get more privileges, but the admin could find himself logged in as a user instead of an admin when he logs in using his OpenID, for example. he could fix it by removing the openid from the user's ("his") page, but it has to be taken care of nevertheless. --[[chrysn]] diff --git a/doc/todo/dependency_types.mdwn b/doc/todo/dependency_types.mdwn index da9b5e6cf..4db633ead 100644 --- a/doc/todo/dependency_types.mdwn +++ b/doc/todo/dependency_types.mdwn @@ -553,29 +553,16 @@ operators. Currently, this turns into roughly: `FailReason() & SuccessReason(patch)` Let's say that the glob instead returns a HardFailReason, which when -ANDed with another object, drops their influences. (But when ORed, combines -them.) Fixes the above, but does it always work? +ANDed with another object, blocks their influences. (But when ORed, +combines them.) -"(bugs/* or link(patch)) and backlink(index)" => -`( HardFailReason() | SuccessReason(page) ) & SuccessReason(index)`` => -`SuccessReason(page & SuccessReason(index)` => -SuccessReason(page, index) => right +Question: Are all pagespec terms that return reason objects w/o any +influence info, suitable to block influence in this way? -"(bugs/* and link(patch)) or backlink(index)" => -`( HardFailReason() & SuccessReason(page) ) | SuccessReason(index)`` => -`HardFailReason() | SuccessReason(index)` => -`SuccessReason(index)` => right - -Ok so far, but: - -"!bugs/* and link(patch)" => -`!SuccessReason() | SuccessReason(bugs/foo)` => -'FailReason() | SuccessReason(bugs/foo) -`FailReason(bugs/foo)` => wrong! - -This could be fixed by adding a HardSuccessReason that glob also returns. -Maybe just a field of the object that is set if it is "hard" is a better -approach though. +To be suitable to block, a term should never change from failing to match a +page to successfully matching it, unless that page is directly changed in a +way that influences are not needed for ikiwiki to notice. But, if a term +did not meet these criteria, it would have an influence. QED. 
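As a toy illustration of the blocking idea sketched above (this is not ikiwiki's actual `SuccessReason`/`FailReason` code, just a self-contained model), reason objects can overload `&` and `|` so that a "hard" operand drops the other side's influences under AND, while OR always merges them:

    package Demo::Reason;    # simplified stand-in for ikiwiki's reason classes
    use strict;
    use warnings;
    use overload
    	'&'    => \&and_reason,
    	'|'    => \&or_reason,
    	'bool' => sub { $_[0]->{match} };

    sub new {
    	my ($class, $match, %opts) = @_;
    	return bless {
    		match      => $match,
    		influences => $opts{influences} || {},  # page name => dependency type
    		hard       => $opts{hard} || 0,         # "hard" results block influences under AND
    	}, $class;
    }

    sub and_reason {
    	my ($x, $y) = @_;
    	my %influences;
    	%influences = (%influences, %{$x->{influences}}) unless $y->{hard};
    	%influences = (%influences, %{$y->{influences}}) unless $x->{hard};
    	return Demo::Reason->new($x->{match} && $y->{match}, influences => \%influences);
    }

    sub or_reason {
    	my ($x, $y) = @_;
    	return Demo::Reason->new($x->{match} || $y->{match},
    		influences => { %{$x->{influences}}, %{$y->{influences}} });
    }

    1;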
#### Influence types diff --git a/doc/todo/description_meta_param_passed_to_templates.mdwn b/doc/todo/description_meta_param_passed_to_templates.mdwn new file mode 100644 index 000000000..712471258 --- /dev/null +++ b/doc/todo/description_meta_param_passed_to_templates.mdwn @@ -0,0 +1,10 @@ +[[!tag wishlist patch]] + +I'd like to use the description parameter from [[meta|/ikiwiki/directive/meta]] directives in custom [[inline|/ikiwiki/directive/inline]] templates. I guess this could be useful to others too. + +The only change required is on [line 266](http://github.com/joeyh/ikiwiki/blob/master/IkiWiki/Plugin/meta.pm#L266) of `meta.pm` + + - foreach my $field (qw{author authorurl permalink}) { + + foreach my $field (qw{author authorurl description permalink}) { + +> Good idea, [[done]]. --[[Joey]] diff --git a/doc/todo/double-click_protection_for_form_buttons.mdwn b/doc/todo/double-click_protection_for_form_buttons.mdwn new file mode 100644 index 000000000..501be4498 --- /dev/null +++ b/doc/todo/double-click_protection_for_form_buttons.mdwn @@ -0,0 +1,5 @@ +A small piece of JS to prevent double-submitting forms would be quite nice. I seem to have developed a habit of doing this and having to resolve a merge conflict for two initial commits. -- [[Jon]] + +> By the time you see that merge conflict, the first commit has +> already successfully happened, so you can just hit cancel +> and throw away the second submit. --[[Joey]] diff --git a/doc/todo/edit_form:_no_fixed_size_for_textarea.mdwn b/doc/todo/edit_form:_no_fixed_size_for_textarea.mdwn index 4c9c2352a..77e46049f 100644 --- a/doc/todo/edit_form:_no_fixed_size_for_textarea.mdwn +++ b/doc/todo/edit_form:_no_fixed_size_for_textarea.mdwn @@ -10,7 +10,7 @@ On longer pages its not very comfortable to edit pages with such a small box. Th > } > > Perhaps you have replaced it with a modified style sheet that does not -> include that? --[[Joey]] [[!tag done]] +> include that? --[[Joey]] >> The screen shot was made with http://ikiwiki.info/ where i didn't change anything. The width is optimally used. The problem is the height. @@ -32,3 +32,21 @@ On longer pages its not very comfortable to edit pages with such a small box. Th >>> --[[Joey]] >>>>>> the javascript approach would need to work something like this: you need to know about the "bottom-most" item on the edit page, and get a handle for that object in the DOM. You can then obtain the absolute position height-wise of this element and the absolute position of the bottom of the window to determine the pixel-difference. Then, you set the height of the textarea to (current height in px) + determined-value. This needs to be re-triggered on various resize events, at least for the window and probably for other elements too. I may have a stab at this at some point. -- [[Jon]] + +Google chrome has a completly elegant fix for this problem: All textareas +have a small resize handle in a corner, that can be dragged around. No +nasty javascript needed. IMHO, this is the right solution, and I hope other +browsers emulate it. [[done]] +--[[Joey]] + +Wouldn't it be possible to just implement an integer-valued setting for this, accessible via the "Setup" wiki page? This would require a wiki regen, but such a setting would not be changed frequently I suppose. 
Also, Mediawiki has this implemented as a per-user setting (two settings, actually, -- number of rows and columns of the edit area); such a per-user setting would be the best possible implementation, but I'm not sure if ikiwiki already supports per-user settings. Please consider implementing this as the current 20 rows is a great PITA for any non-trivial page. + +> I don't think it would need a wiki rebuild, as the textarea is generated dynamically by the CGI when you perform a CGI action, and (as far as I know) is not cooked into any static content. -- [[Jon]] + +>> There is no need for a configuration setting for this -- to change +>> the default height from 20 rows to something else, you can just put +>> something like this in your `local.css`: --[[Joey]] + + #editcontent { + height: 50em; + } diff --git a/doc/todo/edittemplate_should_look_in_templates_directory_by_default.mdwn b/doc/todo/edittemplate_should_look_in_templates_directory_by_default.mdwn new file mode 100644 index 000000000..4bc10e432 --- /dev/null +++ b/doc/todo/edittemplate_should_look_in_templates_directory_by_default.mdwn @@ -0,0 +1,8 @@ +[[plugins/edittemplate]] looks for the specified template relative to the +page the directive appears on. Which can be handy, eg, make a +blog/mytemplate and put the directive on blog, and it will find +"mytemplate". However, it can also be confusing, since other templates +always are looked for in `templates/`. + +I think it should probably fall back to looking for `templates/$foo`. +--[[Joey]] diff --git a/doc/todo/enable-htaccess-files.mdwn b/doc/todo/enable-htaccess-files.mdwn index e302a49ed..3b9721d50 100644 --- a/doc/todo/enable-htaccess-files.mdwn +++ b/doc/todo/enable-htaccess-files.mdwn @@ -12,6 +12,13 @@ qr/(^|\/).svn\//, qr/.arch-ids\//, qr/{arch}\//], wiki_link_regexp => qr/\[\[(?:([^\]\|]+)\|)?([^\s\]#]+)(?:#([^\s\]]+))?\]\]/, +> Note that the above patch is **completely broken**. +> It removes the crucial excludes of all files starting with a dot. +> The negative regexps for htaccess have no effect, so the whole +> thing only "works" because it allows *any* file starting with a dot. +> If you applied this patch to your ikiwiki, you opened a huge security +> hole. --[[Joey]] + [[!tag patch patch/core]] This lets the site administrator have a `.htaccess` file in their underlay @@ -57,5 +64,17 @@ It should be off by default of course. --Max --- +1 I want `.htaccess` so I can rewrite some old Wordpress URLs to make feeds work again. --[[hendry]] +> Unless you cannot modify apache's configuration, you do not need htaccess +> to do that. Apache's documentation recommends against using htaccess +> unless you're a user who cannot modify the main server configuration. +> --[[Joey]] + --- +1 for various purposes (but sometimes the filename isn't `.htaccess`, so please make it configurable) --[[schmonz]] + +> I've described a workaround for one use case at the [[plugins/rsync]] [[plugins/rsync/discussion]] page. --[[schmonz]] + +--- + +[[done]], you can use the `include` setting to override the default +excludes now. Please use extreme caution when doing so. --[[Joey]] diff --git a/doc/todo/enable_arbitrary_markup_for_directives.mdwn b/doc/todo/enable_arbitrary_markup_for_directives.mdwn new file mode 100644 index 000000000..c1f0f86ed --- /dev/null +++ b/doc/todo/enable_arbitrary_markup_for_directives.mdwn @@ -0,0 +1,47 @@ +One of the good things about [PmWiki](http://www.pmwiki.org) is the ability to treat arbitrary markup as directives. 
+In ikiwiki, all directives have the same format: + +\[[!name arguments]] + +But with PmWiki, directives can be added to the engine (with the "Markup" hook) with the usual name and function passing, but also with a regexp which has capturing parentheses, and the results of the match are passed to the given function. +Would it be possible to alter the "preprocess" hook to have an optional regex argument which acted in a similar fashion? + +For example, one could then write a plugin which would treat + +Category: Foo, Bar + +as a tag, by using a regex such as /^Category:\s*([\w\s,]+)$/; the result "Foo, Bar" could then be further processed by the hook function. + +This could also make it easier to support more styles of markup, rather than having to do all the processing in "htmlize" and/or "filter". + +-- [[KathrynAndersen]] + +[[!taglink wishlist]] + +> Arbitrary text transformations can already be done via the filter and +> sanitize hooks. That's how the smiley and typography plugins do their +> thing. +> +> AFAICS, the only benefit to having a regexp-based-hook interface is less +> overhead in passing page content into the hooks. But that overhead is a +> small amount of the total render time. +> +> Also, I notice that smiley does such complicated things in its sanitize +> hook (ie, it looks at html context around the smilies) that a simple +> matching regexp would not be sufficient. Furthermore, typography needs to +> pass the page content into the library it uses, which does not expose +> regexps to match on. So ikiwiki's more general filtering interface seems +> to allow both of these to do things that could not be done with the +> PmWiki interface. --[[Joey]] + +>>You have some good points. I was aware of using filter, but it didn't occur to me that one could use sanitize to do processing also, probably because "sanitize" brought to mind removing harmful content rather than doing other alterations. +>>It has also occurred to me, on further thought, that if one wants one's chosen markup to actually be processed during the "preprocess" stage, that one could do so by converting the chosen markup to directive-style markup during the "filter" stage and then processing the directive during the "preprocess" stage as per usual. Is there a tag for "no longer on the wishlist?". --[[KathrynAndersen]] + +>>> Yeah, sanitize is a misleading name for the relatively few things that +>>> use it this way. +>>> +>>> While you could do a filter to preprocess step, it is a bit +>>> of a long way round, since filter always runs just before +>>> preprocess. +>>> +>>> Anyway, guess this is [[done]] --[[Joey]] diff --git a/doc/todo/feed_enhancements_for_inline_pages.mdwn b/doc/todo/feed_enhancements_for_inline_pages.mdwn new file mode 100644 index 000000000..b48c37d7b --- /dev/null +++ b/doc/todo/feed_enhancements_for_inline_pages.mdwn @@ -0,0 +1,132 @@ +[[!template id=gitbranch branch=GiuseppeBilotta/inlinestuff author="Giuseppe Bilotta"]] + +I rearranged my patchset once again, to clearly identify the origin and +motivation of each patch, which is explained in the following. 
+ +In my ikiwiki-based website I have the following situation: + +* `$config{usedirs}` is 1 +* there are a number of subdirectories (A/, B/, C/, etc) + with pages under each of them (A/page1, A/page2, B/page3, etc) +* 'index pages' for each subdirectory: A.mdwn, B.mdwn, C.mdwn; + these are rather barebone, only contain an inline directive for their + respective subpages and become A/index.html, etc +* there is also the main index.mdwn, which inlines A.mdwn, B.mdwn, C.mdwn, + etc (i.e. the top-level index files are also inlined on the homepage) + +With the upstream `inline` plugin, the feeds for A, B, C etc are located +in `A/index.atom`, `B/index.atom`, etc; their title is the wiki name and +their main link goes to the wiki homepage rather than to their +respective subdir (e.g. I would expect `A/index.atom` to have a link to +`http://website/A` but it actually points to `http://website/`). + +This is due to them being generated from the main index page, and is +fixed by the first patch: ‘inline: base feed urls on included page +name’. As explained in the commit message for the patch itself, this is +a ‘forgotten part’ from a previous page vs destpage fix which has +already been included upstream. + +> Applied. --[[Joey]] + +>> Thanks. + +The second patch, ‘inline: improve feed title and description +management’, aligns feed title and description management by introducing +a `title` option to complement `description`, and by basing the +description on the page description if the entry is missing. If no +description is provided by either the directive parameter or the page +metadata, we use a user-configurable default based on both the page +title and wiki name rather than hard-coding the wiki name as description. + +> Reviewing, this seems ok, but I don't like that +> `feed_desc_fmt` is "safe => 0". And I question if that needs +> to be configurable at all. I say, drop that configurable, and +> only use the page meta description (or wikiname for index). +> +> Oh, and could you indent your `elsif` the same as I? --[[Joey]] + +>> I hadn't even realized that I was nesting ifs inside else clauses, +>> sorry. I think you're also right about the safety of the key, after +>> all it only gets interpolated with known, safe strings. + +>>> I did not mean to imply that I thought it safe. --[[Joey]] + +>>>> Sorry for assuming you implied that. I do think it is safe, though +>>>> (I defaulted to not safe just to err on the safe side). + +>> The question is what to do for pages that do not have a description +>> (and are not the index). With your proposal, the Atom feed subtitle +>> would turn up empty. We could make it conditional in the default +>> template, or we could have `$desc` default to `$title` if nothing +>> else is provided, but at this point I see no reason to _not_ allow +>> the user to choose a way to build a default description. + +>>> RSS requires the `<description>` element be present, it can't +>>> be conditionalized away. But I see no reason to add the complexity +>>> of an option to configure a default value for a field that +>>> few RSS consumers likely even use. That's about 3 levels below useful. +>>> --[[Joey]] + +>>>> The way I see it, there are three possibilities for non-index pages +>>>> which have no description meta: (1) we leave the +>>>> description/subtitle in feed blank, per your current proposal here +>>>> (2) we hard-code some string to put there and (3) we make the +>>>> string to put there configurable. 
Honestly, I think option #1 sucks +>>>> aesthetically and option #2 is conceptually wrong (I'm against +>>>> hard-coding stuff in general), which leaves option #3: however +>>>> rarely used it would be, I still think it'd be better than #2 and +>>>> less unaesthetical than #1. + +>>>> I'm also not sure what's ‘complex’ about having such an option: +>>>> it's definitely not going to get much use, but does it hurt to have +>>>> it? I could understand not wasting time putting it in, but since +>>>> the code is written already … (but then again I'm known for being a +>>>> guy who loves options). + +The third patch, ‘inline: allow assigning an id to postform/feedlink’, +does just that. I don't currently use it, but it can be particularly +useful in the postform case for example for scriptable management of +multiple postforms in the same page. + +> Applied. --[[Joey]] + +>> Thanks. + +In one of my wiki setups I had a terminating '/' in `$config{url}`. You +mention that it should not be present, but I have not seen this +requirement described anywhere. Rather than restricting the user input, +I propose a patch that prevents double slashes from appearing in links +created by `urlto()` by fixing the routine itself. + +> If this is fixed I would rather not put the overhead of fixing it in +> every call to `urlto`. And I'm not sure this is a comprehensive +> fix to every problem a trailing slash in the url could cause. --[[Joey]] + +>> Maybe something that sanitizes the config value would be better instead? +>> What is the policy about automatic changing user config? + +>>> It's impossible to do for perl-format setup files. --[[Joey]] + +>>>> Ok. In that case I think that we should document that it must be +>>>> slash-less. I'll cook up a patch in that sense. + +The inline plugin is also updated (in a separate patch) to use `urlto()` +rather than hand-coding the feed urls. You might want to keep this +change even if you discard the urlto patch. + +> IIRC, I was missing a proof that this always resulted in identical urls, +> which is necessary to prevent flooding. I need such a proof before I can +> apply that. --[[Joey]] + +>> Well, the URL would obviously change if the `$config{url}` ended in +>> slash and the `urlto` patch (or other equivalent) went into effect. + +>> Aside from that, if I read the code correctly, the only other extra +>> thing that `urlto` does is to `beautify_url_path` the `"/".$to` part, +>> and the only way this would cause the url to be altered is if the +>> feed name was "index" (which can easily happen) and +>> `$config{htmlext}` was set to something like `.rss` or +>> `.rss.1`. + +>> So there is a remote possibility that a different URL would be +>> produced. diff --git a/doc/todo/finer_control_over___60__object___47____62__s.mdwn b/doc/todo/finer_control_over___60__object___47____62__s.mdwn new file mode 100644 index 000000000..50c4d43bf --- /dev/null +++ b/doc/todo/finer_control_over___60__object___47____62__s.mdwn @@ -0,0 +1,98 @@ +IIUC, the current version of [HTML::Scrubber][] allows for the `object` tags to be either enabled or disabled entirely. 
However, while `object` can be used to add *code* (which is indeed a potential security hole) to a document, reading [Objects, Images, and Applets in HTML documents][objects-html] reveals that the “dangerous” are not all the `object`s, but rather those having the following attributes: + + classid %URI; #IMPLIED -- identifies an implementation -- + codebase %URI; #IMPLIED -- base URI for classid, data, archive-- + codetype %ContentType; #IMPLIED -- content type for code -- + archive CDATA #IMPLIED -- space-separated list of URIs -- + +It seems that the following attributes are, OTOH, safe: + + declare (declare) #IMPLIED -- declare but don't instantiate flag -- + data %URI; #IMPLIED -- reference to object's data -- + type %ContentType; #IMPLIED -- content type for data -- + standby %Text; #IMPLIED -- message to show while loading -- + height %Length; #IMPLIED -- override height -- + width %Length; #IMPLIED -- override width -- + usemap %URI; #IMPLIED -- use client-side image map -- + name CDATA #IMPLIED -- submit as part of form -- + tabindex NUMBER #IMPLIED -- position in tabbing order -- + +Should the former attributes be *scrubbed* while the latter left intact, the use of the `object` tag would seemingly become safe. + +Note also that allowing `object` (either restricted in such a way or not) automatically solves the [[/todo/svg]] issue. + +For Ikiwiki, it may be nice to be able to restrict [URI's][URI] (as required by the `data` and `usemap` attributes) to, say, relative and `data:` (as per [RFC 2397][]) ones as well, though it requires some more consideration. + +— [[Ivan_Shmakov]], 2010-03-12Z. + +[[wishlist]] + +> SVG can contain embedded javascript. + +>> Indeed. + +>> So, a more general tool (`XML::Scrubber`?) will be necessary to +>> refine both [XHTML][] and SVG. + +>> … And to leave [MathML][] as is (?.) + +>> — [[Ivan_Shmakov]], 2010-03-12Z. + +> The spec that you link to contains +> examples of objects that contain python scripts, Microsoft OLE +> objects, and Java. And then there's flash. I don't think ikiwiki can +> assume all the possibilities are handled securely, particularly WRT XSS +> attacks. +> --[[Joey]] + +>> I've scanned over all the `object` examples in the specification and +>> all of those that hold references to code (as opposed to data) have a +>> distinguishing `classid` attribute. + +>> While I won't assert that it's impossible to reference code with +>> `data` (and, thanks to `text/xhtml+xml` and `image/svg+xml`, it is +>> *not* impossible), throwing away any of the “insecure” +>> attributes listed above together with limiting the possible URI's +>> (i. e., only *local* and certain `data:` ones for `data` and +>> `usemap`) should make `object` almost as harmless as, say, `img`. + +>>> But with local data, one could not embed youtube videos, which surely +>>> is the most obvious use case? + +>>>> Allowing a “remote” object to render on one's page is a + security issue by itself. + Though, of course, having an explicit whitelist of URI's may make + this issue more tolerable. + — [[Ivan_Shmakov]], 2010-03-12Z. + +>>> Note that youtube embedding uses an +>>> object element with no classid. The swf file is provided via an +>>> enclosed param element. --[[Joey]] + +>>>> I've just checked a random video on YouTube and I see that the + `.swf` file is provided via an enclosed `embed` element. Whether + to allow those or not is a different issue. + — [[Ivan_Shmakov]], 2010-03-12Z. 
+ +>> (Though it certainly won't solve the [[SVG_problem|/todo/SVG]] being +>> restricted in such a way.) + +>> Of the remaining issues I could only think of recursive +>> `object` — the one that references its container document. + +>> — [[Ivan_Shmakov]], 2010-03-12Z. + +## See also + +* [Objects, Images, and Applets in HTML documents][objects-html] +* [[plugins/htmlscrubber|/plugins/htmlscrubber]] +* [[todo/svg|/todo/svg]] +* [RFC 2397: The “data” URL scheme. L. Masinter. August 1998.][RFC 2397] +* [Uniform Resource Identifier — the free encyclopedia][URI] + +[HTML::Scrubber]: http://search.cpan.org/~podmaster/HTML-Scrubber-0.08/Scrubber.pm +[MathML]: http://en.wikipedia.org/wiki/MathML +[objects-html]: http://www.w3.org/TR/1999/REC-html401-19991224/struct/objects.html +[RFC 2397]: http://tools.ietf.org/html/rfc2397 +[URI]: http://en.wikipedia.org/wiki/Uniform_Resource_Identifier +[XHTML]: http://en.wikipedia.org/wiki/XHTML diff --git a/doc/todo/generic_insert_links.mdwn b/doc/todo/generic_insert_links.mdwn new file mode 100644 index 000000000..050f32ee7 --- /dev/null +++ b/doc/todo/generic_insert_links.mdwn @@ -0,0 +1,24 @@ +The attachment plugin's Insert Links button currently only knows +how to insert plain wikilinks and img directives (for images). + +[[wishlist]]: Generalize this, so a plugin can cause arbitrary text +to be inserted for a particular file. --[[Joey]] + +Design: + +Add an insertlinks hook. Each plugin using the hook would be called, +and passed the filename of the attachment. If it knows how to handle +the file type, it returns the text that should be inserted on the page. +If not, it returns undef, and the next plugin is tried. + +This would mean writing plugins in order to handle links for +special kinds of attachments. To avoid that for simple stuff, +a fallback plugin could run last and look for a template +named like `templates/embed_$extension`, and insert a directive like: + + \[[!template id=embed_vp8 file=my_movie.vp8]] + +Then to handle a new file type, a user could just make a template +that expands to some relevant html. In the example above, +`templates/embed_vp8` could make an html5 video tag, possibly with some +flash fallback code even. diff --git a/doc/todo/git_attribution/discussion.mdwn b/doc/todo/git_attribution/discussion.mdwn index dfb490bc2..6905d9b4b 100644 --- a/doc/todo/git_attribution/discussion.mdwn +++ b/doc/todo/git_attribution/discussion.mdwn @@ -72,7 +72,7 @@ no determination of uniqueness) > GIT_AUTHOR_EMAIL can also be set. > > There is one thing yet to be solved, and that is how to tell the -> difference between a web commit by 'Joey Hess <joey@kitenet.net>', +> difference between a web commit by 'Joey Hess <joey\@kitenet.net>', > and a git commit by the same. I think we do want to differentiate these, > and the best way to do it seems to be to add a line to the end of the > commit message. Something like: "\n\nWeb-commit: true" @@ -94,5 +94,5 @@ no determination of uniqueness) > * github pushes to twitter ;-) > > So while I tried that way at first, I'm now leaning toward encoding the -> username in the email address. Like "user <user@web>", or -> "joey <http://joey.kitenet.net/@web>". +> username in the email address. Like "user <user\@web>", or +> "joey <http://joey.kitenet.net/\@web>".
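As a minimal illustration of the "encode the username in the email address" idea above (purely a sketch, not ikiwiki's actual code; the helper name is made up), a web commit could set the git author from the wiki username like this:

    # Hypothetical helper: mark a web edit as coming from a wiki user by
    # encoding the username in the author email, e.g. "joe <joe@web>".
    sub web_commit_author_env {
        my $username = shift;
        $ENV{GIT_AUTHOR_NAME}  = $username;
        $ENV{GIT_AUTHOR_EMAIL} = $username . '@web';
        # git commit then records: Author: joe <joe@web>
    }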
diff --git a/doc/todo/headless_git_branches.mdwn b/doc/todo/headless_git_branches.mdwn new file mode 100644 index 000000000..1dd867765 --- /dev/null +++ b/doc/todo/headless_git_branches.mdwn @@ -0,0 +1,74 @@ +Ikiwiki should really survive being asked to work with a git branch that has no existing commits. + + mkdir iki-gittest + cd iki-gittest + GIT_DIR=barerepo.git git init + git clone barerepo.git srcdir + ikiwiki --rcs=git srcdir destdir + +I've fixed this initial construction case, and, based on my testing, I've also fixed the post-update executing on a new master, and ikiwiki.cgi executing on a non-existent master cases. + +Please commit so my users stop whining at me about having clean branches to push to, the big babies. + +Summary: Change three scary loud failure cases related to empty branches into three mostly quiet success cases. + +[[!tag patch]] + +<pre> +diff --git a/IkiWiki/Plugin/git.pm b/IkiWiki/Plugin/git.pm +index cf7fbe9..e5bafcf 100644 +--- a/IkiWiki/Plugin/git.pm ++++ b/IkiWiki/Plugin/git.pm +@@ -439,17 +439,21 @@ sub git_commit_info ($;$) { + + my @opts; + push @opts, "--max-count=$num" if defined $num; +- +- my @raw_lines = run_or_die('git', 'log', @opts, +- '--pretty=raw', '--raw', '--abbrev=40', '--always', '-c', +- '-r', $sha1, '--', '.'); +- ++ my @raw_lines; + my @ci; +- while (my $parsed = parse_diff_tree(\@raw_lines)) { +- push @ci, $parsed; +- } ++ ++ # Test to see if branch actually exists yet. ++ if (run_or_non('git', 'show-ref', '--quiet', '--verify', '--', 'refs/heads/' . $config{gitmaster_branch}) ) { ++ @raw_lines = run_or_die('git', 'log', @opts, ++ '--pretty=raw', '--raw', '--abbrev=40', '--always', '-c', ++ '-r', $sha1, '--', '.'); ++ ++ while (my $parsed = parse_diff_tree(\@raw_lines)) { ++ push @ci, $parsed; ++ } + +- warn "Cannot parse commit info for '$sha1' commit" if !@ci; ++ warn "Cannot parse commit info for '$sha1' commit" if !@ci; ++ }; + + return wantarray ? @ci : $ci[0]; + } +@@ -474,7 +478,10 @@ sub rcs_update () { + # Update working directory. + + if (length $config{gitorigin_branch}) { +- run_or_cry('git', 'pull', '--prune', $config{gitorigin_branch}); ++ run_or_cry('git', 'fetch', '--prune', $config{gitorigin_branch}); ++ if (run_or_non('git', 'show-ref', '--quiet', '--verify', '--', 'refs/remotes/' . $config{gitorigin_branch} . '/' . $config{gitmaster_branch}) ) { ++ run_or_cry('git', 'merge', $config{gitorigin_branch} . '/' . $config{gitmaster_branch}); ++ } + } + } + +@@ -559,7 +566,7 @@ sub rcs_commit_helper (@) { + # So we should ignore its exit status (hence run_or_non). + if (run_or_non('git', 'commit', '-m', $params{message}, '-q', @opts)) { + if (length $config{gitorigin_branch}) { +- run_or_cry('git', 'push', $config{gitorigin_branch}); ++ run_or_cry('git', 'push', $config{gitorigin_branch}, $config{gitmaster_branch}); + } + } + +</pre> diff --git a/doc/todo/html.mdwn b/doc/todo/html.mdwn index 44f20c876..4f4542be2 100644 --- a/doc/todo/html.mdwn +++ b/doc/todo/html.mdwn @@ -1,6 +1,6 @@ Create some nice(r) stylesheets. Should be doable w/o touching a single line of code, just -editing the [[wikitemplates]] and/or editing [[style.css]]. +editing the [[templates]] and/or editing [[style.css]]. [[done]] ([[css_market]] ..) 
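Going back to the headless-branches patch above: the heart of the change is the branch-existence test done with `git show-ref`. As a standalone sketch (the helper name is hypothetical), it amounts to:

    # True if the given local branch already has at least one commit,
    # i.e. refs/heads/$branch exists.
    sub branch_exists {
        my $branch = shift;
        return system('git', 'show-ref', '--quiet', '--verify', '--',
            "refs/heads/$branch") == 0;
    }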
diff --git a/doc/todo/htpasswd_mirror_of_the_userdb.mdwn b/doc/todo/htpasswd_mirror_of_the_userdb.mdwn new file mode 100644 index 000000000..e4a411780 --- /dev/null +++ b/doc/todo/htpasswd_mirror_of_the_userdb.mdwn @@ -0,0 +1,29 @@ +[[!tag wishlist]] + +Ikiwiki is static, so access control for viewing the wiki must be +implemented on the web server side. To manage wiki users and access +together, we can currently + +* use [[httpauth|plugins/httpauth/]], but some [[passwordauth|plugins/passwordauth]] functionality [[is missing|todo/httpauth_feature_parity_with_passwordauth/]]; +* use [[passwordauth|plugins/passwordauth]] plus [[an Apache `mod_perl` authentication mechanism|plugins/passwordauth/discussion/]], but this is Apache-centric and enabling `mod_perl` just for auth seems overkill. + +Moreover, when ikiwiki is just a part of a wider web project, we may want +to use the same userdb for the other parts of this project. + +I think an ikiwiki plugin which would (re)generate an htpasswd version of +the user/passwd base (better, two htpasswd files, one with only the wiki +admins and one with everyone) each time a user is added or modified would +solve this problem: + +* access control can be managed from the web server +* user management is handled by the passwordauth plugin +* htpasswd format is understood by various servers (Apache, lighttpd, nginx, ...) and languages commonly used for web development (perl, python, ruby) +* htpasswd files can be mirrored on other machines when the web site is distributed + +-- [[nil]] + +> I think this is a good idea. Although unless the password hashes that +> are stored in the userdb are compatible with htpasswd hashes, +> the htpasswd hashes will need to be stored in the userdb too. Then +> any userdb change can just regenerate the htpasswd file, dumping out +> the right kind of hashes. --[[Joey]] diff --git a/doc/todo/http_bl_support.mdwn b/doc/todo/http_bl_support.mdwn new file mode 100644 index 000000000..f7a46ee6c --- /dev/null +++ b/doc/todo/http_bl_support.mdwn @@ -0,0 +1,67 @@ +[Project Honeypot](http://projecthoneypot.org/) has an HTTP:BL API available to subscribed (it's free; they accept donations) people/orgs. There's a basic perl package someone wrote; I'm including a copy here. + +[from here](http://projecthoneypot.org/board/read.php?f=10&i=112&t=112) + +> The [[plugins/blogspam]] service already checks urls against +> the surbl, and has its own IP blacklist. The best way to +> support the HTTP:BL may be to add a plugin +> [there](http://blogspam.repository.steve.org.uk/file/cc858e497cae/server/plugins/).
+> --[[Joey]] + +<pre> +package Honeypot; + +use Socket qw/inet_ntoa/; + +my $dns = 'dnsbl.httpbl.org'; +my %types = ( +0 => 'Search Engine', +1 => 'Suspicious', +2 => 'Harvester', +4 => 'Comment Spammer' +); +sub query { +my $key = shift || die 'You need a key for this, you get one at http://www.projecthoneypot.org'; +my $ip = shift || do { +warn 'no IP for request in Honeypot::query().'; +return; +}; + +my @parts = reverse split /\./, $ip; +my $lookup_name = join'.', $key, @parts, $dns; + +my $answer = gethostbyname ($lookup_name); +return unless $answer; +$answer = inet_ntoa($answer); +my(undef, $days, $threat, $type) = split /\./, $answer; +my @types; +while(my($bit, $typename) = each %types) { +push @types, $typename if $bit & $type; +} +return { +days => $days, +threat => $threat, +type => join ',', @types +}; + +} +1; +</pre> + +From the page: + +> The usage is simple: + +> use Honeypot; +> my $key = 'XXXXXXX'; # your key +> my $ip = '....'; the IP you want to check +> my $q = Honeypot::query($key, $ip); + +> use Data::Dumper; +> print Dumper $q; + +Any chance of having this as a plugin? + +I could give it a go, too. Would be fun to try my hand at Perl. --[[simonraven]] + +[[!tag wishlist]] diff --git a/doc/todo/inline_raw_files.mdwn b/doc/todo/inline_raw_files.mdwn new file mode 100644 index 000000000..8228186f9 --- /dev/null +++ b/doc/todo/inline_raw_files.mdwn @@ -0,0 +1,115 @@ +[[!template id=gitbranch branch=wtk/raw_inline author="[[wtk]]"]] + +summary +======= + +Extend inlining to handle raw files (files with unrecognized extensions). + +Also raise an error in `IkiWiki::pagetype($file)` if `$file` is blank, which avoids trying to do much with missing files, etc. + +I'm using the new code in my [blog][]. + +[blog]: http://www.physics.drexel.edu/~wking/unfolding-disasters/posts/yacc2dot/ + +usage +===== + + \[[!inline pagenames="somefile.txt" template="raw" feeds="no"]] + + +> But inline already supports raw files in two ways: +> +> * setting raw=yes will cause a page to be inlined raw without +> using any template, as if it were part of the page at the location +> of the inline +> * otherwise, the file becomes an enclosure in the rss feed, for use with +> podcasting. +> +> So I don't see the point of your patch. Although since your text +> editor seems to like to make lots of whitespace changes, it's possible +> I missed something in the large quantity of noise introduced by it. +> --[[Joey]] + +>> As I understand it, setting `raw=yes` causes the page to be inlined +>> as if the page contents had appeared in place of the directive. The +>> content is then processed by whatever `htmlize()` applies to the +>> inlining page. I want the inlined page to be unprocessed, and +>> wrapped in `<pre><code>...</code></pre>` (as they are on the blog +>> post I link to above). +>> +>> Enclosures do not include the page contents at all, just a link to +>> them. I'm trying to inline the content so I can comment on it from +>> the inlining page. +>> +>> Apologies for my cluttered version history, I should have branched my +>> earlier changes off to make things clearer. I tried to isolate my +>> whitespace changes (fixes?) in c9ae012d245154c3374d155958fcb0b60fda57ce. +>> 157389355d01224b2d3c3f6e4c1eb42a20ec8a90 should hold all the content +>> changes. +>> +>> A list of other things globbed into my master branch that should have +>> been separate branches: +>> +>> * Make it easy to select a Markdown executable for mdwn.pm. 
+>> * Included an updated form of +>> [[Javier Rojas' linktoimgonly.pm|forum/link_to_an_image_inside_the_wiki_without_inlining_it]]. +>> * Included an updated form of +>> [Jason Blevins' mdwn_itex.pm](http://jblevins.org/git/ikiwiki/plugins.git/plain/mdwn_itex.pm). +>> * Assorted minor documentation changes. +>> +>> --[[wtk]] + +>>> I haven't heard anything in a while, so I've reorganized my version +>>> history and rebased it on the current ikiwiki head. Perhaps now it +>>> will be easier to merge or reject. Note the new branch name: +>>> `raw_inline`. I'll open separate todo items for items mentioned in my +>>> previous comment. --[[wtk]] + +---- + +Reviewing your patch the first thing I see is this: + +<pre> ++ if (! $file) { ++ error("Missing file."); ++ } +</pre> + +This fails if the filename is "0". Also, `pagetype()` +currently cannot fail; allowing it to crash the entire +wiki build if the filename is somehow undefined seems +unwise. + +I didn't look much further, because it seems to me what you're trying to do +can be better accomplished by using the highlight plugin. Assuming the raw +file you want to inline and comment on is some source-code-like thing, +which seems likely. + +Or, another way to do it would be to use the templates plugin, and make +a template there that puts an inline directive inside pre tags. + --[[Joey]] [[!tag reviewed]] + +---- + +If `pagetype()` cannot fail, then I suppose that check has to go ;). + +I was under the impression that [[plugins/highlight]] didn't support +inlining code. It looks like it supports highlighing stand-alone +files or embedded code. Perhaps I should extend it to support inlined +code instead of pushing this patch? + +> If you configure highlight to support standalone files, then you can +> inline the resulting pages and get nicely highlighted source code +> inlined into the page. --[[Joey]] + +The `raw.tmpl` included in the patch *does* include the inlined +content inside `pre` tags. The problem is that the current inline +code insists on running `htmlize()` on the content before inserting it +in the template. The heart of my patch is an altered +`get_inline_content()` that makes the `htmlize()` call dependent on a +`$read_raw` flag. If the flag is set, the raw (non-htmlized) content +is used instead. + +I just rebased my patches against the current Ikiwiki trunk (no major +changes) to make them easier to review. + --[[wtk]] diff --git a/doc/todo/latex.mdwn b/doc/todo/latex.mdwn index 4363003c1..fb273c1ab 100644 --- a/doc/todo/latex.mdwn +++ b/doc/todo/latex.mdwn @@ -9,6 +9,8 @@ of the ikiwiki [[/logo]]. > [[users/JasonBlevins]] has also a plugin for including [[LaTeX]] expressions (by means of `itex2MML`) -- [[plugins/mdwn_itex]] (look at his page for the link). --Ivan Z. +>> I've [[updated|mdwn_itex]] Jason's plugin for ikiwiki 3.x. --[[wtk]] + ---- ikiwiki could also support LaTeX as a document type, again rendering to HTML. @@ -227,5 +229,12 @@ Ah yes.. sorry forgot to update the plugin in my public_html folder %-). This wa > > --[[Joey]] +----- + +I'm using a [plugin](http://metameso.org/~joe/math/tex.pm) created by [Josef Urban](http://www.cs.ru.nl/~urban) that gets LaTeX into ikiwiki by using [LaTeXML](http://dlmf.nist.gov/LaTeXML). This could well be "the right way" to go (long term) but the plugin still does not render math expressions right, because ikiwiki is filtering out requisite header information. Examples (I recommend you use Firefox to view these!) 
are available [here](http://metameso.org/aa/math/) and [here](http://metameso.org/aa/simple/). Compare that last example to the [file generated by LaTeXML](http://metameso.org/~joe/math/math.xml). I posted the sources [here](http://metameso.org/aa/sources/) for easy perusal. How to get ikiwiki to use the original DOCTYPE and html fields? I could use some help getting this polished off. --[[jcorneli]] + +> update: it seems important to force the browser to think of the content as xml, e.g. [http://metameso.org/~joe/math/example.xml](http://metameso.org/~joe/math/example.xml) has the same source code as [http://metameso.org/~joe/math/example.html](http://metameso.org/~joe/math/example.html) and the former shows math working, but the latter doesn't. --[[jcorneli]] + + [[!tag soc]] [[!tag wishlist]] diff --git a/doc/todo/link_plugin_perhaps_too_general__63__.mdwn b/doc/todo/link_plugin_perhaps_too_general__63__.mdwn new file mode 100644 index 000000000..8a5fd50eb --- /dev/null +++ b/doc/todo/link_plugin_perhaps_too_general__63__.mdwn @@ -0,0 +1,25 @@ +[[!tag wishlist blue-sky]] +(This isn't important to me - I don't use MediaWiki or Creole syntax myself - +but just thinking out loud...) + +The [[ikiwiki/wikilink]] syntax IkiWiki uses sometimes conflicts with page +languages' syntax (notably, [[plugins/contrib/MediaWiki]] and [[plugins/Creole]] +want their wikilinks the other way round, like +`\[[plugins/write|how to write a plugin]]`). It would be nice if there was +some way for page language plugins to opt in/out of the normal wiki link +processing - then MediaWiki and Creole could have their own `linkify` hook +that was only active for *their* page types, and used the appropriate +syntax. + +In [[todo/matching_different_kinds_of_links]] I wondered about adding a +`\[[!typedlink to="foo" type="bar"]]` directive. This made me wonder whether +a core `\[[!link]]` directive would be useful; this could be a fallback for +page types where a normal wikilink can't be done for whatever reason, and +could also provide extension points more easily than WikiLinks' special +syntax with extra punctuation, which doesn't really scale? + +Straw-man: + + \[[!link to="ikiwiki/wikilink" desc="WikiLinks"]] + +--[[smcv]] diff --git a/doc/todo/mark_edit_as_trivial__44___identify__47__filter_on_trivial_changes.mdwn b/doc/todo/mark_edit_as_trivial__44___identify__47__filter_on_trivial_changes.mdwn new file mode 100644 index 000000000..2b2b0242e --- /dev/null +++ b/doc/todo/mark_edit_as_trivial__44___identify__47__filter_on_trivial_changes.mdwn @@ -0,0 +1,11 @@ +One feature of mediawiki which I quite like is the ability to mark a change as 'minor', or 'trivial'. This can then be used to filter the 'recentchanges' page, to only show substantial edits. + +The utility of this depends entirely on whether the editors use it properly. + +I currently use an inline on the front page of my personal homepage to show the most recent pages (by creation date) within a subsection of my site (a blog). Blog posts are rarely modified much after they are 'created' (or published - I bodge the creation time via meta when I publish a post. It might sit in draft form indefinitely), so this effectively shows only non-trivial changes. + +I would like to have a short list of the most recent modifications to the site on the front page. I therefore want to sort by modified time rather than creation time, but exclude edits that I self-identify as minor. 
I also only want to take a short number of items, the top 5, and display only their titles (which may be derived from filename, or set via meta again). + +I'm still thinking through how this might be achieved in an ikiwiki-suitable fashion, but I think I need a scheme to identify certain edits as trivial. This would have to work via web edits (easier: could add a check box to the edit form) and plain changes in the VCS (harder: scan for keywords in a commit message? in a VCS-agnostic fashion?) + +[[!tag wishlist]] diff --git a/doc/todo/matching_different_kinds_of_links.mdwn b/doc/todo/matching_different_kinds_of_links.mdwn index 26c5a072b..da3ea49f6 100644 --- a/doc/todo/matching_different_kinds_of_links.mdwn +++ b/doc/todo/matching_different_kinds_of_links.mdwn @@ -36,6 +36,11 @@ Besides pagespecs, the `rel=` attribute could be used for styles. --Ivan Z. > normal links.) Might be better to go ahead and add the variable to > core though. --[[Joey]] +>> I've implemented this with the data structure you suggested, except that +>> I called it `%typedlinks` instead of `%linktype` (it seemed to make more +>> sense that way). I also ported `tag` to it, and added a `tagged_is_strict` +>> config option. See below! --[[smcv]] + I saw somewhere else here some suggestions for the wiki-syntax for specifying the relation name of a link. One more suggestion---[the syntax used in Semantic MediaWiki](http://en.wikipedia.org/wiki/Semantic_MediaWiki#Basic_usage), like this: <pre> @@ -45,3 +50,147 @@ I saw somewhere else here some suggestions for the wiki-syntax for specifying th So a part of the effect of [[`\[[!taglink TAG\]\]`|plugins/tag]] could be represented as something like `\[[tag::TAG]]` or (more understandable relation name in what concerns the direction) `\[[tagged::TAG]]`. I don't have any opinion on this syntax (whether it's good or not)...--Ivan Z. + +------- + +>> [[!template id=gitbranch author="[[Simon_McVittie|smcv]]" branch=smcv/ready/link-types]] +>> [[!tag patch]] + +## Documentation for smcv's branch + +### added to [[ikiwiki/pagespec]] + +* "`typedlink(type glob)`" - matches pages that link to a given page (or glob) + with a given link type. Plugins can create links with a specific type: + for instance, the tag plugin creates links of type `tag`. + +### added to [[plugins/tag]] + +If the `tagged_is_strict` config option is set, `tagged()` will only match +tags explicitly set with [[ikiwiki/directive/tag]] or +[[ikiwiki/directive/taglink]]; if not (the default), it will also match +any other [[WikiLinks|ikiwiki/WikiLink]] to the tag page. + +### added to [[plugins/write]] + +#### `%typedlinks` + +The `%typedlinks` hash records links of specific types. Do not modify this +hash directly; call `add_link()`. The keys are page names, and the values +are hash references. In each page's hash reference, the keys are link types +defined by plugins, and the values are hash references with link targets +as keys, and 1 as a dummy value, something like this: + + $typedlinks{"foo"} = { + tag => { short_word => 1, metasyntactic_variable => 1 }, + next_page => { bar => 1 }, + }; + +Ordinary [[WikiLinks|ikiwiki/WikiLink]] appear in `%links`, but not in +`%typedlinks`. + +#### `add_link($$;$)` + + This adds a link to `%links`, ensuring that duplicate links are not + added. Pass it the page that contains the link, and the link text. + +An optional third parameter sets the link type (`undef` produces an ordinary +[[ikiwiki/WikiLink]]). 
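For illustration, here is a minimal sketch (hypothetical plugin code, using only the API documented above) of recording a typed link from a scan hook and later checking for it in `%typedlinks`:

    # In a plugin's scan hook: record that the scanned page "tags" a
    # (hypothetical) page topics/perl.
    sub scan {
        my %params = @_;
        IkiWiki::add_link($params{page}, "topics/perl", "tag");
    }

    # Read-only check against the documented %typedlinks structure.
    sub page_has_tag {
        my ($page, $tag) = @_;
        my $types = $IkiWiki::typedlinks{$page};
        return $types && $types->{tag} && exists $types->{tag}{$tag};
    }

A real pagespec match function built on this would return `IkiWiki::SuccessReason`/`IkiWiki::FailReason` objects rather than a plain boolean, as other `match_*` functions do.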
+ +## Review + +Some code refers to `oldtypedlinks`, and other code to `oldlinktypes`. --[[Joey]] + +> Oops, I'll fix that. That must mean missing test coverage, too :-( +> --s + +>> A test suite for the dependency resolver *would* be nice. --[[Joey]] + +>>> Bug fixed, I think. A test suite for the dependency resolver seems +>>> more ambitious than I want to get into right now, but I added a +>>> unit test for this part of it... --s + +I'm curious what your reasoning was for adding a new variable +rather than using `pagestate`. Was it only because you needed +the `old` version to detect change, or was there other complexity? +--J + +> You seemed to be more in favour of adding it to the core in +> your proposal above, so I assumed that'd be more likely to be +> accepted :-) I don't mind one way or the other - `%typedlinks` +> costs one core variable, but saves one level of hash nesting. If +> you're not sure either, then I think the decision should come down +> to which one is easier to document clearly - I'm still unhappy with +> my docs for `%typedlinks`, so I'll try to write docs for it as +> `pagestate` and see if they work any better. --s + +>> On reflection, I don't think it's any better as a pagestate, and +>> the contents of pagestates (so far) aren't documented for other +>> plugins' consumption, so I'm inclined to leave it as-is, unless +>> you want to veto that. Loose rationale: it needs special handling +>> in the core to be a dependency type (I re-used the existing link +>> type), it's API beyond a single plugin, and it's really part of +>> the core parallel to pagestate rather than being tied to a +>> specific plugin. Also, I'd need to special-case it to have +>> ikiwiki not delete it from the index, unless I introduced a +>> dummy typedlinks plugin (or just hook) that did nothing... --s + +I have not convinced myself this is a real problem, but.. +If a page has a typed link, there seems to be no way to tell +if it also has a separate, regular link. `add_link` will add +to `@links` when adding a typed, or untyped link. If only untyped +links were recorded there, one could tell the difference. But then +typed links would not show up at all in eg, a linkmap, +unless it was changed to check for typed links too. +(Or, regular links could be recorded in typedlinks too, +with an empty type. (Bloaty.)) --J + +> I think I like the semantics as-is - I can't think of any +> reason why you'd want to ask the question "does A link to B, +> not counting tags and other typed links?". A typed link is +> still a link, in my mind at least. --s + +>> Me neither, let's not worry about it. --[[Joey]] + +I suspect we could get away without having `tagged_is_strict` +without too much transitional trouble. --[[Joey]] + +> If you think so, I can delete about 5 LoC. I don't particularly +> care either way; [[Jon]] expressed concern about people relying +> on the current semantics, on one of the pages requesting this +> change. --s + +>> Removed in a newer version of the branch. --s + +I might have been wrong to introduce `typedlink(tag foo)`. It's not +very user-friendly, and is more useful as a backend for other plugins +than as a feature in its own right - any plugin introducing a link +type will probably also want to have its own preprocessor directive +to set that link type, and its own pagespec function to match it. +I wonder whether to make a `typedlink` plugin that has the typedlink +pagespec match function and a new `\[[!typedlink to="foo" type="bar"]]` +though...
--[[smcv]] + +> I agree, per-type matchers are more friendly and I'm not enamored of the +> multi-parameter pagespec syntax. --[[Joey]] + +>> Removed in a newer version of the branch. I re-introduced it as a +>> plugin in `smcv/typedlink`, but I don't think we really need it. --s + +---- + +I am ready to merge this, but I noticed one problem -- since `match_tagged` +now only matches pages with the tag linktype, a wiki will need to be +rebuilt on upgrade in order to get the linktype of existing tags in it +recorded. So there needs to be a NEWS item about this and +the postinst modified to force the rebuild. + +> Done, although you'll need to plug in an appropriate version number when +> you release it. Is there a distinctive reminder string you grep for +> during releases? I've used `UNRELEASED` for now. --[[smcv]] + +Also, the ready branch adds `typedlink()` to [[ikiwiki/pagespec]], +but you removed that feature as documented above. +--[[Joey]] + +> [[Done]]. --s diff --git a/doc/todo/mdwn_itex.mdwn b/doc/todo/mdwn_itex.mdwn new file mode 100644 index 000000000..3e304fa76 --- /dev/null +++ b/doc/todo/mdwn_itex.mdwn @@ -0,0 +1,22 @@ +[[!template id=gitbranch branch=wtk/mdwn_itex author="[[wtk]]"]] + +summary +======= + +Extend the [[plugins/mdwn]] plugin to support [itex][] using Jacques +Distler's [itex2MML][]. + +notes +===== + +This is an updated form of [[users/JasonBlevins]]' plugin. You can +see the plugin [in action][example] on my blog. The blog post lists a +few additional changes you may need to make to use the plugin, +including changing your page template to a MathML-friendly doctype and +disabling plugins like [[plugins/htmlscrubber]] and +[[plugins/htmltidy]] which would otherwise strip out the generated +MathML. + +[itex]: http://golem.ph.utexas.edu/~distler/blog/itex2MMLcommands.html +[itex2MML]: http://golem.ph.utexas.edu/~distler/blog/itex2MML.html +[example]: http://www.physics.drexel.edu/~wking/unfolding-disasters/posts/mdwn_itex/ diff --git a/doc/todo/mercurial.mdwn b/doc/todo/mercurial.mdwn index e71c8106a..de1f148e5 100644 --- a/doc/todo/mercurial.mdwn +++ b/doc/todo/mercurial.mdwn @@ -119,3 +119,11 @@ I have a few notes on mercurial usage after trying it out for a while: >> I think the ideal solution would be to build `$destdir/recentchanges/*` directly from the output of `hg log`. --[[buo]] >>>> That would be 100 times as slow, so I chose not to do that. --[[Joey]] + +>>>> Since this is confusing people, allow me to clarify: Ikiwiki's +>>>> recentchanges generation pulls log information directly out of the VCS as +>>>> needed. It caches it in recentchanges/* in the `srcdir`. These cache +>>>> files need not be preserved, should never be checked into VCS, and if +>>>> you want to, you can configure your VCSignore file to ignore them, +>>>> just as you can configure it to ignore the `.ikiwiki` directory in the +>>>> `srcdir`. --[[Joey]] diff --git a/doc/todo/mirrorlist_with_per-mirror_usedirs_settings.mdwn b/doc/todo/mirrorlist_with_per-mirror_usedirs_settings.mdwn index 6ca9962ba..baad063ef 100644 --- a/doc/todo/mirrorlist_with_per-mirror_usedirs_settings.mdwn +++ b/doc/todo/mirrorlist_with_per-mirror_usedirs_settings.mdwn @@ -21,4 +21,76 @@ and decided this time it was really needed to implement this feature. --[[intrigeri]] + +> Ping. --[[intrigeri]] + [[!tag patch]] + +>> (I'm not an ikiwiki committer, opinions may vary.) +>> +>>> In my opinion, you're an ikiwiki committer!
--[[Joey]] +>> +>> This would be easier to review if there weren't a million merges from +>> master; perhaps either leave a branch as-is, or rebase it, or merge +>> only at "significant" times like after a release? +>> +>> I believe Joey's main objection to complex $config entries is that +>> it's not at all clear what [[plugins/websetup]] would do with them. +>> Would something like this make a reasonable alternative? +>> +>> $config{mirrorlist} = ["nousedirs|file:///home/intrigeri/wiki", +>> "usedirs|http://example.com/wiki", "http://example.net"]; +>> +>> From how I understand tainting, this: +>> +>> $untainted{$_} = possibly_foolish_untaint($tainted->{$_}) +>> +>> probably needs to untaint the key too: +>> +>> my $key = possibly_foolish_untaint($_); +>> $untainted{$key} = possibly_foolish_untaint($tainted->{key}); +>> +>> --[[smcv]] + +>>> You are fully right about the complex `$config` entries. I'll +>>> convert this to use what you are suggesting, i.e. what we ended up +>>> choosing for the `po_slave_languages` setting. +>>> +>>> About the merges in this branch: Joey told me once he did not care +>>> about this; moreover the `--no-merges` git log option makes it +>>> easy to filter these out. I'll try merging tagged releases only in +>>> the future, though. +>>> +>>> --[[intrigeri]] + +>>>> FWIW, I don't care about merge commits etc because I review +>>>> `git diff ...intrigeri/mirrorlist` -- and if I want to dig deeper +>>>> into the why of some code, I'll probably checkout the branch and +>>>> use git blame. +>>>> +>>>> I agree with what smcv said, my other concern though is that +>>>> this is such an edge case, that supporting it just adds clutter. +>>>> Have to wonder if it wouldn't perhaps be better to do something +>>>> using the goto plugin and cgiurl, so that the mirror doesn't have +>>>> to know about the configuration of the other mirror. --[[Joey]] + +>>>>> I have implemented something using the cgi + goto in my (history +>>>>> rewrite warning) mirrorlist branch. Please review, please pull. +>>>>> --[[intrigeri]] + +>>>>>> Ping? I've merged 3.20110321 in my `mirrorlist` branch and +>>>>>> checked it still works properly. --[[intrigeri]] + +>>>>> concerning goto/cgiurl, what about having that as the default in +>>>>> mirrorlist, but keeping ``nousedirs|file:///home/intrigeri/wiki`` and +>>>>> ``usedirs|http://example.com/wiki`` valid for cgi-less cases? +>>>>> that would keep typical installation with a clutter-less configuration, +>>>>> and support more individual setups too. +>>>>> --[[chrysn]] + +>>>>>> I would not mind. On the other hand Joey was concerned about +>>>>>> cluttering the code to support edge cases, which I fully +>>>>>> understand. The case you (chrysn) are describing being even +>>>>>> more specific than the one I was initially talking of, I think +>>>>>> this should not block the merge of the branch I have been +>>>>>> proposing. Support for the usecase you are suggesting can +>>>>>> always be added later if needed. --[[intrigeri]] diff --git a/doc/todo/more_flexible_inline_postform.mdwn b/doc/todo/more_flexible_inline_postform.mdwn index bc8bc0809..414476bd7 100644 --- a/doc/todo/more_flexible_inline_postform.mdwn +++ b/doc/todo/more_flexible_inline_postform.mdwn @@ -16,3 +16,8 @@ logical first step towards doing comment-like things with inlined pages). > Perhaps what we need is a `postform` plugin/directive that inline depends > on (automatically enables); its preprocess method could automatically be > invoked from preprocess_inline when needed. 
--[[smcv]] + +>> I've been looking at this stuff again. I think you are right, this would +>> be the right approach. The comments plugin could use it similarly, allowing +>> sites which desire it to have an inline comment submission form on all +>> pages with comments enabled. I'm going to take a look. -- [[Jon]] diff --git a/doc/todo/multiple_template_directories.mdwn b/doc/todo/multiple_template_directories.mdwn index c09a9595f..6a474b4f3 100644 --- a/doc/todo/multiple_template_directories.mdwn +++ b/doc/todo/multiple_template_directories.mdwn @@ -11,3 +11,63 @@ ought to do the trick. > global dir when it cannot find a template. For me, this is good enough. > And it is even documented in the man page. Sigh. I guess this could be > considered [[done]]. + +I have a use case for this, a site composed of blogs and wikis, templates divided in three categories : common, blog and wiki. The only solution I found is maintaining hard links, being able to have multiple template dirs would obviously be better. -- Changaco + +> [[plugins/underlay]] used to allow adding extra templatedirs, but Joey +> removed that functionality when he made templates search the wiki's +> own `templates` directory. +> +> You can get a 3-level hierarchy like this: +> +> * instance-specific overrides: $srcdir/templates +> * common to the entire site: a directory that is the value of all +> instances' `templatedir` parameters +> * common to every ikiwiki in the world: /usr/share/ikiwiki/templates +> (implicitly searched) +> +> (by "instance" I mean an instance of ikiwiki - a .setup file, basically.) +> +> For a more complex hierarchy you'd need the old [[plugins/underlay]] +> functionality, i.e. you'd need to (ask Joey to) revert the patch that +> removed it. For instance, if anyone has a hierarchy like this, then +> they need the old functionality back in order to split the template +> search path for the things marked `(???)`: +> +> every ikiwiki in the world (/usr/share/ikiwiki/templates) +> \--- your site (???) +> \--- your blogs (???) +> \--- travel blog ($srcdir/templates) +> \--- code blog ($srcdir/templates) +> \--- your wikis (???) +> \--- travel wiki ($srcdir/templates) +> \--- code wiki ($srcdir/templates) +> +> This looks pretty hypothetical to me, though... +> --[[smcv]] + +>> The reason I removed it is because the same functionality of having +>> multiple template directories is still present. Just put them in +>> the templates/ subdirectory of multiple underlay directories instead. +>> --[[Joey]] + +>>>Thanks, I didn't realize this was possible. Problem solved. -- Changaco + +>>>> We can consider this [[done]], then. 
For reference, the solution +>>>> to the hierarchy I mentioned above would be: +>>>> +>>>> all your sites have $your_underlay as an underlay +>>>> +>>>> the blogs and wikis all have $blog_underlay or $wiki_underlay +>>>> (as appropriate) as a higher priority underlay +>>>> +>>>> every ikiwiki in the world (/usr/share/ikiwiki/templates) +>>>> \--- your site ($your_underlay/templates, or templatedir) +>>>> \--- your blogs ($blog_underlay/templates) +>>>> \--- travel blog ($srcdir/templates) +>>>> \--- code blog ($srcdir/templates) +>>>> \--- your wikis ($wiki_underlay/templates) +>>>> \--- travel wiki ($srcdir/templates) +>>>> \--- code wiki ($srcdir/templates) +>>>> +>>>> --[[smcv]] diff --git a/doc/todo/multiple_templates.mdwn b/doc/todo/multiple_templates.mdwn index 72783c556..30fb8d6ee 100644 --- a/doc/todo/multiple_templates.mdwn +++ b/doc/todo/multiple_templates.mdwn @@ -1,4 +1,4 @@ -> Another useful feature might be to be able to choose a different [[template|wikitemplates]] +> Another useful feature might be to be able to choose a different [[template|templates]] > file for some pages; [[blog]] pages would use a template different from the > home page, even if both are managed in the same repository, etc. diff --git a/doc/todo/nested_preprocessor_directives.mdwn b/doc/todo/nested_preprocessor_directives.mdwn index b5080dc3c..4a2795e30 100644 --- a/doc/todo/nested_preprocessor_directives.mdwn +++ b/doc/todo/nested_preprocessor_directives.mdwn @@ -16,3 +16,50 @@ nesting, a new syntax would be needed. Maybe something xml-like? > """]] > > --[[JoshTriplett]] + +>> Yes, it's definitely possible to do something like that. I'm not 100% +>> sure if it can be done in perl regexp or needs a real recursive descent +>> parser though. +>> +>> In the meantime, this is an interesting approach: +>> <https://github.com/timo/ikiwiki/commit/410bbaf141036164f92009599ae12790b1530886> +>> (the link has since been fixed twice) +>> +>> \[[!directive text=<<FOO +>> ... +>> FOO]] +>> +>> Since that's implemented, I will probably just merge it, +>> once I satisfy myself it doesn't blow up in any edge cases. +>> (It also adds triple single quotes as a third, distinct type of quotes, +>> which feels a bit redundant given the here docs.) --[[Joey]] +>> +>> Hmm, that patch changes a `m///sgx` to a `m///msgx`. Meaning +>> that any '^' or '$' inside the regexp will change behavior from matching +>> the start/end of string to matching the start/end of individual lines +>> within the string. And there is one legacy '$' which must then +>> change behavior; the "delimiter to next param". +>> +>> So, I'm not sure what behavior that will cause, but I suspect it will +>> be a bug. Unless the `\s+|$` already stops matching at a newline within +>> the string like it's whitespace. That needs more analysis. +>> Update: seems it does, I'm fairly satisfied that is not a bug. +>> +>> Also, the patch seems incomplete, only patching the first regexp +>> but not the other two in the same function, which also are quoting-aware. --[[Joey]] +>> +>> Yes, I'm terribly sorry. I actually did edit the other two regexps, but +>> I apparently missed copying it over as well. Should have been doing this +>> in a git repo all along. Look at the new commit I put atop it that has +>> the rest as well: +>> (redacted: is now part of the commit linked to from above) +>> Also: I'm not sure any more why I added the m modifier.
It was very +>> late at night and I was getting a bit desperate (turned out, the next +>> morning, I put my extra regexes after the "unquoted value" one. heh.) +>> So, feel free to fix that. --Timo +>> +>> I've fixed the patch by rebasing, and fixed the link above. I'm still not +>> sure if the m modifier for the regex is still needed (apparently I +>> didn't put it in the other regexes. Not completely sure about the +>> implications.) Am now trying to wrap my head around a test case to +>> test the new formats for a bit. --Timo diff --git a/doc/todo/openid_user_filtering.mdwn b/doc/todo/openid_user_filtering.mdwn index 8b2d0082e..6a318c4c0 100644 --- a/doc/todo/openid_user_filtering.mdwn +++ b/doc/todo/openid_user_filtering.mdwn @@ -7,3 +7,7 @@ So I suggest an ikiwiki configuration like: users => ["*.webvm.net"], Would only allow edits from openIDs of that form. + +> This kind of thing can be [[done]] now: --[[Joey]] +> +> locked_pages => "* and !user(http://*.webvm.net/)" diff --git a/doc/todo/optional_underlaydir_prefix.mdwn b/doc/todo/optional_underlaydir_prefix.mdwn new file mode 100644 index 000000000..06900a904 --- /dev/null +++ b/doc/todo/optional_underlaydir_prefix.mdwn @@ -0,0 +1,46 @@ +For security reasons, symlinks are disabled in IkiWiki. That's fair enough, but that means that some problems, which one could otherwise solve by using a symlink, cannot be solved. The specific problem in this case is that all underlays are placed at the root of the wiki, when it could be more convenient to place some underlays in specific sub-directories. + +Use-case 1 (to keep things tidy): + +Currently IkiWiki has some javascript files in `underlays/javascript`; that directory is given as one of the underlay directories. Thus, all the javascript files appear in the root of the generated site. But it would be tidier if one could say "put the contents of *this* underlaydir under the `js` directory". + +> Of course, this could be accomplished, if we wanted to, by moving the +> files to `underlays/javascript/js`. --[[Joey]] + +Use-case 2 (a read-only external dir): + +Suppose I want to include a subset of `/usr/local/share/docs` on my wiki, say the docs about `foo`. But I want them to be under the `docs/foo` sub-directory on the generated site. Currently I can't do that. If I give `/usr/local/share/docs/foo` as an underlaydir, then the contents of that will be in the root of the site, rather than under `docs/foo`. And if I give `/usr/local/share/docs` as an underlaydir, then the contents of the `foo` dir will be under `foo`, but it will also include every other thing in `/usr/local/share/docs`. + +Since we can't use symlinks in an underlay dir to link to these directories, then perhaps one could give a specific underlay dir a specific prefix, which defines the sub-directory that the underlay should appear in. + +I'm not sure how this would be implemented, but I guess it could be configured something like this: + + prefixed_underlay => { + 'js' => '/usr/local/share/ikiwiki/javascript', + 'docs/foo' => '/usr/local/share/docs/foo', + } + +> So, let me review why symlinks are an issue. For normal, non-underlay +> pages, users who do not have filesystem access to the server may have +> commit access, and so could commit eg, a symlink to `/etc/passwd` (or +> to `/` !). The guards are there to prevent ikiwiki either exposing the +> symlink target's contents, or potentially overwriting it. +> +> Is this a concern for underlays?
Most of the time, certainly not; +> the underlay tends to be something only the site admin controls. +> Not all the security checks that are done on the srcdir are done +> on the underlays, either. Most checks done on files in the underlay +> are only done because the same code handles srcdir files. The one +> exception is the test that skips processing symlinks in the underlay dir. +> (But note that the underlay directory can be a symlink to elsewhere, +> which the srcdir, by default, cannot.) +> +> So, one way to approach this is to make ikiwiki follow directory symlinks +> inside the underlay directory. Just a matter of passing `follow => 1` to +> find. (This would still not allow individual files to be symlinks, because +> `readfile` does not allow reading symlinks. But I don't see much need +> for that.) --[[Joey]] + +>> If you think that enabling symlinks in underlay directories wouldn't be a security issue, then I'm all for it! That would be much simpler to implement, I'm sure. --[[KathrynAndersen]] + +[[!taglink wishlist]] diff --git a/doc/todo/org_mode.mdwn b/doc/todo/org_mode.mdwn new file mode 100644 index 000000000..3e9d95376 --- /dev/null +++ b/doc/todo/org_mode.mdwn @@ -0,0 +1,24 @@ +[[!template id=gitbranch branch=wtk/org author="[[wtk]]"]] + +summary +======= + +Add a plugin for handling files written in [org-mode][]. + +notes +===== + +This is an updated form of [Manoj Srivastava's plugin][MS]. You can +see the plugin [in action][example] on my blog. + +For reasons discussed in the [[reStructuredText plugin|plugins/rst]], +wikilinks and other ikiwiki markup that inserts raw HTML can cause +problems. Org-mode provides a [means for processing raw HTML][raw], +but Ikiwiki currently (as far as I know) lacks a method to escape +inserted HTML depending on which plugins will be used during the +[[htmlize phase|plugins/write#index11h3]]. + +[org-mode]: http://orgmode.org/ +[MS]: http://www.golden-gryphon.com/blog/manoj/blog/2008/06/08/Using_org-mode_with_Ikiwiki/ +[example]: http://www.physics.drexel.edu/~wking/unfolding-disasters/posts/Git/notes/ +[raw]: http://orgmode.org/manual/Quoting-HTML-tags.html diff --git a/doc/todo/pagespec_aliases.mdwn b/doc/todo/pagespec_aliases.mdwn new file mode 100644 index 000000000..2db53d545 --- /dev/null +++ b/doc/todo/pagespec_aliases.mdwn @@ -0,0 +1,93 @@ +[[!tag patch wishlist]]I quite often find myself repeating a boiler-plate +pagespec chunk, e.g. + + and !*.png and !*.jpg... + +it would be quite nice if I could conveniently bundle them together into a +pagespec "alias", and instead write + + and !image()...
+ +I wrote the following plugin to achieve this: + + commit f3a9dd113338fe5d2b717de1dc69679ff74e2f8d + Author: Jon Dowland <jmtd@debian.org> + Date: Tue May 3 17:40:16 2011 +0100 + + new plugin: alias.pm - pagespec aliases + + diff --git a/IkiWiki/Plugin/alias.pm b/IkiWiki/Plugin/alias.pm + new file mode 100644 + index 0000000..b8d4574 + --- /dev/null + +++ b/IkiWiki/Plugin/alias.pm + @@ -0,0 +1,47 @@ + +package IkiWiki::Plugin::alias; + + + +use warnings; + +use strict; + +use IkiWiki '3.00'; + + + +sub import { + + hook(type => "getsetup", id=> "alias", call => \&getsetup); + + hook(type => "checkconfig", id=> "alias", call => \&checkconfig); + +} + + + +sub getsetup () { + + return + + plugin => { + + description => "allows the definition of pagespec aliases", + + safe => 1, + + rebuild => 1, + + section => "misc", + + }, + + pagespec_aliases => { + + type => "string", + + example => {"image" => "*jpg or *jpeg or *png or *gif or *ico" }, + + description => "a set of mappings from alias name to pagespec", + + safe => 1, + + rebuild => 0, + + }, + +} + + + +sub checkconfig () { + + no strict 'refs'; + + no warnings 'redefine'; + + + + if ($config{pagespec_aliases}) { + + foreach my $key (keys %{$config{pagespec_aliases}}) { + + my $value = ${$config{pagespec_aliases}}{$key}; + + # XXX: validate key? + + my $subname = "IkiWiki::PageSpec::match_$key"; + + *{ $subname } = sub { + + my $path = shift; + + return IkiWiki::pagespec_match($path, $value); + + } + + } + + } + +} + + + +1; + +I need to reflect on this a bit more before I send a pull request. In +particular I imagine the strict/warnings stuff will make you puke. Also, I'm +not sure whether I should name-grab 'alias' since [[todo/alias_directive]] is +an existing wishlist item. + +Here's an example setup chunk: + + pagespec_aliases: + image: "*.png or *.jpg or *.jpeg or *.gif or *.ico" + helper: "*.css or *.js" + boring: "image() or helper()" + +The above demonstrates self-referential dynamic pagespec aliases. It doesn't work, +however, to add ' or internal()' to `boring`, for some reason. + +-- [[Jon]] + +> another useful pagespec alias for large maps: + + basewiki: "sandbox or templates or templates/* or ikiwiki or ikiwiki/* or shortcuts or recentchanges or wikiicons/*" + +> -- [[Jon]] diff --git a/doc/todo/pagespec_aliases/discussion.mdwn b/doc/todo/pagespec_aliases/discussion.mdwn new file mode 100644 index 000000000..abbe80e6a --- /dev/null +++ b/doc/todo/pagespec_aliases/discussion.mdwn @@ -0,0 +1,13 @@ +Something which is similar to aliases is the "trail" concept I use in the [[plugins/contrib/report]] plugin. (Also my "pmap" plugin, but that's only in my "experimental" branch on github). One can define a "trail" by making a report with the "doscan" option (I should probably change the name of that) and then that page has a "trail" which matches the pagespec in that report. +Then one can reference that page as a "trail" without having to reuse that pagespec. +(It's also very useful in speeding up the processing, because the matching pages have been remembered, and one doesn't have to search for them again). + +So, for example, one could make a page "all_images" and have a report (or pmap, which is simpler) like so: + + \[[!pmap pages="*.png or *.jpg or *.jpeg or *.gif or *.ico"]] + +And then later, somewhere else + + \[[!report template="images.tmpl" trail="all_images" pages="album/*"]] + +and that would show all the images under "album". 
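To make the alias mechanism above concrete: for the `image` alias in the example setup chunk, the `checkconfig` glue effectively installs a match function equivalent to this sketch (illustrative only), so a pagespec such as `* and !image()` or `* and !boring()` then works anywhere a pagespec is accepted:

    # Roughly what gets generated for the "image" alias from the
    # example setup chunk (illustrative only):
    sub IkiWiki::PageSpec::match_image {
        my $path = shift;
        return IkiWiki::pagespec_match($path,
            "*.png or *.jpg or *.jpeg or *.gif or *.ico");
    }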
diff --git a/doc/todo/passwordauth:_sendmail_interface.mdwn b/doc/todo/passwordauth:_sendmail_interface.mdwn index 29f28ca32..556240964 100644 --- a/doc/todo/passwordauth:_sendmail_interface.mdwn +++ b/doc/todo/passwordauth:_sendmail_interface.mdwn @@ -35,7 +35,7 @@ in the ikiwiki source code, where emailing is done. OK, so I'll have a look at replacing all email handling with *Email::Send*. [[!tag patch]] -*<http://www.thomas.schwinge.homeip.net/tmp/ikiwiki-sendmail.patch>* +*<http://schwinge.homeip.net/~thomas/tmp/ikiwiki-sendmail.patch>* Remaining TODOs: diff --git a/doc/todo/pingback_support.mdwn b/doc/todo/pingback_support.mdwn index b10366bda..7b3b158ee 100644 --- a/doc/todo/pingback_support.mdwn +++ b/doc/todo/pingback_support.mdwn @@ -37,3 +37,5 @@ case I will consider this done with an entry in [[tips]]; otherwise a > whenever a page is posted or edited, and gets the changed content, it can > simply scan it for urls (may have to htmlize first?), and send pings to > all urls found. --[[Joey]] + +>> Is there any update on this? This would be highly useful and is the main reason why I am not using my blog more regularly, yet. (And yes, now that git-annex is doing everything I need and more, I thought I should revisit this one, as well). -- RichiH diff --git a/doc/todo/po:_avoid_rebuilding_to_fix_meta_titles.mdwn b/doc/todo/po:_avoid_rebuilding_to_fix_meta_titles.mdwn new file mode 100644 index 000000000..9bb9c72c4 --- /dev/null +++ b/doc/todo/po:_avoid_rebuilding_to_fix_meta_titles.mdwn @@ -0,0 +1,60 @@ +Re the meta title escaping issue worked around by `change`. + +> I suppose this does not only affect meta, but other things +> at scan time too. Also, handling it only on rebuild feels +> suspicious -- a refresh could involve changes to multiple +> pages and trigger the same problem, I think. Also, exposing +> this rebuild to the user seems really ugly, not confidence inducing. +> +> So I wonder if there's a better way. Such as making po, at scan time, +> re-run the scan hooks, passing them modified content (either converted +> from po to mdwn or with the escaped stuff cheaply de-escaped). (Of +> course the scan hook would need to avoid calling itself!) +> +> (This doesn't need to block the merge, but I hope it can be addressed +> eventually..) +> +> --[[Joey]] +>> +>> I'll think about it soon. +>> +>> --[[intrigeri]] +>> +>>> Did you get a chance to? --[[Joey]] + +>>>> I eventually did, and got rid of the ugly double rebuild of pages +>>>> at build time. This involved adding a `rescan` hook. Rationale +>>>> and details are in my po branch commit messages. I believe this +>>>> new way of handling meta title escaping to be far more robust. +>>>> Moreover this new implementation is more generic, feels more +>>>> logical to me, and probably fixes other similar bugs outside the +>>>> meta plugin scope. Please have a look when you can. +>>>> --[[intrigeri]] + +>>>>> Glad you have tackled this. Looking at +>>>>> 25447bccae0439ea56da7a788482a4807c7c459d, +>>>>> I wonder how this rescan hook is different from a scan hook +>>>>> with `last => 1` ? Ah, it comes *after* the preprocess hook +>>>>> in scan mode. Hmm, I wonder if there's any reason to have +>>>>> the scan hook called before those as it does now. Reordering +>>>>> those 2 lines could avoid adding a new hook. --[[Joey]] + +>>>>>> Sure. I was fearing to break other plugins if I did so, so I +>>>>>> did not dare to. I'll try this. --[[intrigeri]] + +>>>>>>> Done in my po branch, please have a look. 
--[[intrigeri]] + +>>>>>>>> I've merged it. Didn't look at the po.pm changes closely; +>>>>>>>> assume they're ok. [[done]] --[[Joey]] +>>>>>>>> +>>>>>>>> My thinking about the reordering being safe is that +>>>>>>>> the relative ordering of scan and preprocess in scan mode hooks +>>>>>>>> has not been defined before, so it should be ok to define it. :) +>>>>>>>> +>>>>>>>> And as to possible breakage from things that assumed the old +>>>>>>>> ordering, such a thing would need to have a scan hook and a +>>>>>>>> preprocess in scan mode hook, and the two hooks would need to +>>>>>>>> populate the same data structure with conflicting information, +>>>>>>>> in order for there to be a problem. That seems highly unlikely +>>>>>>>> and would be pretty broken on its own. And no plugin in ikiwiki +>>>>>>>> itself has both types of hooks. --[[Joey]] diff --git a/doc/todo/po:_better_documentation.mdwn b/doc/todo/po:_better_documentation.mdwn new file mode 100644 index 000000000..6e9804df4 --- /dev/null +++ b/doc/todo/po:_better_documentation.mdwn @@ -0,0 +1,3 @@ +Maybe write separate documentation for the po plugin, depending on the +people it targets: translators, wiki administrators, hackers. This +plugin may be complex enough to deserve this. diff --git a/doc/todo/po:_better_links.mdwn b/doc/todo/po:_better_links.mdwn new file mode 100644 index 000000000..af879a56a --- /dev/null +++ b/doc/todo/po:_better_links.mdwn @@ -0,0 +1,12 @@ +Once the fix to +[[bugs/pagetitle_function_does_not_respect_meta_titles]] from +[[intrigeri]]'s `meta` branch is merged into ikiwiki upstream, the +generated links' text will be optionally based on the page titles set +with the [[meta|plugins/meta]] plugin, and will thus be translatable. +It will also allow displaying the translation status in links to slave +pages. Both were implemented, and reverted in commit +ea753782b222bf4ba2fb4683b6363afdd9055b64, which should be reverted +once [[intrigeri]]'s `meta` branch is merged. + +An integration branch, called `meta-po`, merges [[intrigeri]]'s `po` +and `meta` branches, and thus has these additional features. diff --git a/doc/todo/po:_better_translation_interface.mdwn b/doc/todo/po:_better_translation_interface.mdwn new file mode 100644 index 000000000..e66a77b85 --- /dev/null +++ b/doc/todo/po:_better_translation_interface.mdwn @@ -0,0 +1,2 @@ +Add a message-by-message translation interface to the PO plugin, +with automatic escaping of special chars. diff --git a/doc/todo/po:_remove_po_files_when_disabling_plugin.mdwn b/doc/todo/po:_remove_po_files_when_disabling_plugin.mdwn new file mode 100644 index 000000000..5d0318ae1 --- /dev/null +++ b/doc/todo/po:_remove_po_files_when_disabling_plugin.mdwn @@ -0,0 +1,13 @@ +ikiwiki now has a `disable` hook. Should the po plugin remove the po +files from the source repository when it has been disabled? + +> pot files, possibly, but the po files contain work, so no. --[[Joey]] + +>> I tried to implement this in my `po-disable` branch, but AFAIK, the +>> current rcs plugins interface provides no way to tell whether a +>> given file (e.g. a POT file in my case) is under version control; +>> in most cases, it is not, thanks to .gitignore or similar, but we +>> can't be sure. So I just can't decide whether it is needed to call +>> `rcs_remove` rather than a good old `unlink`. --[[intrigeri]] + +>>> I guess you could call `rcs_remove` followed by `unlink`.
--[[Joey]] diff --git a/doc/todo/po:_rethink_pagespecs.mdwn b/doc/todo/po:_rethink_pagespecs.mdwn new file mode 100644 index 000000000..98c7ff655 --- /dev/null +++ b/doc/todo/po:_rethink_pagespecs.mdwn @@ -0,0 +1,40 @@ +I was surprised that, when using the map directive, a pagespec of "*" +listed all the translated pages as well as regular pages. That can +make a big difference to an existing wiki when po is turned on, +and seems generally not wanted. +(OTOH, you do want to match translated pages by +default when locking pages.) --[[Joey]] + +> Seems hard to me to tell apart the pagespecs whose matching pages +> list must be restricted to pages in the master (or current?) +> language from the ones that should not be. The only solution I can see +> to this surprising behaviour is: documentation. --[[intrigeri]] + +>> Well, a sorting criterion might be that if a PageSpec is used +>> with a specified location, as happens whenever a PageSpec is +>> used on a page, then it should match only `currentlang()`. If it +>> is used without a location, as in the setup file, then no such limit. + +>>> Ok. --[[intrigeri]] + +>> Note that +>> `match_currentlang` currently dies if called w/o a location -- if +>> it instead was always true w/o a location, this would just mean that +>> all pagespecs should have `and currentlang()` added to them. How to +>> implement that? All I can think of doing is wrapping +>> `pagespec_translate`. + +>>> Seems doable. --[[intrigeri]] + +>> The only case I've found where it does make sense to match other +>> language pages is on `l10n.ikiwiki.info` when listing pages that +>> need translation. +>> +>> Otherwise, it can be documented, but that's not really enough; +>> a user who makes a site using auto-blog.setup and enables po will +>> get a really screwed up blog that lists translations as separate posts +>> and needs significant work to fix. I have thought about making +>> `match_currentlang` a stub in IkiWiki (done in my currentlang branch), +>> so I can use it in all the PageSpecs in the example blog etc, but I +>> can't say I love the idea. +>> --[[Joey]] diff --git a/doc/todo/po:_translation_of_directives.mdwn b/doc/todo/po:_translation_of_directives.mdwn new file mode 100644 index 000000000..89fc93620 --- /dev/null +++ b/doc/todo/po:_translation_of_directives.mdwn @@ -0,0 +1,8 @@ +If a translated page contains a directive, it may expand to some english +text, or text in whatever single language ikiwiki is configured to "speak". + +Maybe there could be a way to switch ikiwiki to speaking another language +when building a non-english page? Then the directives would get translated. + +(We also will need this in order to use translated templates, when they are +available.) diff --git a/doc/todo/po_needstranslation_pagespec.mdwn b/doc/todo/po_needstranslation_pagespec.mdwn new file mode 100644 index 000000000..45b7377ea --- /dev/null +++ b/doc/todo/po_needstranslation_pagespec.mdwn @@ -0,0 +1,12 @@ +Commit b225fdc44d4b3d in my po branch adds a `needstranslation()` +PageSpec. It makes it easy to list pages that need translation work. +Please review. --[[intrigeri]] + +> Looks good, cherry-picked. The only improvement I can +> think of is that `needstranslation(50)` could match +> only pages less than 50% translated. --[[Joey]] + +>> This improvement has been implemented as 98cc946 in my po branch.
+>> --[[intrigeri]] + +[[!tag patch done]] diff --git a/doc/todo/preview_changes_before_git_commit.mdwn b/doc/todo/preview_changes_before_git_commit.mdwn new file mode 100644 index 000000000..187497cf4 --- /dev/null +++ b/doc/todo/preview_changes_before_git_commit.mdwn @@ -0,0 +1,17 @@ +ikiwiki allows to commit changes to the doc wiki over the `git://...` protocol. +It would be nice if there'd be a uniform way to view these changes before `git +push`ing. For the GNU Hurd's web pages, we include a *render_locally* script, +<http://www.gnu.org/software/hurd/render_locally>, with instructions on +<http://www.gnu.org/software/hurd/contributing/web_pages.html>, section +*Preview Changes*. With ikiwiki, one can use `make docwiki`, but that excludes +a set of pages, as per `docwiki.setup`. --[[tschwinge]] + +> `ikiwiki -setup some.setup --render file.mdwn` will build the page and +> dump it to stdout. So, for example: + + ikiwiki -setup docwiki.setup --render doc/todo/preview_changes_before_git_commit.mdwn | w3m -T text/html + +> You have to have a setup file, though it suffices to make up your own +> if you don't have the real one. Using ikiwiki.info's real setup file +> won't actually work since it uses a search plugin that gets unhappy +> if this is not in `/srv/web/ikiwiki.info`. --[[Joey]] diff --git a/doc/todo/replace_HTML::Template_with_Template_Toolkit.mdwn b/doc/todo/replace_HTML::Template_with_Template_Toolkit.mdwn index c4e78ca0b..d55fc0aa8 100644 --- a/doc/todo/replace_HTML::Template_with_Template_Toolkit.mdwn +++ b/doc/todo/replace_HTML::Template_with_Template_Toolkit.mdwn @@ -5,9 +5,14 @@ features and thus makes it rather hard to give an ikiwiki site a consistent look. If you browse the templates provided in the tarball, you'll notice that more than one of them contain the `<html>` tag, which is unnecessary. +> Note that is no longer true, and I didn't have to do such an intrusive +> change to fix it either. --[[Joey]] + Maybe it's just me, I also find HTML::Template cumbersome to use, due in part to its use of capital letters. +> Its entirely optional use of capital letters? --[[Joey]] + Finally, the software seems unmaintained: the mailing list and searchable archives linked from <http://html-template.sourceforge.net/html_template.html#frequently%20asked%20questions> @@ -58,3 +63,25 @@ Yes, Template::Toolkit is very powerful. But I think it's somehow overkill for a I'd have to agree that Template::Toolkit is overkill and personally I'm not a fan, but it is very popular (there is even a book) and the new version (3) is alleged to be much more nimble than current version. --[[ajt]] HTML::Template's HTML-like markup prevents me from editing templates in KompoZer or other WYSIWYG HTML editors. The editor tries to render the template markup rather than display it verbatim, and large parts of the template become invisible. A markup syntax that doesn't confuse editors (such as Template::Toolkit's "[% FOO %]") may promote template customization. The ability to replace the template engine would be within the spirit of ikiwiki's extensibility. --Rocco + + +I agree that being able to replace the template toolkit would be a great piece of modularity, and one I would use. If I could use the slot-based filling and the conditional logic from Template::Toolkit, we could build much more flexible inline and archivepage templates that would look different depending on where in the wiki we use them. 
Some of this can currently be accomplished with separate templates for each use case and a manual call to the right template in the !inline directive, but this is limited, cumbersome, and makes it difficult to reuse bits of formatting by trapping all of that information in multiple template files. -Ian + +> I don't wish HTML::Template to be *replaced* by Template::Toolkit - as +> others have said above, it's overkill for my needs. However, I also +> agree that HTML::Template has its own problems too. The idea of making +> the template system modular, with a choice of which backend to use - I +> really like that idea. It would enable me to use some other template +> system I like better, such as Text::Template or Text::NeatTemplate. But I +> think it would be a lot of work to implement, though perhaps no more work +> than making the revision-control backend modular, I guess. One would +> need to write an IkiWiki template interface that didn't care what the +> backend was, and yet is somehow still flexible enough to take advantage +> of special features of different backends. There are an *awful lot* of +> things that use templates - not just the `pagetemplate` and `template` +> plugins, but a number of others which have specialized templates of their +> own. -- [[KathrynAndersen]]a + +>> A modular template system in ikiwiki is unlikely, as template objects +>> are part of the API, notably the `pagetemplate` hook. Unless the other +>> system has a compatible template object. --[[Joey]] diff --git a/doc/todo/rewrite_ikiwiki_in_haskell.mdwn b/doc/todo/rewrite_ikiwiki_in_haskell.mdwn index 204c48cd7..48ed744b1 100644 --- a/doc/todo/rewrite_ikiwiki_in_haskell.mdwn +++ b/doc/todo/rewrite_ikiwiki_in_haskell.mdwn @@ -29,6 +29,7 @@ It's appealing for a lot of reasons, including: edit in html editors currently. - This would be a chance to make WikiLinks with link texts read "the right way round" (ie, vaguely wiki creole compatably). + *[See also [[todo/link_plugin_perhaps_too_general?]] --[[smcv]]]* - The data structures would probably be quite different. - I might want to drop a lot of the command-line flags, either requiring a setup file be used for those things, or leaving the diff --git a/doc/todo/salmon_protocol_for_comment_sharing.mdwn b/doc/todo/salmon_protocol_for_comment_sharing.mdwn new file mode 100644 index 000000000..1e56b0a8b --- /dev/null +++ b/doc/todo/salmon_protocol_for_comment_sharing.mdwn @@ -0,0 +1,21 @@ +The <a href="http://www.salmon-protocol.org/home">Salmon protocol</a> +provides for aggregating comments across sites. If a site that syndicates +a feed receives a comment on an item in that feed, it can re-post the +comment to the original source. + +> Ikiwiki does not allow comments to be posted on items it aggregates. +> So salmon protocol support would only need to handle the comment +> receiving side of the protocol. +> +> The current draft protocol document confuses me when it starts talking +> about using OAuth in the abuse prevention section, since their example +> does not show use of OAuth, and it's not at all clear to me where the +> OAuth relationship between aggregator and original source is supposed +> to come from. +> +> Their security model, which goes on to include Webfinger, +> thirdparty validation services, XRD, and Magic Signatures, looks sorta +> like they kept throwing technology, at it, hoping something will stick. 
:-P +> --[[Joey]] + +[[!tag wishlist]] diff --git a/doc/todo/selective_more_directive.mdwn b/doc/todo/selective_more_directive.mdwn new file mode 100644 index 000000000..2a9998205 --- /dev/null +++ b/doc/todo/selective_more_directive.mdwn @@ -0,0 +1,28 @@ +I'm setting up a blog for NaNoWriMo and other story-writing, which means long posts every day. I want to have excerpts on the front page, which link to the full length story posts. I also want a dedicated page for each story which inlines the story in full and in chronological order. I can use the "more" directive to achieve this effect on the front page but then it spoils the story page. My solution was to add a pages= parameter to the more directive to make it more selective. + + --- /usr/share/perl5/IkiWiki/Plugin/more.pm 2010-10-09 00:09:24.000000000 +0000 + +++ .ikiwiki/IkiWiki/Plugin/more.pm 2010-11-01 20:24:59.000000000 +0000 + @@ -26,7 +26,10 @@ + + $params{linktext} = $linktext unless defined $params{linktext}; + + - if ($params{page} ne $params{destpage}) { + + if ($params{page} ne $params{destpage} && + + (! exists $params{pages} || + + pagespec_match($params{destpage}, $params{pages}, + + location => $params{page}))) { + return "\n". + htmllink($params{page}, $params{destpage}, $params{page}, + linktext => $params{linktext}, + +I can now call it as + + \[[!more pages="index" linktext="Chapter 1" text=""" + etc + """]] + +I'm not entirely happy with the design, since I would rather put this information in the inline directive instead of in every story post. Unfortunately I found no way to pass parameters from the inline directive to the inlined page. + +-- [[dark]] + +> Me neither, but nor do I see a better way, so [[applied|done]]. --[[Joey]] diff --git a/doc/todo/smarter_sorting.mdwn b/doc/todo/smarter_sorting.mdwn new file mode 100644 index 000000000..901e143a7 --- /dev/null +++ b/doc/todo/smarter_sorting.mdwn @@ -0,0 +1,141 @@ +I benchmarked a build of a large wiki (my home wiki), and it was spending +quite a lot of time sorting; `CORE::sort` was called only 1138 times, but +still flagged as the #1 time sink. (I'm not sure I trust NYTProf fully +about that FWIW, since it also said 27238263 calls to `cmp_age` were +the #3 timesink, and I suspect it may not entirely accurately measure +the overhead of so many short function calls.) + +`pagespec_match_list` currently always sorts *all* pages first, and then +finds the top M that match the pagespec. That's innefficient when M is +small (as for example in a typical blog, where only 20 posts are shown, +out of maybe thousands). + +As [[smcv]] noted, It could be flipped, so the pagespec is applied first, +and then sort the smaller matching set. But, checking pagespecs is likely +more expensive than sorting. (Also, influence calculation complicates +doing that.) + +Another option, when there is a limit on M pages to return, might be to +cull the M top pages without sorting the rest. + +> The patch below implements this. +> +> But, I have not thought enough about influence calculation. +> I need to figure out which pagespec matches influences need to be +> accumulated for in order to determine all possible influences of a +> pagespec are known. +> +> The old code accumulates influences from matching all successful pages +> up to the num cutoff, as well as influences from an arbitrary (sometimes +> zero) number of failed matches. New code does not accumulate influences +> from all the top successful matches, only an arbitrary group of +> successes and some failures. 
+> +> Also, by the time I finished this, it was not measuarably faster than +> the old method. At least not with a few thousand pages; it +> might be worth revisiting this sometime for many more pages? [[done]] +> --[[Joey]] + +<pre> +diff --git a/IkiWiki.pm b/IkiWiki.pm +index 1730e47..bc8b23d 100644 +--- a/IkiWiki.pm ++++ b/IkiWiki.pm +@@ -2122,36 +2122,54 @@ sub pagespec_match_list ($$;@) { + my $num=$params{num}; + delete @params{qw{num deptype reverse sort filter list}}; + +- # when only the top matches will be returned, it's efficient to +- # sort before matching to pagespec, +- if (defined $num && defined $sort) { +- @candidates=IkiWiki::SortSpec::sort_pages( +- $sort, @candidates); +- } +- ++ # Find the first num matches (or all), before sorting. + my @matches; +- my $firstfail; + my $count=0; + my $accum=IkiWiki::SuccessReason->new(); +- foreach my $p (@candidates) { +- my $r=$sub->($p, %params, location => $page); ++ my $i; ++ for ($i=0; $i < @candidates; $i++) { ++ my $r=$sub->($candidates[$i], %params, location => $page); + error(sprintf(gettext("cannot match pages: %s"), $r)) + if $r->isa("IkiWiki::ErrorReason"); + $accum |= $r; + if ($r) { +- push @matches, $p; ++ push @matches, $candidates[$i]; + last if defined $num && ++$count == $num; + } + } + ++ # We have num natches, but they may not be the best. ++ # Efficiently find and add the rest, without sorting the full list of ++ # candidates. ++ if (defined $num && defined $sort) { ++ @matches=IkiWiki::SortSpec::sort_pages($sort, @matches); ++ ++ for ($i++; $i < @candidates; $i++) { ++ # Comparing candidate with lowest match is cheaper, ++ # so it's done before testing against pagespec. ++ if (IkiWiki::SortSpec::cmptwo($candidates[$i], $matches[-1], $sort) < 0 && ++ $sub->($candidates[$i], %params, location => $page) ++ ) { ++ # this could be done less expensively ++ # using a binary search ++ for (my $j=0; $j < @matches; $j++) { ++ if (IkiWiki::SortSpec::cmptwo($candidates[$i], $matches[$j], $sort) < 0) { ++ splice @matches, $j, $#matches-$j+1, $candidates[$i], ++ @matches[$j..$#matches-1]; ++ last; ++ } ++ } ++ } ++ } ++ } ++ + # Add simple dependencies for accumulated influences. +- my $i=$accum->influences; +- foreach my $k (keys %$i) { +- $depends_simple{$page}{lc $k} |= $i->{$k}; ++ my $inf=$accum->influences; ++ foreach my $k (keys %$inf) { ++ $depends_simple{$page}{lc $k} |= $inf->{$k}; + } + +- # when all matches will be returned, it's efficient to +- # sort after matching ++ # Sort if we didn't already. + if (! defined $num && defined $sort) { + return IkiWiki::SortSpec::sort_pages( + $sort, @matches); +@@ -2455,6 +2473,12 @@ sub sort_pages { + sort $f @_ + } + ++sub cmptwo { ++ $a=$_[0]; ++ $b=$_[1]; ++ $_[2]->(); ++} ++ + sub cmp_title { + IkiWiki::pagetitle(IkiWiki::basename($a)) + cmp +</pre> + +This would be bad when M is very large, and particularly, of course, when +there is no limit and all pages are being matched on. (For example, an +archive page shows all pages that match a pagespec specifying a creation +date range.) Well, in this case, it *does* make sense to flip it, limit by +pagespe first, and do a (quick)sort second. (No influence complications, +either.) + +> Flipping when there's no limit implemented, and it knocked 1/3 off +> the rebuild time of my blog's archive pages. --[[Joey]] + +Adding these special cases will be more complicated, but I think the best +of both worlds. 
--[[Joey]] diff --git a/doc/todo/structured_page_data.mdwn b/doc/todo/structured_page_data.mdwn index 72bfd8dea..9f21fab7f 100644 --- a/doc/todo/structured_page_data.mdwn +++ b/doc/todo/structured_page_data.mdwn @@ -1,5 +1,7 @@ This is an idea from [[JoshTriplett]]. --[[Joey]] +* See further discussion at [[forum/an_alternative_approach_to_structured_data]]. + Some uses of ikiwiki, such as for a bug-tracking system (BTS), move a bit away from the wiki end of the spectrum, and toward storing structured data about a page or instead of a page. @@ -251,6 +253,9 @@ in a large number of other cases. > dependencies between bugs from arbitrary links. >> This issue (the need for distinguished kinds of links) has also been brought up in other discussions: [[tracking_bugs_with_dependencies#another_kind_of_links]] (deps vs. links) and [[tag_pagespec_function]] (tags vs. links). --Ivan Z. +>>> And multiple link types are now supported; plugins can set the link +>>> type when registering a link, and pagespec functions can match on them. --[[Joey]] + ---- #!/usr/bin/perl diff --git a/doc/todo/support_includes_in_setup_files.mdwn b/doc/todo/support_includes_in_setup_files.mdwn new file mode 100644 index 000000000..50afb2b6b --- /dev/null +++ b/doc/todo/support_includes_in_setup_files.mdwn @@ -0,0 +1,10 @@ +I have a client server setup so I can I edit/preview on my laptop/desktop and push to a server. I therefore have two almost identical setup files that reasonably often I let get out of sync. I'd like to be able into include the common parts into the two setup files. Currently the following works, but it relies on knowing the implementation of IkiWiki::Setup::Standard + +use IkiWiki::Setup::Standard { specific stuff }; +require "/path/to/common_setup"; + +where common_setup contains a call to IkiWiki::Setup::merge + +To see that this is fragile, note that the require must come second, or ikiwiki will try to load a module called IkiWiki::Setup::merge + +DavidBremner diff --git a/doc/todo/support_link__40__.__41___in_pagespec.mdwn b/doc/todo/support_link__40__.__41___in_pagespec.mdwn new file mode 100644 index 000000000..653db1ff2 --- /dev/null +++ b/doc/todo/support_link__40__.__41___in_pagespec.mdwn @@ -0,0 +1,21 @@ +[[!tag wishlist]] + +It would be nice to have pagespecs support "link(.)" as syntax. +This would match pages that link to the page that invokes the pagespec. +The use case is a blog with tags, and having a page for each tag +which uses !inline to list all posts with the tag. + +Joey said on IRC that "probably changing the derel() function in +IkiWiki.pm is the best way to do it". + +> I implemented this suggestion in the simplest possible way, [[!taglink patch]] available [[here|http://git.oblomov.eu/ikiwiki/patch/f4a52de556436fdee00fd92ca9a3b46e876450fa]]. +> An alternative approach, very similar, would be to make the empty page parameter mean current page (e.g. `link()` would mean pages linking here). The patch would be very similar. +> -- GB + +>> Thanks for this, and also for your recent spam-fighting. +>> Huh, I was right about changing derel, didn't realize it would be +>> so obvious a change. :) Oh well, I managed to complicate it +>> some in optimisation pass.. ;) +>> +>> Note that your git-daemon on git.oblomov.eu seems down. +>> I pulled the patch from gitweb, [[done]] --[[Joey]] diff --git a/doc/todo/svg.mdwn b/doc/todo/svg.mdwn index 0a15af4cd..274ebf3e3 100644 --- a/doc/todo/svg.mdwn +++ b/doc/todo/svg.mdwn @@ -3,6 +3,7 @@ We should support SVG. 
In particular: * We could support rendering SVGs to PNGs when compiling the wiki. Not all browsers support SVG yet. * We could support editing SVGs via the web interface. SVG can contain unsafe content such as scripting, so we would need to whitelist safe markup. + * I am interested in seeing [svg-edit](http://code.google.com/p/svg-edit/) integrated -- [[EricDrechsel]] --[[JoshTriplett]] @@ -56,3 +57,21 @@ in the trunk if other people think it's useful. [htmlscrubber.pm]:http://xbeta.org/gitweb/?p=xbeta/ikiwiki.git;a=blob;f=IkiWiki/Plugin/htmlscrubber.pm;h=3c0ddc8f25bd8cb863634a9d54b40e299e60f7df;hb=fe333c8e5b4a5f374a059596ee698dacd755182d [diff]: http://xbeta.org/gitweb/?p=xbeta/ikiwiki.git;a=blobdiff;f=IkiWiki/Plugin/htmlscrubber.pm;h=3c0ddc8f25bd8cb863634a9d54b40e299e60f7df;hp=3bdaccea119ec0e1b289a0da2f6d90e2219b8d66;hb=fe333c8e5b4a5f374a059596ee698dacd755182d;hpb=be0b4f603f918444b906e42825908ddac78b7073 + +> Unfortuantly these links are broken. --[[Joey]] + +* * * + +Actually, there's a way to embed SVG into MarkDown sources using the [data: URI scheme][rfc2397], [like this](data:image/svg+xml;base64,PD94bWwgdmVyc2lvbj0iMS4wIiBzdGFuZGFsb25lPSJubyI/Pgo8c3ZnIHdpZHRoPSIxOTIiIGhlaWdodD0iMTkyIiB4bWxuczp4bGluaz0iaHR0cDovL3d3dy53My5vcmcvMTk5OS94bGluayIgeG1sbnM9Imh0dHA6Ly93d3cudzMub3JnLzIwMDAvc3ZnIj4KIDwhLS0gQ3JlYXRlZCB3aXRoIFNWRy1lZGl0IC0gaHR0cDovL3N2Zy1lZGl0Lmdvb2dsZWNvZGUuY29tLyAtLT4KIDx0aXRsZT5IZWxsbywgd29ybGQhPC90aXRsZT4KIDxnPgogIDx0aXRsZT5MYXllciAxPC90aXRsZT4KICA8ZyB0cmFuc2Zvcm09InJvdGF0ZSgtNDUsIDk3LjY3MTksIDk3LjY2OCkiIGlkPSJzdmdfNyI+CiAgIDxyZWN0IHN0cm9rZS13aWR0aD0iNSIgc3Ryb2tlPSIjMDAwMDAwIiBmaWxsPSIjRkYwMDAwIiBpZD0ic3ZnXzUiIGhlaWdodD0iNTYuMDAwMDAzIiB3aWR0aD0iMTc1IiB5PSI2OS42Njc5NjkiIHg9IjEwLjE3MTg3NSIvPgogICA8dGV4dCB4bWw6c3BhY2U9InByZXNlcnZlIiB0ZXh0LWFuY2hvcj0ibWlkZGxlIiBmb250LWZhbWlseT0ic2VyaWYiIGZvbnQtc2l6ZT0iMjQiIHN0cm9rZS13aWR0aD0iMCIgc3Ryb2tlPSIjMDAwMDAwIiBmaWxsPSIjZmZmZjAwIiBpZD0ic3ZnXzYiIHk9IjEwNS42NjgiIHg9Ijk5LjY3MTkiPkhlbGxvLCB3b3JsZCE8L3RleHQ+CiAgPC9nPgogPC9nPgo8L3N2Zz4=). +Of course, this way to display an image one needs to click a link, but it may be considered a feature. +— [[Ivan_Shmakov]], 2010-03-12Z. + +[rfc2397]: http://tools.ietf.org/html/rfc2397 + +> You can do the same with img src actually. +> +> If svg markup allows unsafe elements (ie, javascript), +> which it appears to, +> then this is a security hole, and the htmlscrubber +> needs to lock it down more. Darn, now I have to spend my afternoon making +> security releases! --[[Joey]] diff --git a/doc/todo/tagging_with_a_publication_date.mdwn b/doc/todo/tagging_with_a_publication_date.mdwn index 80240ec5a..39fc4e220 100644 --- a/doc/todo/tagging_with_a_publication_date.mdwn +++ b/doc/todo/tagging_with_a_publication_date.mdwn @@ -38,3 +38,34 @@ on vacation". > > > > I no longer have the original wiki for which I wanted this feature, but I can > > see using it on future ones. -- [[DonMarti]] + +>>> FWIW, for the case where one wants to update a site offline, +>>> using an ikiwiki instance on a laptop, and include some deffered +>>> posts in the push, the ad-hoc cron job type approach will be annoying. +>>> +>>> In modern ikiwiki, I guess the way to accomplish this would be to +>>> add a pagespec that matches only pages posted in the present or past. +>>> Then a page can have its post date set to the future, using meta date, +>>> and only show up when its post date rolls around. 
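As an illustration of the pagespec idea above, here is a minimal sketch of such a match function, assuming the usual `match_*` plugin convention and ikiwiki's `%pagectime` hash; the name `posted()` is invented here, and a real implementation would also need to deal with the rebuild problem discussed just below:

    package IkiWiki::PageSpec;

    # posted() matches pages whose (possibly meta-set) creation time is
    # not in the future. Sketch only; assumes IkiWiki is already loaded.
    sub match_posted ($$;@) {
        my $page=shift;
        shift; # the glob parameter is unused here
        if (($IkiWiki::pagectime{$page} || 0) <= time) {
            return IkiWiki::SuccessReason->new("$page is already posted");
        }
        else {
            return IkiWiki::FailReason->new("$page is post-dated");
        }
    }

A page whose `meta date` is in the future would then simply fail to match `posted()` until the clock catches up, at which point the rebuild machinery discussed next has to notice.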
+>>> +>>> Ikiwiki will need to somehow notice that a pagespec began matching +>>> a page it did not match previously, despite said page not actually +>>> changing. I'm not sure what the best way is. +>>> +>>> * One way could be to +>>> use a needsbuild hook and some stored data about which pagespecs +>>> exclude pages in the future. (But I'm not sure how evaluating the +>>> pagespec could lead to that metadata and hook being set up.) +>>> * Another way would be to use an explicit directive to delay a +>>> page being posted. Then the directive stores the metadata and +>>> sets up the needsbuild hook. +>>> * Another way would be for ikiwiki to remember the last +>>> time it ran. It could then easily find pages that have a post +>>> date after that time, and treat them the same as it treats actually +>>> modified files. Or a plugin could do this via a needsbuild hook, +>>> probably. (Only downside to this is it would probably need to do +>>> a O(n) walk of the list of pages -- but only running an integer +>>> compare per page.) +>>> +>>> You'd still need a cron job to run ikiwiki -refresh every hour, or +>>> whatever, so it can update. --[[Joey]] diff --git a/doc/todo/toc_plugin:_set_a_header_ceiling___40__opposite_of_levels__61____41__.mdwn b/doc/todo/toc_plugin:_set_a_header_ceiling___40__opposite_of_levels__61____41__.mdwn index 547c7a80a..07d2d383c 100644 --- a/doc/todo/toc_plugin:_set_a_header_ceiling___40__opposite_of_levels__61____41__.mdwn +++ b/doc/todo/toc_plugin:_set_a_header_ceiling___40__opposite_of_levels__61____41__.mdwn @@ -1,3 +1,48 @@ It would be nice if the [[plugins/toc]] plugin let you specify a header level "ceiling" above which (or above and including which) the headers would not be incorporated into the toc. Currently, the levels=X parameter lets you tweak how deep it will go for small headers, but I'd like to chop off the h1's (as I use them for my page title) -- [[Jon]] + +> This change to toc.pm should do it. --[[KathrynAndersen]] + +> > The patch looks vaguely OK to me but it's hard to tell without +> > context. It'd be much easier to review if you used unified diff +> > (`diff -u`), which is what `git diff` defaults to - almost all +> > projects prefer to receive changes as unified diffs (or as +> > branches in their chosen VCS, which is [[git]] here). --[[smcv]] + +> > > Done. -- [[KathrynAndersen]] + +> > > > Looks like Joey has now [[merged|done]] this. Thanks! --[[smcv]] + + --- /files/git/other/ikiwiki/IkiWiki/Plugin/toc.pm 2009-11-16 12:44:00.352050178 +1100 + +++ toc.pm 2009-12-26 06:36:06.686512552 +1100 + @@ -53,8 +53,8 @@ + my $page=""; + my $index=""; + my %anchors; + - my $curlevel; + - my $startlevel=0; + + my $startlevel=($params{startlevel} ? $params{startlevel} : 0); + + my $curlevel=$startlevel-1; + my $liststarted=0; + my $indent=sub { "\t" x $curlevel }; + $p->handler(start => sub { + @@ -67,10 +67,16 @@ + + # Take the first header level seen as the topmost level, + # even if there are higher levels seen later on. + + # unless we're given startlevel as a parameter + if (! 
$startlevel) { + $startlevel=$level; + $curlevel=$startlevel-1; + } + + elsif (defined $params{startlevel} + + and $level < $params{startlevel}) + + { + + return; + + } + elsif ($level < $startlevel) { + $level=$startlevel; + } + +[[!tag patch]] diff --git a/doc/todo/toplevel_index.mdwn b/doc/todo/toplevel_index.mdwn index 77e315811..92cef99ac 100644 --- a/doc/todo/toplevel_index.mdwn +++ b/doc/todo/toplevel_index.mdwn @@ -1,7 +1,7 @@ Some inconsistences around the toplevel [[index]] page: * [[ikiwiki]] is a separate page; links to [[ikiwiki]] should better go to - the [[index]] though. + the index though. > At least for this wiki, I turned out to have a use for [[ikiwiki]] > pointing to a different page, though the general point might still diff --git a/doc/todo/tracking_bugs_with_dependencies.mdwn b/doc/todo/tracking_bugs_with_dependencies.mdwn index 5f3ece290..456dadad0 100644 --- a/doc/todo/tracking_bugs_with_dependencies.mdwn +++ b/doc/todo/tracking_bugs_with_dependencies.mdwn @@ -81,6 +81,9 @@ I like the idea of [[tips/integrated_issue_tracking_with_ikiwiki]], and I do so >> I saw that this issue is targeted at by the work on [[structured page data#another_kind_of_links]]. --Ivan Z. +>>> It's fixed now; links can have a type, such as "tag", or "dependency", +>>> and pagespecs can match links of a given typo. --[[Joey]] + Okie - I've had a quick attempt at this. Initial patch attached. This one doesn't quite work. And there is still a lot of debugging stuff in there. diff --git a/doc/todo/transient_pages.mdwn b/doc/todo/transient_pages.mdwn new file mode 100644 index 000000000..fe2259b40 --- /dev/null +++ b/doc/todo/transient_pages.mdwn @@ -0,0 +1,318 @@ +On [[todo/auto-create_tag_pages_according_to_a_template]], [[chrysn]] +suggests: + +> Instead of creating a file that gets checked in into the RCS, the +> source files could be left out and the output files be written as +> long as there is no physical source file (think of a virtual underlay). +> Something similar would be required to implement alias directive, +> which couldn't be easily done by writing to the RCS as the page's +> contents can change depending on which other pages claim it as an alias. + +`add_autofile` could be adapted to do this, or a similar API could be +added. + +This would also be useful for autoindex, as suggested on +[[plugins/autoindex/discussion]] and [[!debbug 544322]]. I'd also like +to use it for [[plugins/contrib/album]]. + +It could also be used for an [[todo/alias_directive]]. + +--[[smcv]] + +> All [[merged|done]] --[[Joey]] + +-------------------------- + +[[!template id=gitbranch branch=smcv/ready/transient author="[[smcv]]"]] +[[!tag patch]] + +Related branches: + +* `ready/tag-test`: an extra regression test for tags + > merged --[[Joey]] +* either `transient-relative` or `transient-relative-api`: avoid using `Cwd` + on initialization + > merged the latter --[[Joey]] +* `ready/transient-aggregate`: use for aggregate + > merged --[[Joey]] +* `ready/transient-autoindex`: optionally use for autoindex, + which is [[!debbug 544322]] (includes autoindex-autofile from + [[todo/autoindex should use add__95__autofile]]) + > merged. I do note that this interacts badly with ikiwiki-hosting's + > backup/restore/branch handling, since that does not back up the + > transientdir by default, and so autoindex will not recreate the + > "deleted" pages. I'll probably have to make it back up the transientdir + > too. 
--[[Joey]] +* `ready/transient-recentchanges`: use for recentchanges + > merged --[[Joey]] +* `ready/transient-tag`: optionally use for tag (includes tag-test) + > merged --[[Joey]] + +I think this branch is now enough to be useful. It adds the following: + +If the `transient` plugin is loaded, `$srcdir/.ikiwiki/transient` is added +as an underlay. I'm not sure whether this should be a plugin or core, so +I erred on the side of more plugins; I think it's "on the edge of the core", +like goto. + +Pages in the transient underlay are automatically +deleted if a page of the same name is created in the srcdir (or an underlay +closer to the srcdir in stacking order). + +With the additional `ready/transient-tag` branch, +`tag` enables `transient`, and if `tag_autocreate_commit` is set to 0 +(default 1), autocreated tags are written to the transient underlay. +There is a regression test. + +With the additional `transient-autoindex` branch, +`autoindex` uses autofiles. It also enables `transient`, and if +`autoindex_commit` is set to 0 (default 1), autoindexes are written to +the transient underlay. There is a regression test. However, this branch +is blocked by working out what the desired behaviour is, on +[[todo/autoindex_should_use_add__95__autofile]]. + +> I wonder why this needs to be configurable? I suppose that gets back to +> whether it makes sense to check these files in or not. The benefits of +> checking them in: +> +> * You can edit them from the VCS, don't have to go into the web +> interface. Of course, files from the underlays have a similar issue, +> but does it make sense to make that wart larger? +> * You can know you can build the same site with nothing missing +> even if you don't there enable autoindex or whatever. (Edge case.) + +>> I'm not sure that that's a huge wart; you can always "edit by +>> overwriting". If you're running a local clone of the wiki on your laptop +>> or whatever, you have the underlays already, and can copy from there. +>> Tag and autoindex pages have rather simple source code anyway. --s + +> The benefit of using transient pages seems to just be avoiding commit +> clutter? For files that are never committed, transient pages are a clear +> win, but I wonder if adding configuration clutter just to avoid some +> commit clutter is really worth it. + +>> According to the last section of +>> [[todo/auto-create_tag_pages_according_to_a_template]], [[chrysn]] and +>> Eric both feel rather strongly that it should be possible to +>> not commit any tags; in [[plugins/autoindex/discussion]], +>> lollipopman and [[JoeRayhawk]] both requested the same for autoindex. +>> I made it configurable because, as you point out, +>> there are also reasons why it makes sense to check these +>> automatically-created files in. I'm neutral on this, personally. +>> +>> If this is a point of contention, would you accept a branch that +>> just adds `transient` and uses it for [[plugins/recentchanges]], +>> which aren't checked in and never have been? I've split the +>> branch up in the hope that *some* of it can get merged. +>> +>>> I will be happy to merge transient-recentchanges when it's ready. +>>> I see no obstacle to merging transient-tag either, and am not +>>> really against using it for autoindex or aggregate either +>>> once they get completed. +>>> I just wanted to think through why configurability is needed. 
+>>> --[[Joey]] +>> +>> One potentially relevant point is that configuration clutter only +>> affects the site admin whereas commit clutter is part of the whole +>> wiki's history. --[[smcv]] + +> Anyway, the configurability +> appears subtly broken; the default is only 1 if a new setup file is +> generated. (Correction: It was not even the default then --[[Joey]]) +> With an existing setup file, the 'default' values in +> `getsetup` don't take effect, so it will default to undef, which +> is treated the same as 0. --[[Joey]] + +>> Fixed in the branches, hopefully. (How disruptive would it be to have +>> defaults take effect whenever the setup file doesn't set a value, btw? +>> It seems pretty astonishing to have them work as they do at the moment.) --s + +>>> Well, note that default is not actually a documented field in +>>> getsetup hooks at all! (It is used in IkiWiki.pm's own `getsetup()`, and +>>> the concept may have leaked out into one or two plugins (comments, +>>> transient)). +>>> +>>> Running getsetup at plugin load time is something I have considered +>>> doing. It would simplify some checkconfig hooks that just set hardcoded +>>> defaults. Although since dying is part of the getsetup hook's API, it +>>> could be problimaric. +>>> --[[Joey]] + +autoindex ignores pages in the transient underlay when deciding whether +to generate an index. + +With the additional `ready/transient-recentchanges` branch, new recent +changes go in the transient underlay; I tested this manually. + +Not done yet (in that branch, at least): + +* `remove` can't remove transient pages: this turns out to be harder than + I'd hoped, because I don't want to introduce a vulnerability in the + non-regular-file detection, so I'd rather defer that. + + > Hmm, I'd at least want that to be dealt with before this was used + > by default for autoindex or tag. --[[Joey]] + + >> I'll try to work out which of the checks are required for security + >> and which are just nice-to-have, but I'd appreciate any pointers + >> you could give. Note that my branch wasn't meant to enable either + >> by default, and now hopefully doesn't. --[[smcv]] + + >>> Opened a new bug for this, [[bugs/removal_of_transient_pages]] + >>> --[[Joey]] + +* Transient tags that don't match any pages aren't deleted: I'm not sure + that that's a good idea anyway, though. Similarly, transient autoindexes + of directories that become empty aren't deleted. + + > Doesn't seem necessary, or really desirable to do that. --[[Joey]] + + >> Good, that was my inclination too. --s + +* In my `untested/transient` branch, new aggregated files go in the + transient underlay too (they'll naturally migrate over time). I haven't + tested this yet, it's just a proof-of-concept. + + > Now renamed to `ready/transient-aggregate`; it does seem to work fine. + > --s + +> I can confirm that the behavior of autoindex, at least, is excellent. +> Haven't tried tag. Joey, can you merge transient and autoindex? --JoeRayhawk + +>> Here are some other things I'd like to think about first: --[[Joey]] +>> +>> * There's a FIXME in autoindex. +>> +>> > Right, the extra logic for preventing autoindex pages from being +>> > re-created. This is taking a while, so I'm going to leave out the +>> > autoindex part for the moment. The FIXME is only relevant +>> > because I tried to solve +>> > [[todo/autoindex should use add__95__autofile]] first, but +>> > strictly speaking, that's an orthogonal change. 
--s + +>> * Suggest making recentchanges unlink the transient page +>> first, and only unlink from the old location if it wasn't +>> in the transient location. Ok, it only saves 1 syscall :) +>> +>> > Is an unlink() really that expensive? But, OK, fixed in the +>> > `ready/transient-recentchanges` branch. --s + +>> >> It's not, but it's easy. :) --[[Joey]] + +>> * Similarly it's a bit worrying for performance that it +>> needs to pull in and use `Cwd` on every ikiwiki startup now. +>> I really don't see the need; `wikistatedir` should +>> mostly be absolute, and ikiwiki should not chdir in ways +>> that break it anyway. +>> +>> > The reason to make it absolute is that relative underlays +>> > are interpreted as relative to the base underlay directory, +>> > not the cwd, by `add_underlay`. +>> > +>> > The updated `ready/transient-only` branch only loads `Cwd` if +>> > the path is relative; an extra commit on branch +>> > `smcv/transient-relative` goes behind `add_underlay`'s +>> > back to allow use of a cwd-relative underlay. Which direction +>> > would you prefer? +>> > +>> > I note in passing that [[plugins/autoindex]] and `IkiWiki::Render` +>> > both need to use `Cwd` and `File::Find` on every refresh, so +>> > there's only any point in avoiding `Cwd` for runs that don't +>> > actually refresh, like simple uses of the CGI. --s + +>> >> Oh, right, I'd forgotten about the horrificness of File::Find +>> >> that required a chdir for security. Ugh. Can we just avoid +>> >> it for those simple cases then? (demand-calculate wikistatedir) +>> >> --[[Joey]] + +>> >>> The reason that transientdir needs to be absolute is that it's +>> >>> added as an underlay. +>> >>> +>> >>> We could avoid using `Cwd` by taking the extra commit from either +>> >>> `smcv/transient-relative` or `smcv/transient-relative-api`; +>> >>> your choice. I'd personally go for the latter. +>> >>> +>> >>> According to git grep, [[plugins/po]] already wants to look at +>> >>> the underlaydirs in its checkconfig hook, so I don't think +>> >>> delaying calculation of the underlaydir is viable. (I also noticed +>> >>> a bug, +>> >>> [[bugs/po:_might_not_add_translated_versions_of_all_underlays]].) +>> >>> +>> >>> `underlaydirs` certainly needs to have been calculated by the +>> >>> time `refresh` hooks finish, so `find_src_files` can use it. --s + +>> * Unsure about the use of `default_pageext` in the `change` +>> hook. Is everything in the transientdir really going +>> to use that pageext? Would it be better to look up the +>> complete source filename? +>> +>> > I've updated `ready/transient` to do a more thorough GC by +>> > using File::Find on the transient directory. This does +>> > require `File::Find` and `Cwd`, but only when pages change, +>> > and `refresh` loads both of those in that situation anyway. +>> > +>> > At the moment everything in the transientdir will either +>> > have the `default_pageext` or be internal, although I +>> > did wonder whether to make [[plugins/contrib/album]] +>> > viewer pages optionally be `html`, for better performance +>> > when there's a very large number of photos. --s + +>> >> Oh, ugh, more File::Find... Couldn't it just assume that the +>> >> transient page has the same extension as its replacement? +>> >> --[[Joey]] + +>> >>> Good idea, that'll be true for web edits at least. +>> >>> Commit added. --s + +-------------------------- + +## An earlier version + +I had a look at implementing this. 
It turns out to be harder than I thought +to have purely in-memory pages (several plugins want to be able to access the +source file as a file), but I did get this proof-of-concept branch +to write tag and autoindex pages into an underlay. + +This loses the ability to delete the auto-created pages (although they don't +clutter up git this way, at least), and a lot of the code in autoindex is +probably now redundant, so this is probably not quite ready for merge, but +I'd welcome opinions. + +Usage: set `tag_underlay` and/or `autoindex_underlay` to an absolute path, +which you must create beforehand. I suggest *srcdir* + `/.ikiwiki/transient`. + +Refinements that could be made if this approach seems reasonable: + +* make these options boolean, and have the path always be `.ikiwiki/transient` +* improve the `remove` plugin so it also deletes from this special underlay + +>> Perhaps it should be something more generic, so that other plugins could use it (such as "album" mentioned above). +>> The `.ikiwiki/transient` would suit this, but instead of saying "tag_underlay" or "autoindex_underlay" have "use_transient_underlay" or something like that? +>> Or to make it more flexible, have just one option "transient_underlay" which is set to an absolute path, and if it is set, then one is using a transient-underlay. +>> --[[KathrynAndersen]] + +>>> What I had in mind was more like `tag_autocreate_transient => 1` or +>>> `autoindex_transient => 1`; you might conceivably want tags to be +>>> checked in but autoindices to be transient, and it's fine for each +>>> plugin to make its own decision. Going from that to one boolean +>>> (or just always-transient if people don't think that's too +>>> astonishing) would be trivial, though. +>>> +>>> I don't think relocating the transient underlay really makes sense, +>>> except for prototyping: you only want one, and `.ikiwiki` is as good +>>> a place as any (ikiwiki already needs to be able to write there). +>>> +>>> For [[plugins/contrib/album]] I think I'd just make the photo viewer +>>> pages always-transient - you can always make a transient page +>>> permanent by editing it, after all. +>>> +>>> Do you think this approach has enough potential that I should +>>> continue to hack on it? Any thoughts on the implementation? --[[smcv]] + +>>>> Ah, now I understand what you're getting at. Yes, it makes sense to put transient pages under `.ikiwiki`. +>>>> I haven't looked at the code, but I'd be interested in seeing whether it's generic enough to be used by other plugins (such as `album`) without too much fuss. +>>>> The idea of a transient underlay gives us a desirable feature for free: that if someone edits the transient page, it is made permanent and added to the repository. +>>>> +>>>> I think the tricky thing with removing these transient underlay pages is the question of how to prevent whatever auto-generated the pages in the first place from generating them again - or, conversely, how to force whatever auto-generated those pages to regenerate them if you've changed your mind. +>>>> I think you'd need something similar to `will_render` so that transient pages would be automatically removed if whatever auto-generated them is no longer around. 
+>>>> -- [[KathrynAndersen]] diff --git a/doc/todo/two-way_convert_of_wikis.mdwn b/doc/todo/two-way_convert_of_wikis.mdwn new file mode 100644 index 000000000..61f02a30b --- /dev/null +++ b/doc/todo/two-way_convert_of_wikis.mdwn @@ -0,0 +1,15 @@ + +[[!tag wishlist]] + +Ok, the vision is this: Some of you will know git-svn. I want something like +git-svn,, but for wikis. I want to be able to do the following: + +1. Convert a moinmoin (or whatever) wiki to a local ikiwiki on my laptop. +2. Edit my local copy (offline). +3. Preview the changes with my local ikiwki installation + browser. +4. Push the changes back to moinmoin (or whatever) wiki. + +I know, I know, ikiwiki wasn't designed for that, but it would be really cool, +and useful and people ask for that kind of thing too. + +--[[David_Riebenbauer]] diff --git a/doc/todo/untrusted_git_push_hooks.mdwn b/doc/todo/untrusted_git_push_hooks.mdwn new file mode 100644 index 000000000..313078ce5 --- /dev/null +++ b/doc/todo/untrusted_git_push_hooks.mdwn @@ -0,0 +1,12 @@ +Re the canrename, canremove, and canedit hooks: + +Of the three, only canremove is currently checked during an untrusted +git push (a normal git push is assumed to be from a trusted user and +bypasses all checks). + +It would probably make sense to add the canedit hook to the checks done +there. Calling the canrename hook is tricky, because after all, git does +not record explicit file moves. + +The checkcontent hook is another hook not currently called there, that +probably should be. diff --git a/doc/todo/use_secure_cookies_for_ssl_logins.mdwn b/doc/todo/use_secure_cookies_for_ssl_logins.mdwn new file mode 100644 index 000000000..194db2f36 --- /dev/null +++ b/doc/todo/use_secure_cookies_for_ssl_logins.mdwn @@ -0,0 +1,36 @@ +[[!template id=gitbranch branch=smcv/ready/sslcookie-auto author="[[smcv]]"]] +[[!tag patch]] + +At the moment `sslcookie => 0` never creates secure cookies, so if you log in +with SSL, your browser will send the session cookie even over plain HTTP. +Meanwhile `sslcookie => 1` always creates secure cookies, so you can't +usefully log in over plain http. + +This branch adds `sslcookie => 0, sslcookie_auto => 1` as an option; this +uses the `HTTPS` environment variable, so if you log in over SSL you'll +get a secure session cookie, but if you log in over HTTP, you won't. +(The syntax for the setup file is pretty rubbish - any other suggestions?) + +> Does this need to be a configurable option at all? The behavior could +> just be changed in the sslcookie = 0 case. It seems sorta reasonable +> that, once I've logged in via https, I need to re-login if I then +> switch to http. + +>> Even better. I've amended the branch to have this behaviour, which +>> turns it into a one-line patch. --[[smcv]] + +> And, if your change is made, the sslcookie option could probably itself +> be dropped too -- at least I don't see a real use case for it if ikiwiki +> is more paranoid about cookies by default. + +>> I haven't done that; it might make sense to do so, but I think it'd be +>> better to leave it in as a safety-catch (or in case someone's +>> using a webserver that doesn't put `$HTTPS` in the environment). --s + +> Might be best to fix +> [[todo/want_to_avoid_ikiwiki_using_http_or_https_in_urls_to_allow_serving_both]] +> first, so that dual https/http sites can better be set up. --[[Joey]] + +>> Thanks for merging that! 
:-) --s + +[[merged|done]] --[[Joey]] diff --git a/doc/todo/use_templates_for_the_img_plugin.mdwn b/doc/todo/use_templates_for_the_img_plugin.mdwn new file mode 100644 index 000000000..1cee1b535 --- /dev/null +++ b/doc/todo/use_templates_for_the_img_plugin.mdwn @@ -0,0 +1,29 @@ +[[!template id=gitbranch branch=jmtd/img_use_template author="[[Jon]]"]] + +Not finished! :-) + +The patches in <http://github.com/jmtd/ikiwiki/tree/img_use_template> convert the `img.pm` plugin to use a template (by default, `img.tmpl`, varied using a `template=` parameter) rather than hard-code the generated HTML. + +I originally thought of this to solve the problem outlined in [[bugs/can't mix template vars inside directives]], before I realised I could wrap the `img` call in my pages with a template to achieve the same thing. I therefore sat on it. + +However, I since thought of another use for this, and so started implementing it. (note to self: explain this other use) + +---- + +Ok, I have managed to achieve what I wanted with stock ikiwiki, this branch might not have any more life left in it (but it has proven an interesting experiment to see how much logic could be moved from `img.pm` into a template relatively easily. Although the template is not terribly legible.) + +My ikiwiki page has a picture on the front page. I've changed that picture just once, but I would like to change it again from time to time. I also want to keep a "gallery", or at least a list, of previous pictures, and perhaps include text alongside each picture, but not on the front page. + +I've achieved this as follows + + * each index picture gets a page under "indexpics". + * the "indexpics" page has a raw inline to include them all[1] + * the front page has more-or-less the same inline, with show=1 + * each index picture page has a [[plugins/conditional]]: + * if you are being included, show the resized picture only, and link the picture to the relevant indexpic page + * else, show the picture with the default link to a full-size image, and include explanatory text. + * most of the boilerplate is hidden inside a template + +It is not quite as I envisaged it: the explanatory text would probably make sense on the indexpics "gallery" page, but since that includes the page, the wrong trouser-leg of the conditional is used. But it works quite well. Introducing a new index picture involves creating an appropriate page under indexpics and the rest happens automatically. + +[1] lie #1: the pagespec is a lot more complex as it has to exclude raw image filetypes diff --git a/doc/todo/user-defined_templates_outside_the_wiki.mdwn b/doc/todo/user-defined_templates_outside_the_wiki.mdwn new file mode 100644 index 000000000..1d72aa6a7 --- /dev/null +++ b/doc/todo/user-defined_templates_outside_the_wiki.mdwn @@ -0,0 +1,10 @@ +[[!tag wishlist]] + +The [[plugins/contrib/ftemplate]] plugin looks for templates inside the wiki +source, but also looks in the system templates directory (the one with +`page.tmpl`). This means the wiki admin can provide templates that can be +invoked via `\[[!template]]`, but don't have to "work" as wiki pages in their +own right. I think the normal [[plugins/template]] plugin could benefit from +this functionality. 
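A rough sketch of the lookup order being asked for, assuming ikiwiki's global `%config` (with its usual `srcdir` and `templatedir` settings); the helper name and the exact file layout here are only illustrative, not the plugin's real code:

    # Prefer a template page shipped in the wiki source; otherwise fall
    # back to the admin's templatedir (the directory holding page.tmpl).
    sub find_template_file {
        my $id=shift;
        foreach my $file (
            "$config{srcdir}/templates/$id.mdwn",   # template kept as a wiki page
            "$config{templatedir}/$id.tmpl",        # admin-provided, outside the wiki
        ) {
            return $file if -e $file;
        }
        return undef; # no such template anywhere
    }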
+ +[[done]] --[[Joey]] diff --git a/doc/todo/want_to_avoid_ikiwiki_using_http_or_https_in_urls_to_allow_serving_both.mdwn b/doc/todo/want_to_avoid_ikiwiki_using_http_or_https_in_urls_to_allow_serving_both.mdwn index 65b7cd96a..6ede7f91e 100644 --- a/doc/todo/want_to_avoid_ikiwiki_using_http_or_https_in_urls_to_allow_serving_both.mdwn +++ b/doc/todo/want_to_avoid_ikiwiki_using_http_or_https_in_urls_to_allow_serving_both.mdwn @@ -1,3 +1,46 @@ +## current status + +[[done]] again! :) + +Actually, there are two places where the configured url is still hardcoded: + +1. When searching, all the links will use it. This is annoying to fix, + and we deem it not a problem. +2. When ikiwiki dies with an error, the links on the error page will + use it. Too bad :) + +------ + +## semi-old + + +* CGI pages, with the exception of edit pages, set `<base>` to + `$config{url}` + + I had to revert using `baseurl(undef)` for that, because it needs + to be a full url. + + Ideally, baseurl would return an absolute url derived from the url + being used to access the cgi, but that needs access to the CGI object, + which it does not currently have. Similarly, `misctemplate` + does not have access to the CGI object, so it cannot use it to + generate a better baseurl. Not sure yet what to do; may have to thread + a cgi parameter through all the calls to misctemplate. --[[Joey]] + + > Fixed, cgitemplate is used now. --[[Joey]] + +* Using `do=goto` to go to a comment or recentchanges item + will redirect to the `$config{url}`-based url, since the + permalinks are made to be absolute urls now. + + Fixing this would seem to involve making meta force permalinks + to absolute urls when fulling out templates, while allowing them + to be left as partial urls internally, for use by goto. --[[Joey]] + + > This reversion has now been fixed. --[[Joey]] + +## old attempt + It looks like all links in websites are absolute paths, this has some limitations: * If connecting to website via https://... all links will take you back to http:// @@ -12,18 +55,322 @@ It would be good if relative paths could be used instead, so the transport metho > "../../", and "../". The only absolute links are to CGIs and the w3c DTD. > --[[Joey]] ->> The problem is within the CGI script. The links within the HTML page are all absolute, including links to the css file. ->> Having a http links within a HTML page retrieved using https upset most browsers (I think). Also if I push cancel on the edit page in https, I end up at at http page. -- Brian May +>> The problem is within the CGI script. The links within the HTML page are all +>> absolute, including links to the css file. Having a http links within a HTML +>> page retrieved using https upset most browsers (I think). Also if I push cancel +>> on the edit page in https, I end up at at http page. -- Brian May >>> Ikiwiki does not hardcode http links anywhere. If you don't want >>> it to use such links, change your configuration to use https >>> consistently. --[[Joey]] -Errr... That is not a solution, that is a work around. ikiwiki does not hard code the absolute paths, but absolute paths are hard coded in the configuration file. If you want to serve your website so that the majority of users can see it as http, including in rss feeds (this allows proxy caches to cache the contents and has reduced load requirements), but editing is done via https for increased security, it is not possible. 
I have some ideas how this can be implemented (as ikiwiki has the absolute path to the CGI script and the absolute path to the destination, it should be possible to generate a relative path from one to the other), although some minor issues still need to be resolved. -- Brian May +Errr... That is not a solution, that is a work around. ikiwiki does not hard +code the absolute paths, but absolute paths are hard coded in the configuration +file. If you want to serve your website so that the majority of users can see +it as http, including in rss feeds (this allows proxy caches to cache the +contents and has reduced load requirements), but editing is done via https for +increased security, it is not possible. I have some ideas how this can be +implemented (as ikiwiki has the absolute path to the CGI script and the +absolute path to the destination, it should be possible to generate a relative +path from one to the other), although some minor issues still need to be +resolved. -- Brian May -I noticed the links to the images on <http://ikiwiki.info/recentchanges/> are also absolute, that is <http://ikiwiki.info/wikiicons/diff.png>; this seems surprising, as the change.tmpl file uses <TMPL_VAR BASEURL> -which seems to do the right thing in page.tmpl, but not for change.tmpl. Where is BASEURL set? -- Brian May +I noticed the links to the images on <http://ikiwiki.info/recentchanges/> are +also absolute, that is <http://ikiwiki.info/wikiicons/diff.png>; this seems +surprising, as the change.tmpl file uses <TMPL_VAR BASEURL> which seems +to do the right thing in page.tmpl, but not for change.tmpl. Where is BASEURL +set? -- Brian May > The use of an absolute baseurl in change.tmpl is a special case. --[[Joey]] +So I'm facing this same issue. I have a wiki which needs to be accessed on +three different URLs(!) and the hard coding of the URL from the setup file is +becoming a problem for me. Is there anything I can do here? --[[Perry]] + +> I remain puzzled by the problem that Brian is discussing. I don't see +> why you can't just set the cgiurl and url to a https url, and serve +> the site using both http and https. +> +> Just for example, <https://kitenet.net/> is an ikiwiki, and it is accessible +> via https or http, and if you use https, links will remain on https (except +> for links using the cgi, which I could fix by changing the cgiurl to https). +> +> I think it's possible ikiwiki used to have some +> absolute urls that have been fixed since Brian filed the bug. --[[Joey]] + [[wishlist]] + +---- + +[[!toggle id="smcv-https" text="Some discussion of a rejected implementation, smcv/https."]] +[[!toggleable id="smcv-https" text=""" + +[[!template id=gitbranch branch=smcv/https author="[[smcv]]"]] + +For a while I've been using a configuration where each wiki has a HTTP and +a HTTPS mirror, and updating one automatically updates the other, but +that seems unnecessarily complicated. My `https` branch adds `https_url` +and `https_cgiurl` config options which can be used to provide a HTTPS +variant of an existing site; the CGI script automatically detects whether +it was accessed over HTTPS and switches to the other one. 
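Since this branch was ultimately rejected, the following is only a sketch of the kind of switch being described: it assumes the `https_url`/`https_cgiurl` options named above plus ikiwiki's existing `url`/`cgiurl` settings, and relies on the standard `HTTPS` environment variable that most web servers set for CGI requests arriving over TLS:

    # Sketch: choose which configured urls to hand out for this request.
    sub select_urls {
        if (defined $ENV{HTTPS} && lc $ENV{HTTPS} ne 'off') {
            return ($config{https_url} || $config{url},
                    $config{https_cgiurl} || $config{cgiurl});
        }
        return ($config{url}, $config{cgiurl});
    }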
+ +This required some refactoring, which might be worth merging even if +you don't like my approach: + +* change `IkiWiki::cgiurl` to return the equivalent of `$config{cgiurl}` if + called with no parameters, and change all plugins to indirect through it + (then I only need to change that one function for the HTTPS hack) + +* `IkiWiki::baseurl` already has similar behaviour, so change nearly all + references to the `$config{url}` to call `baseurl` (a couple of references + specifically wanted the top-level public URL for Google or Blogspam rather + than a URL for the user's browser, so I left those alone) + +--[[smcv]] + +> The justification for your patch seems to be wanting to use a different +> domain, like secure.foo.com, for https? Can you really not just configure +> both url and cgiurl to use `https://secure.foo.com/...` and rely on +> relative links to keep users of `http://insecure.foo.com/` on http until +> they need to use the cgi? + +>> My problem with that is that uses of the CGI aren't all equal (and that +>> the CA model is broken). You could put CGI uses in two classes: +>> +>> - websetup and other "serious" things (for the sites I'm running, which +>> aren't very wiki-like, editing pages is also in this class). +>> I'd like to be able to let privileged users log in over +>> https with httpauth (or possibly even a client certificate), and I don't +>> mind teaching these few people how to do the necessary contortions to +>> enable something like CACert. +>> +>> - Random users making limited use of the CGI: do=goto, do=404, and +>> commenting with an OpenID. I don't think it's realistic to expect +>> users to jump through all the CA hoops to get CACert installed for that, +>> which leaves their browsers being actively obstructive, unless I either +>> pay the CA tax (per subdomain) to get "real" certificates, or use plain +>> http. +>> +>> On a more wiki-like wiki, the second group would include normal page edits. +>> +>>> I see your use case. It still seems to me that for the more common +>>> case where CA tax has been paid (getting a cert that is valid for +>>> multiple subdomains should be doable?), having anything going through the +>>> cgiurl upgrade to https would be ok. In that case, http is just an +>>> optimisation for low-value, high-aggregate-bandwidth type uses, so a +>>> little extra https on the side is not a big deal. --[[Joey]] +>> +>> Perhaps I'm doing this backwards, and instead of having the master +>> `url`/`cgiurl` be the HTTP version and providing tweakables to override +>> these with HTTPS, I should be overriding particular uses to plain HTTP... +>> +>> --[[smcv]] +>>> +>>> Maybe, or I wonder if you could just use RewriteEngine for such selective +>>> up/downgrading. Match on `do=(edit|create|prefs)`. --[[Joey]] + +> I'm unconvinced. +> +> `Ikiwiki::baseurl()."foo"` just seems to be asking for trouble, +> ie being accidentially written as `IkiWiki::baseurl("foo")`, +> which will fail when foo is not a page, but some file. + +>> That's a good point. --s + +> I see multiple places (inline.pm, meta.pm, poll.pm, recentchanges.pm) +> where it will now put the https url into a static page if the build +> happens to be done by the cgi accessed via https, but not otherwise. +> I would rather not have to audit for such problems going forward. + +>> Yes, that's a problem with this approach (either way round). Perhaps +>> making it easier to run two mostly-synched copies like I was previously +>> doing is the only solution... 
--s + +"""]] + +---- + +[[!template id=gitbranch branch=smcv/ready/localurl author="[[smcv]]"]] +[[!tag patch]] + +OK, here's an alternative approach, closer in spirit to what was initially +requested. I included a regression test for `urlto`, `baseurl` and `cgiurl`, +now that they have slightly more complex behaviour. + +The idea is that in the common case, the CGI and the pages will reside on the +same server, so they can use "semi-absolute" URLs (`/ikiwiki.cgi`, `/style.css`, +`/bugs/done`) to refer to each other. Most redirects, form actions, links etc. +can safely use this form rather than the fully-absolute URL. + +The initial version of the branch had config options `local_url` and +`local_cgiurl`, but they're now automatically computed by checking +whether `url` and `cgiurl` are on the same server with the the same URL +scheme. In theory you could use things like `//static.example.com/wiki/` +and `//dynamic.example.com/ikiwiki.cgi` to preserve choice of http/https +while switching server, but I don't know how consistently browsers +support that. + +"local" here is short for "locally valid", because these URLs are neither +fully relative nor fully absolute, and there doesn't seem to be a good name +for them... + +I've tested this on a demo website with the CGI enabled, and it seemed to +work nicely (there might be bugs in some plugins, I didn't try all of them). +The branch at [[todo/use secure cookies for SSL logins]] goes well with +this one. + +The `$config{url}` and `$config{cgiurl}` are both HTTP, but if I enable +`httpauth`, set `cgiauthurl` to a HTTPS version of the same site and log +in via that, links all end up in the HTTPS version. + +New API added by this branch: + +* `urlto(x, y, 'local')` uses `$local_url` instead of `$config{url}` + + > Yikes. I see why you wanted to keep it to 3 parameters (4 is too many, + > and po overrides it), but I dislike overloading the third parameter + > like that. + > + > There are fairly few calls to `urlto($foo, $bar)`, so why not + > make that always return the semi-local url form, and leave the third + > parameter for the cases that need a true fully-qualified url. + > The new form for local urls will typically be only a little bit longer, + > except in the unusual case where the cgiurl is elsewhere. --[[Joey]] + + >> So, have urlto(x, y) use `$local_url`? There are few calls, but IMO + >> they're for the most important things - wikilinks, img, map and + >> other ordinary hyperlinks. Using `$local_url` would be fine for + >> webserver-based use, but it does stop you browsing your wiki's + >> HTML over `file:///` (unless you set that as the base URL, but + >> then you can't move it around), and stops you moving simple + >> outputs (like the docwiki!) around. + >> + >> I personally think breaking the docwiki is enough to block that. + >> + >>> Well, the docwiki doesn't have an url configured at all, so I assumed + >>> it would need to fall back to current behavior in that case. I had + >>> not thought about browsing wiki's html files though, good point. + >> + >> How about this? 
+ >> + >> * `urlto($link, $page)` with `$page` defined: relative + >> * `urlto($link, undef)`: local, starts with `/` + >> * `urlto($link)`: also local, as a side-effect + >> * `urlto($link, $anything, 1)` (but idiomatically, `$anything` is + >> normally undef): absolute, starts with `http[s]://` + >> + >> --[[smcv]] + >> + >>> That makes a great deal of sense, bravo for actually removing + >>> parameters in the common case while maintaining backwards + >>> compatability! --[[Joey]] + >>> + >>>> Done in my `localurl` branch; not tested in a whole-wiki way + >>>> yet, but I did add a regression test. I've used + >>>> `urlto(x, undef)` rather than `urlto(x)` so far, but I could + >>>> go back through the codebase using the short form if you'd + >>>> prefer. --[[smcv]] + >>> + >>> It does highlight that it would be better to have a + >>> `absolute_urlto($link)` (or maybe `absolute(urlto($link))` ) + >>> rather than the 3 parameter form. --[[Joey]] + >>> + >>> Possibly. I haven't added this. + +* `IkiWiki::baseurl` has a new second argument which works like the + third argument of `urlto` + + > I assume you have no objection to this --[[smcv]] + + >> It's so little used that I don't really care if it's a bit ugly. + >> (But I assume changes to `urlto` will follow through here anyway.) + >> --[[Joey]] + + >>> I had to use it a bit more, as a replacement for `$config{url}` + >>> when doing things like referencing stylesheets or redirecting to + >>> the top of the wiki. + >>> + >>> I ended up redoing this without the extra parameter. Previously, + >>> `baseurl(undef)` was the absolute URL; now, `baseurl(undef)` is + >>> the local path. I know you objected to me using `baseurl()` in + >>> an earlier branch, because `baseurl().$x` looks confusingly + >>> similar to `baseurl($x)` but has totally different semantics; + >>> I've generally written it `baseurl(undef)` now, to be more + >>> explicit. --[[smcv]] + +* `IkiWiki::cgiurl` uses `$local_cgiurl` if passed `local_cgiurl => 1` + + > Now changed to always use the `$local_cgiurl`. --[[smcv]] + +* `IkiWiki::cgiurl` omits the trailing `?` if given no named parameters + except `cgiurl` and/or `local_cgiurl` + + > I assume you have no objection to this --[[smcv]] + > + >> Nod, although I don't know of a use case. --[[Joey]] + + >>> The use-case is that I can replace `$config{cgiurl}` with + >>> `IkiWiki::cgiurl()` for things like the action attribute of + >>> forms. --[[smcv]] + +Fixed bugs: + +* I don't think anything except `openid` calls `cgiurl` without also + passing in `local_cgiurl => 1`, so perhaps that should be the default; + `openid` uses the `cgiurl` named parameter anyway, so there doesn't even + necessarily need to be a way to force absolute URLs? Any other module + that really needs an absolute URL could use + `cgiurl(cgiurl => $config{cgiurl}, ...)`, + although that does look a bit strange + + > I agree that makes sense. --[[Joey]] + + >> I'm not completely sure whether you're agreeing with "perhaps do this" + >> or "that looks too strange", so please disambiguate: + >> would you accept a patch that makes `cgiurl` default to a local + >> (starts-with-`/`) result? If you would, that'd reduce the diff. --[[smcv]] + + >>> Yes, I absolutely think it should default to local. (Note that + >>> if `absolute()` were implemented as suggested above, it could also + >>> be used with cgiurl if necessary.) --[[Joey]] + + >>>> Done (minus `absolute()`). 
--[[smcv]] + +Potential future things: + +* It occurs to me that `IkiWiki::cgiurl` could probably benefit from being + exported? Perhaps also `IkiWiki::baseurl`? + + > Possibly, see [[firm_up_plugin_interface]]. --[[Joey]] + + >> Not really part of this branch, though, so wontfix (unless you ask me + >> to do so). --[[smcv]] + +* Or, to reduce use of the unexported `baseurl` function, it might make + sense to give `urlto` a special case that references the root of the wiki, + with a trailing slash ready to append stuff: perhaps `urlto('/')`, + with usage like this? + + do_something(baseurl => urlto('/', undef, local)`); + do_something_else(urlto('/').'style.css'); + IkiWiki::redirect(urlto('/', undef, 1)); + + > AFACIS, `baseurl` is only called in 3 places so I don't think that's + > needed. --[[Joey]] + + >> OK, wontfix. For what it's worth, my branch has 6 uses in IkiWiki + >> core code (IkiWiki, CGI, Render and the pseudo-core part of editpage) + >> and 5 in plugins, since I used it for things like redirection back + >> to the top of the wiki --[[smcv]] + +merged|done --[[Joey]] (But reopened, see above.) + +---- + +Update: I had to revert part of 296e5cb2fd3690e998b3824d54d317933c595873, +since it broke openid logins. The openid object requires a complete, +not a relative cgiurl. I'm not sure if my changing that back to using +`$config{cgiurl}` will force users back to eg, the non-https version of a +site when logging in via openid. + +> Ok, changed it to use `CGI->url` to get the current absolute cgi url. --[[Joey]] diff --git a/doc/todo/web_reversion.mdwn b/doc/todo/web_reversion.mdwn new file mode 100644 index 000000000..736d674fe --- /dev/null +++ b/doc/todo/web_reversion.mdwn @@ -0,0 +1,73 @@ +Goal: Web interface to allow reverting of changes. + +Interface: + +At least at first, it will be exposed via the recentchanges +page, with revert icons next to each change. We may want a dynamic +per-page interface that goes back more than 100 changes later. + +Limiting assumptions: + +* No support for resolving conflicts in reverts; such a revert would just + fail and not happen. +* No support for reset-to-this-point; initially the interface would only + revert a single commit, and if a bunch needed to go, the user would have + to drive that one at a time. + +Implementation plan: + +* `rcs_revert` hook that takes a revision to revert. +* CGI: `do=revert&rev=foo` +* recentchanges plugin adds above to recentchanges page +* prompt user to confirm (to avoid spiders doing reverts), + check that user is allowed to make the change, commit reversion, + and refresh site. + +Peter Gammie has done an initial implementation of the above. +[[!template id=gitbranch branch=peteg/revert author="[[peteg]]"]] + +>> It is on a separate branch now. --[[peteg]] + +> Review: --[[Joey]] +> +> The revert commit will not currently say what web user did the revert. +> This could be fixed by doing a --no-commit revert first and then using +> rcs_commit_staged. +>> Fixed, I think. --[[peteg]] +> +> So I see one thing I completly forgot about is `check_canedit`. Avoiding users +> using reverting to make changes they would normally not be allowed to do is +> tricky. I guess that a easy first pass would be to only let admins do it. +> That would be enough to get the feature out there.. +> +> I'm thinking about having a `rcs_preprevert`. It would take a rev and look +> at what changes reverting it would entail, and return the same data +> structure that `rcs_recieve` does. 
This could be done by using `git revert +> --no-commit`, and then examining the changes, and then `git reset` to drop +> them. +>> We can use the existing `git_commit_info` with the patch ID - no need to touch the working directory. -- [[peteg]] +> +> Then the code that is currently in IkiWiki/Receive.pm, that calls +> `check_canedit` and `check_canremove` to test the change, can be +> straightforwardly refactored out, and used for checking reverts too. +>> Wow, that was easy. :-) -- [[peteg]] +> +> (The data from `rcs_preprevert` could also be used for a confirmation +> prompt -- it doesn't currently include enough info for diffs, but at +> least could have a list of changed files.) +> +> Note that it's possible for a git repo to have commits that modify wiki +> files in a subdir, and code files elsewhere. `rcs_preprevert` should +> detect changes outside the wiki dir, and fail, like `rcs_receive` does. +>> Taken care of by refactoring `rcs_receive` in `git.pm` +>> I've tested it lightly in my single-user setup. It's a little nasty that the `attachment` plugin +>> gets used to check whether attachments are allowed -- there really should be a hook for that. +>>> I agree, but have not figured out a way to make a hook work yet. +>>> --[[Joey]] +>> +>> Please look it over and tell me what else needs fixing... -- [[peteg]] + +>>> I have made my own revert branch and put a few^Wseveral fixes in there. +>>> All merged to master now! --[[Joey]] + +[[done]] diff --git a/doc/todo/wrapperuser.mdwn b/doc/todo/wrapperuser.mdwn new file mode 100644 index 000000000..4c42b046f --- /dev/null +++ b/doc/todo/wrapperuser.mdwn @@ -0,0 +1,7 @@ +ikiwiki's .setup file can specify wrappergroup, and ikiwiki will set the group +of the wrappers accordingly. Having had people encounter difficulty before +when trying to do the same thing with users (for instance, making all wrappers +6755 ikiwiki:ikiwiki), I think it would help to have "wrapperuser". This could +only actually take effect if building the wrappers as root (not really the best +plan), but ikiwiki could at least warn if wrapperuser does not match the user +the wrapper will end up with. diff --git a/doc/usage.mdwn b/doc/usage.mdwn index 0c618de5c..b9516d740 100644 --- a/doc/usage.mdwn +++ b/doc/usage.mdwn @@ -32,14 +32,22 @@ These options control the mode that ikiwiki operates in. * --setup setupfile - In setup mode, ikiwiki reads the config file, which is really a perl - program that can call ikiwiki internal functions. - The default action when --setup is specified is to automatically generate wrappers for a wiki based on data in a setup file, and rebuild the wiki. If you only want to build any changed pages, you can use --refresh with --setup. +* --changesetup setupfile + + Reads the setup file, adds any configuration changes specified by other + options, and writes the new configuration back to the setup file. Also + updates any configured wrappers. In this mode, the wiki is not fully + rebuilt, unless you also add --rebuild. + + Example, to enable some plugins: + + ikiwiki --changesetup ~/ikiwiki.setup --plugin goodstuff --plugin calendar + * --dumpsetup setupfile Causes ikiwiki to write to the specified setup file, dumping out @@ -50,6 +58,14 @@ These options control the mode that ikiwiki operates in. If used with --setup --refresh, this makes it also update any configured wrappers. 
+* --clean + + This makes ikiwiki clean up by removing any files it generated in the + `destination` directory, as well as any configured wrappers, and the + `.ikiwiki` state directory. This is mostly useful if you're running + ikiwiki in a Makefile to build documentation and want a corresponding + `clean` target. + * --cgi Enable [[CGI]] mode. In cgi mode ikiwiki runs as a cgi script, and @@ -112,10 +128,11 @@ also be configured using a setup file. * --templatedir dir - Specify the directory that the page [[templates|wikitemplates]] are stored in. + Specify the directory that [[templates|templates]] are stored in. Default is `/usr/share/ikiwiki/templates`, or another location as configured at build time. If the templatedir is changed, missing templates will still - be searched for in the default location as a fallback. + be searched for in the default location as a fallback. Templates can also be + placed in the "templates/" subdirectory of the srcdir. Note that if you choose to copy and modify ikiwiki's templates, you will need to be careful to keep them up to date when upgrading to new versions of @@ -226,6 +243,12 @@ also be configured using a setup file. Specifies a rexexp of source files to exclude from processing. May be specified multiple times to add to exclude list. +* --include regexp + + Specifies a rexexp of source files, that would normally be excluded, + but that you wish to include in processing. + May be specified multiple times to add to include list. + * --adminuser name Specifies a username of a user (or, if openid is enabled, an openid) @@ -249,8 +272,8 @@ also be configured using a setup file. Makes ikiwiki look in the specified directory first, before the regular locations when loading library files and plugins. For example, if you set - libdir to "/home/you/.ikiwiki/", you can install a Foo.pm plugin as - "/home/you/.ikiwiki/IkiWiki/Plugin/Foo.pm". + libdir to "/home/you/.ikiwiki/", you can install a foo.pm plugin as + "/home/you/.ikiwiki/IkiWiki/Plugin/foo.pm". * --discussion, --no-discussion @@ -306,26 +329,28 @@ also be configured using a setup file. intercepted. If you enable this option then you must run at least the CGI portion of ikiwiki over SSL. -* --getctime +* --gettime, --no-gettime - Pull creation time for each new page out of the revision control - system. This rarely used option provides a way to get the real creation - times of items in weblogs, such as when building a wiki from a new - VCS checkout. It is unoptimised and quite slow. It is best used - with --rebuild, to force ikiwiki to get the ctime for all pages. + Extract creation and modification times for each new page from the + the revision control's log. This is done automatically when building a + wiki for the first time, so you normally do not need to use this option. * --set var=value This allows setting an arbitrary configuration variable, the same as if it - were set via a setup file. Since most options can be configured - using command-line switches, you will rarely need to use this, but it can be - useful for the odd option that lacks a command-line switch. + were set via a setup file. Since most commonly used options can be + configured using command-line switches, you will rarely need to use this. + +* --set-yaml var=value + + This is like --set, but it allows setting configuration variables that + use complex data structures, by passing in a YAML document. # EXAMPLES * ikiwiki --setup my.setup - Completly (re)build the wiki using the specified setup file. 
+ Completely (re)build the wiki using the specified setup file. * ikiwiki --setup my.setup --refresh @@ -345,6 +370,10 @@ also be configured using a setup file. This controls what C compiler is used to build wrappers. Default is 'cc'. +* CFLAGS + + This can be used to pass options to the C compiler when building wrappers. + # SEE ALSO * [[ikiwiki-mass-rebuild]](8) diff --git a/doc/users/BerndZeimetz.mdwn b/doc/users/BerndZeimetz.mdwn new file mode 100644 index 000000000..cf21dc585 --- /dev/null +++ b/doc/users/BerndZeimetz.mdwn @@ -0,0 +1,8 @@ +See [wiki.debian.org/BerndZeimetz](http://wiki.debian.org/BerndZeimetz) for details. + +<pre> + Bernd Zeimetz Debian GNU/Linux Developer + http://bzed.de http://www.debian.org + GPG Fingerprints: 06C8 C9A2 EAAD E37E 5B2C BE93 067A AD04 C93B FF79 + ECA1 E3F2 8E11 2432 D485 DD95 EB36 171A 6FF9 435F +</pre> diff --git a/doc/users/David_Riebenbauer.mdwn b/doc/users/David_Riebenbauer.mdwn index 372a28588..d7469696e 100644 --- a/doc/users/David_Riebenbauer.mdwn +++ b/doc/users/David_Riebenbauer.mdwn @@ -1,2 +1,8 @@ Runs ikiwiki on his [homepage](http://liegesta.at/) and can be reached through <davrieb@liegesta.at> + +## Branches in his [[git]] repository ## + +* `autotag` +([browse](http://git.liegesta.at/?p=ikiwiki.git;a=shortlog;h=refs/heads/autotag)) +See [[todo/auto-create_tag_pages_according_to_a_template]] diff --git a/doc/users/Edward_Betts.mdwn b/doc/users/Edward_Betts.mdwn index b32927a1c..61d6150ef 100644 --- a/doc/users/Edward_Betts.mdwn +++ b/doc/users/Edward_Betts.mdwn @@ -1,9 +1,4 @@ My watchlist: -[[!inline archive="yes" sort="mtime" atom="yes" pages=" -todo/allow_wiki_syntax_in_commit_messages* -todo/shortcut_with_different_link_text* -todo/structured_page_data* -tips/convert_mediawiki_to_ikiwiki* -"]] +[[!inline archive="yes" sort="mtime" atom="yes" pages="todo/allow_wiki_syntax_in_commit_messages* or todo/shortcut_with_different_link_text* or todo/structured_page_data* or tips/convert_mediawiki_to_ikiwiki*"]] diff --git a/doc/users/KarlMW/discussion.mdwn b/doc/users/KarlMW/discussion.mdwn index 9117abcab..4a111a3f9 100644 --- a/doc/users/KarlMW/discussion.mdwn +++ b/doc/users/KarlMW/discussion.mdwn @@ -23,3 +23,5 @@ things that need changing then I will probably need help/guidance. >> I suspect that asciidoc can't really be made to play nice to the extent that I would want casual users/abusers to have it as a markup option on a live wiki - it's fine for a personal site where you can look at the output before putting it online, but I think it would be a hideously gaping integrity hole for anything more than that. However, for a personal site (as I am using it), it does seem to have its uses. >> I'll keep an eye on the format_escape plugin, and assuming it is accepted into ikiwiki, will see if I can apply it to asciidoc. --[[KarlMW]] + +Is there any way to enable latexmath rendering? It semes that ikiwiki strips the necessary javascript and/or style sheet information from the HTML page generated by asciidoc. --Peter diff --git a/doc/users/KathrynAndersen.mdwn b/doc/users/KathrynAndersen.mdwn new file mode 100644 index 000000000..8e827b0da --- /dev/null +++ b/doc/users/KathrynAndersen.mdwn @@ -0,0 +1,8 @@ +* aka [[rubykat]] +* <http://kerravonsen.dreamwidth.org> +* <http://www.katspace.org> (uses IkiWiki!) +* <http://github.com/rubykat> +* Also an active [PmWiki](http://www.pmwiki.org) user, interested in having the best of both worlds. 
+ +Has written the following plugins: +[[!map pages="!*/Discussion and ((link(users/KathrynAndersen) or link(users/rubykat)) and plugins/*) "]] diff --git a/doc/users/KathrynAndersen/discussion.mdwn b/doc/users/KathrynAndersen/discussion.mdwn new file mode 100644 index 000000000..4f2790c39 --- /dev/null +++ b/doc/users/KathrynAndersen/discussion.mdwn @@ -0,0 +1,20 @@ +Had a look at your site. Sprawling, individualistic, using ikiwiki in lots of +ways. Makes me happy. :) I see that I have let a lot of contrib plugins +pile up. I will try to get to these. I'm particularly interested in +your use of yaml+fields. Encourage you to go ahead with any others you +have not submitted here, like pmap. (Unless it makes more sense to submit +that as a patch to the existing map plugin.) --[[Joey]] + +> Thanks. I would have put more up, but I didn't want to until they were properly documented, and other things have taken a higher priority. + +> I think pmap is probably better as a separate plugin, because it has additional dependencies (HTML::LinkList) which people might not want to have to install. + +>> One approach commonly used in ikiwiki is to make such optional features +>> be enabled by a switch somewhere, and 'eval q{use Foo}` so the module +>> does not have to be loaded unless the feature is used. --[[Joey]] + +>>> Unfortunately, HTML::LinkList isn't an optional feature for pmap; that's what it uses to create the HTML for the map. --[[KathrynAndersen]] + +> The "includepage" plugin I'm not sure whether it is worth releasing or not; it's basically a cut-down version of "inline", because the inline plugin is so complicated and has so many options, I felt more at ease to have something simpler. + +> --[[KathrynAndersen]] diff --git a/doc/users/NicolasLimare.mdwn b/doc/users/NicolasLimare.mdwn index 003449d40..56a950f7e 100644 --- a/doc/users/NicolasLimare.mdwn +++ b/doc/users/NicolasLimare.mdwn @@ -1,9 +1 @@ -Nicolas (nil) uses ikiwiki on a site/wiki/blog/something... and feels this approach much more comfortable than the usual web-only ones. - -He didn't touch any perl code before using ikiwiki, ant that was the first opportunity to propose tiny patches. - -Actualy, he would have felt much more comfortable with a python ikiwiki... :) - -Can be reached at nicolas at limare.net - -By the way, I can make translations to french if needed. And maybe to japanese.
\ No newline at end of file +[[!meta redir="nil"]] diff --git a/doc/users/Oblomov.mdwn b/doc/users/Oblomov.mdwn new file mode 100644 index 000000000..be6e666cb --- /dev/null +++ b/doc/users/Oblomov.mdwn @@ -0,0 +1 @@ +Getting started with Ikiwiki, like the git backend a lot, would like to see a dynamic version of it. diff --git a/doc/users/Perry.mdwn b/doc/users/Perry.mdwn new file mode 100644 index 000000000..d10b8621f --- /dev/null +++ b/doc/users/Perry.mdwn @@ -0,0 +1 @@ +Just another IkiWiki user. diff --git a/doc/users/Remy.mdwn b/doc/users/Remy.mdwn new file mode 100644 index 000000000..5cde4c43d --- /dev/null +++ b/doc/users/Remy.mdwn @@ -0,0 +1 @@ +Test page diff --git a/doc/users/Will.mdwn b/doc/users/Will.mdwn index 043203dc3..1956263e0 100644 --- a/doc/users/Will.mdwn +++ b/doc/users/Will.mdwn @@ -2,7 +2,7 @@ I started using Ikiwiki as a way to replace [Trac](http://trac.edgewall.org/) wh Lately I've been using Ikiwiki for other things and seem to be scratching a few itches here and there. :) -I generally use my [[ikiwiki/openid]] login when editing here: <http://www.cse.unsw.edu.au/~willu/> +I generally use my [[ikiwiki/openid]] login when editing here: <http://www.cse.unsw.edu.au/~willu/> or <http://www.google.com/profiles/will.uther>. I have a git repository for some of my IkiWiki code: <http://www.cse.unsw.edu.au/~willu/ikiwiki.git>. @@ -13,14 +13,16 @@ Unless otherwise specified, any code that I post to this wiki I release under th ------ +Disabling these as I'm not using them much any more... + ### Open Bugs: -[[!inline pages="link(users/Will) and bugs/* and !bugs/done and !bugs/discussion and !link(patch) and !link(bugs/done) and !bugs/*/*" archive="yes" feeds="no" ]] +\[[!inline pages="link(users/Will) and bugs/* and !bugs/done and !bugs/discussion and !link(patch) and !link(bugs/done) and !bugs/*/*" archive="yes" feeds="no" ]] ### Open ToDos: -[[!inline pages="link(users/Will) and todo/* and !todo/done and !todo/discussion and !link(patch) and !link(todo/done) and !bugs/*/*" archive="yes" feeds="no" ]] +\[[!inline pages="link(users/Will) and todo/* and !todo/done and !todo/discussion and !link(patch) and !link(todo/done) and !bugs/*/*" archive="yes" feeds="no" ]] ### Unapplied Patches: -[[!inline pages="link(users/Will) and (todo/* or bugs/*) and !bugs/done and !bugs/discussion and !todo/done and !todo/discussion and link(patch) and !link(bugs/done) and !link(todo/done) and !bugs/*/*" archive="yes" feeds="no" ]] +\[[!inline pages="link(users/Will) and (todo/* or bugs/*) and !bugs/done and !bugs/discussion and !todo/done and !todo/discussion and link(patch) and !link(bugs/done) and !link(todo/done) and !bugs/*/*" archive="yes" feeds="no" ]] diff --git a/doc/users/adamshand.mdwn b/doc/users/adamshand.mdwn index 6127a8d70..5273c6439 100644 --- a/doc/users/adamshand.mdwn +++ b/doc/users/adamshand.mdwn @@ -1,5 +1,15 @@ [[!meta title="Adam Shand"]] -New IkiWiki user, long time wiki user. :-) +New ikiwiki user (well not really "new" anymore), long time wiki user. :-) <http://adam.shand.net/iki/> + +[[!map pages="link(AdamShand)"]] + +<!-- for map bug +## Correct? (No extra ULs) +\[[!map pages="setup*" show="title"]] + +## Bug? (Extra UL for each LI) +\[[!map pages="tagged(done) and tagged(patch)" show="title"]] +--> diff --git a/doc/users/anarcat.wiki b/doc/users/anarcat.wiki new file mode 100644 index 000000000..7ef474ed6 --- /dev/null +++ b/doc/users/anarcat.wiki @@ -0,0 +1 @@ +Hello! I'm anarcat. See [[https://wiki.koumbit.net/TheAnarcat]] to know more about me. 
diff --git a/doc/users/blipvert.mdwn b/doc/users/blipvert.mdwn new file mode 100644 index 000000000..7c4a24ba1 --- /dev/null +++ b/doc/users/blipvert.mdwn @@ -0,0 +1 @@ +<http://github.com/blipvert> diff --git a/doc/users/chrysn.mdwn b/doc/users/chrysn.mdwn new file mode 100644 index 000000000..0daa3b2b9 --- /dev/null +++ b/doc/users/chrysn.mdwn @@ -0,0 +1,4 @@ +* **name**: chrysn +* **website**: <http://christian.amsuess.com/> +* **uses ikiwiki for**: a bunch of internal documentation / organization projects +* **likes ikiwiki because**: it is a distributed organization tool that pretends to be a web app for the non-programmers out there diff --git a/doc/users/dark.mdwn b/doc/users/dark.mdwn new file mode 100644 index 000000000..e1d06d0b0 --- /dev/null +++ b/doc/users/dark.mdwn @@ -0,0 +1,3 @@ +[[!meta title="Richard Braakman"]] + +Lars Wirzenius convinced me to try ikiwiki for blogging :) diff --git a/doc/users/ericdrechsel.mdwn b/doc/users/ericdrechsel.mdwn new file mode 100644 index 000000000..2efb7039c --- /dev/null +++ b/doc/users/ericdrechsel.mdwn @@ -0,0 +1 @@ +[My homewiki profile](http://wiki.shared.dre.am/people/eric/) diff --git a/doc/users/fmarier.mdwn b/doc/users/fmarier.mdwn new file mode 100644 index 000000000..ecf342697 --- /dev/null +++ b/doc/users/fmarier.mdwn @@ -0,0 +1,6 @@ +# François Marier + +Free Software and Debian Developer. Lead developer of [Libravatar](http://www.libravatar.org) + +* [Blog](http://feeding.cloud.geek.nz) +* [Identica](http://identi.ca/fmarier) / [Twitter](http://twitter.com/fmarier) diff --git a/doc/users/harishcm.mdwn b/doc/users/harishcm.mdwn index 47f28c83c..292a3bfad 100644 --- a/doc/users/harishcm.mdwn +++ b/doc/users/harishcm.mdwn @@ -1 +1 @@ -Using ikiwiki for my yet to be publish personal website :) +Using ikiwiki for my personal website <http://harish.19thsc.com> diff --git a/doc/users/ivan_shmakov.mdwn b/doc/users/ivan_shmakov.mdwn new file mode 100644 index 000000000..4123e0fc6 --- /dev/null +++ b/doc/users/ivan_shmakov.mdwn @@ -0,0 +1,50 @@ +… To put it short: an Ikiwiki newbie. + +[Emacs]: http://www.gnu.org/software/emacs/ +[Lynx]: http://lynx.isc.org/ + +## Wikis + +Currently, I run a couple of Ikiwiki instances. Namely: + +* <http://lhc.am-1.org/lhc/> + — to hold random stuff written by me, my colleagues, + students, etc. + +* <http://rsdesne.am-1.org/rsdesne-2010/> + — for some of the materials related to the + “Remote Sensing in Education, Science and National + Economy” (2010-03-29 … 2010-04-10, Altai State + University) program I've recently participated in as + an instructor. + +## Preferences + +I prefer to use [Lynx][] along with [Emacs][] (via +`emacsclient`) to work with the wikis. (Note the “Local +variables” section below.) + +The things I dislike in the wiki engines are: + +* the use of home-brew specialized version control systems + — while there're a lot of much more developed general + purpose ones; + +* oversimplified syntax + — which (to some extent) precludes more sophisticated + forms of automated processing; in particular, this forces one + to reformat the material, once complete, to, say, prepare a + book, or an article, or slides. + +Out of all the wiki engines I'm familiar with, only Ikiwiki is +free of the first of these. I hope that it will support more +elaborate syntaxes eventually. 
+ +---- + + Local variables: + mode: markdown + coding: utf-8 + fill-column: 64 + ispell-dictionary: "american" + End: diff --git a/doc/users/jasonblevins.mdwn b/doc/users/jasonblevins.mdwn index b50e4844a..e4a459e30 100644 --- a/doc/users/jasonblevins.mdwn +++ b/doc/users/jasonblevins.mdwn @@ -1,66 +1,46 @@ [[!meta title="Jason Blevins"]] -I'm currently hosting a private ikiwiki for keeping research notes -which, with some patches and a plugin (below), will -convert inline [[todo/LaTeX]] expressions to [[MathML]]. I'm working towards a -patchset and instructions for others to do the same. - -I've setup a test ikiwiki [here](http://xbeta.org/colab/) where I've -started keeping a few notes on my progress. There is an example of -inline [[todo/SVG]] on the homepage (note that the logo scales along with the -font size). There are a few example mathematical expressions in the -[sandbox](http://xbeta.org/colab/sandbox/). The MathML is generated -automatically from inline LaTeX expressions using an experimental -plugin I'm working on. - -My (also MathML-enabled) homepage: <http://jblevins.org/> (still using -Blosxom...maybe one day I'll convert it to ikiwiki...) - -Current ikiwki issues of interest: - - * [[bugs/recentchanges_feed_links]] - * [[bugs/HTML_inlined_into_Atom_not_necessarily_well-formed]] - * [[plugins/toc/discussion]] - * [[todo/BibTeX]] - * [[todo/svg]] - * [[todo/Option_to_make_title_an_h1?]] - * [[bugs/SVG_files_not_recognized_as_images]] +I am a former Ikiwiki user who wrote several plugins and patches +related to MathML, [[SVG|todo/svg]], and [[todo/syntax highlighting]]. +Some related links and notes are archived below. + +Homepage: <http://jblevins.org/> ## Plugins -These plugins are experimental. Use them at your own risk. Read the -perldoc documentation for more details. Patches and suggestions are -welcome. +The following [plugins](http://jblevins.org/projects/ikiwiki/) +are no longer maintained, but please feel free to use, modify, and +redistribute them. Read the corresponding perldoc documentation for +more details. - * [mdwn_itex][] - Works with the [[`mdwn`|plugins/mdwn]] plugin to convert inline [[todo/LaTeX]] - expressions to [[MathML]] using `itex2MML`. + * [mdwn_itex][] - Works with the [[`mdwn`|plugins/mdwn]] plugin to convert + inline [[todo/LaTeX]] expressions to MathML using `itex2MML`. * [h1title][] - If present, use the leading level 1 Markdown header to set the page title and remove it from the page body. * [code][] - Whole file and inline code snippet [[todo/syntax highlighting]] via GNU Source-highlight. The list of supported file extensions is - configurable. There is also some preliminary [documentation][code-doc]. - See the [FortranWiki](http://fortranwiki.org) for examples. + configurable. - * [metamail][] - a plugin for loading metadata from [[email]]-style + * [metamail][] - a plugin for loading metadata from email-style headers at top of a file (e.g., `title: Page Title` or `date: November 2, 2008 11:14 EST`). - * [pandoc][] - [[ikiwiki/Markdown]] page processing via [Pandoc](http://johnmacfarlane.net/pandoc/) (a Haskell library for converting from one markup format to another). [[todo/LaTeX]] and + * [pandoc][] - [[ikiwiki/Markdown]] page processing via + [Pandoc](http://johnmacfarlane.net/pandoc/) (a Haskell library for + converting from one markup format to another). [[todo/LaTeX]] and [[reStructuredText|plugins/rst]] are optional. * [path][] - Provides path-specific template conditionals such as `IS_HOMEPAGE` and `IN_DIR_SUBDIR`. 
- [mdwn_itex]: http://code.jblevins.org/ikiwiki/plugins.git/plain/mdwn_itex.pm - [h1title]: http://code.jblevins.org/ikiwiki/plugins.git/plain/h1title.pm - [code]: http://code.jblevins.org/ikiwiki/plugins.git/plain/code.pm - [code-doc]: http://code.jblevins.org/ikiwiki/plugins.git/plain/code.text - [metamail]: http://code.jblevins.org/ikiwiki/plugins.git/plain/metamail.pm - [pandoc]: http://code.jblevins.org/ikiwiki/plugins.git/plain/pandoc.pm - [path]: http://code.jblevins.org/ikiwiki/plugins.git/plain/path.pm - + [mdwn_itex]: http://jblevins.org/git/ikiwiki/plugins.git/plain/mdwn_itex.pm + [h1title]: http://jblevins.org/git/ikiwiki/plugins.git/plain/h1title.pm + [code]: http://jblevins.org/projects/ikiwiki/code + [metamail]: http://jblevins.org/git/ikiwiki/plugins.git/plain/metamail.pm + [pandoc]: http://jblevins.org/git/ikiwiki/plugins.git/plain/pandoc.pm + [path]: http://jblevins.org/git/ikiwiki/plugins.git/plain/path.pm ## MathML and SVG support @@ -105,5 +85,5 @@ optimal solution is to force users to preview the page before saving. That way if someone introduces invalid XHTML then they can't save the page in the first place (unless they post directly to the right URL). - [template-patch]: http://xbeta.org/gitweb/?p=xbeta/ikiwiki.git;a=blobdiff;f=templates/page.tmpl;h=380ef699fa72223744eb5c1ee655fb79aa6bce5b;hp=9084ba7e11e92a10528b2ab12c9b73cf7b0f40a7;hb=416d5d1b15b94e604442e4e209a30dee4b77b684;hpb=ececf4fb8766a4ff7eff943b3ef600be81a0df49 - [cgi-patch]: http://xbeta.org/gitweb/?p=xbeta/ikiwiki.git;a=commitdiff;h=fa538c375250ab08f396634135f7d79fce2a9d36 + [template-patch]: http://jblevins.org/git/ikiwiki.git/commit/?h=xbeta&id=416d5d1b15b94e604442e4e209a30dee4b77b684 + [cgi-patch]: http://jblevins.org/git/ikiwiki.git/commit/?id=fa538c375250ab08f396634135f7d79fce2a9d36 diff --git a/doc/users/jasonriedy.mdwn b/doc/users/jasonriedy.mdwn new file mode 100644 index 000000000..c94e8e4be --- /dev/null +++ b/doc/users/jasonriedy.mdwn @@ -0,0 +1 @@ +I'm over [thattaway](http://lovesgoodfood.com/jason), although sometimes more easily caught [on identi.ca](http://identi.ca/jasonriedy). diff --git a/doc/users/jaywalk.mdwn b/doc/users/jaywalk.mdwn new file mode 100644 index 000000000..31b3e0c4d --- /dev/null +++ b/doc/users/jaywalk.mdwn @@ -0,0 +1,5 @@ +Jonatan Walck. Home page: [jonatan.walck.se](http://jonatan.walck.se) + +## Contact ## +* jonatan at walck dot se +* [I2P-Bote key](http://jonatan.walck.i2p/docs/i2p-bote.txt) [what is this?](http://i2pbote.i2p.to/) diff --git a/doc/users/jcorneli.mdwn b/doc/users/jcorneli.mdwn new file mode 100644 index 000000000..6c11eac09 --- /dev/null +++ b/doc/users/jcorneli.mdwn @@ -0,0 +1,3 @@ +I am a Ph. D. student at the Knowledge Media Institute of The Open University, UK. I'm also on the board of directors of [PlanetMath.org](http://planetmath.org), and a contributor to the [Planetary](http://trac.mathweb.org/planetary) project where we are rebuilding PlanetMath's backend (features will include some with significant inspiration from ikiwiki). + +My personal homepage is here: [http://metameso.org/~joe](http://metameso.org/~joe) diff --git a/doc/users/jeanprivat.mdwn b/doc/users/jeanprivat.mdwn new file mode 100644 index 000000000..4d75a9867 --- /dev/null +++ b/doc/users/jeanprivat.mdwn @@ -0,0 +1 @@ +Jean Privat is <jean@pryen.org>. 
diff --git a/doc/users/jerojasro.mdwn b/doc/users/jerojasro.mdwn new file mode 100644 index 000000000..e2e620d3f --- /dev/null +++ b/doc/users/jerojasro.mdwn @@ -0,0 +1,3 @@ +Javier Rojas + +I keep a personal [wiki](http://devnull.li/~jerojasro/wiki) and my [blog](http://devnull.li/~jerojasro/blog) in ikiwiki. diff --git a/doc/users/jmtd.mdwn b/doc/users/jmtd.mdwn new file mode 100644 index 000000000..a816baf6f --- /dev/null +++ b/doc/users/jmtd.mdwn @@ -0,0 +1 @@ +[[!meta redir=users/jon]] diff --git a/doc/users/jogo.mdwn b/doc/users/jogo.mdwn index 2a6577990..e8068a10f 100644 --- a/doc/users/jogo.mdwn +++ b/doc/users/jogo.mdwn @@ -1,3 +1,5 @@ -I'm looking at Ikiwiki, searching the best Wiki. The only other one I've found is [werc](http://werc.cat-v.org/). + * An [economic game](http://sef.matabio.net/) in french, which [use](http://sef.matabio.net/wiki/) IkiWiki. + * Some [plugins](http://www.matabio.net/tcgi/hg/IkiPlugins/file/). + * An alternative [base wiki](http://www.matabio.net/tcgi/hg/FrIkiWiki/file/) in french. email: `jogo matabio net`. diff --git a/doc/users/jon.mdwn b/doc/users/jon.mdwn index 54f1ac383..eb01821ff 100644 --- a/doc/users/jon.mdwn +++ b/doc/users/jon.mdwn @@ -4,20 +4,35 @@ team-documentation management system for a small-sized group of UNIX sysadmins. * my edits should appear either as 'Jon' (if I've used - [[tips/untrusted_git_push]]) or 'jmtd.net' (or once upon a time - 'alcopop.org/me/openid/' or 'jondowland'). + [[tips/untrusted_git_push]]); 'jmtd.net', 'jmtd.livejournal.com', + 'jmtd' if I've forgotten to set my local git config properly, + or once upon a time 'alcopop.org/me/openid/' or 'jondowland'. * My [homepage](http://jmtd.net/) is powered by ikiwiki I gave a talk at the [UK UNIX User's Group](http://www.ukuug.org/) annual -[Linux conference](http://www.ukuug.org/events/linux2008/) about organising -system administrator documentation. Roughly a third of this talk was -discussing IkiWiki in some technical detail and suggesting it as a good piece -of software for this task. +[Linux conference](http://www.ukuug.org/events/linux2008/) in 2008 about +organising system administrator documentation. Roughly a third of this talk +was discussing IkiWiki in some technical detail and suggesting it as a good +piece of software for this task. * slides at <http://www.staff.ncl.ac.uk/jon.dowland/unix/docs/>. I am also working on some ikiwiki hacks: +* [[todo/allow site-wide meta definitions]] +* Improving the means by which you can migrate from mediawiki to + IkiWiki. See [[tips/convert mediawiki to ikiwiki]] and the + [[plugins/contrib/mediawiki]] plugin. + +I am mostly interested in ikiwiki usability issues: + + * [[bugs/the login page is unclear when multiple methods exist]] + * [[bugs/backlinks onhover thing can go weird]] + * [[todo/CSS classes for links]] + * [[todo/adjust commit message for rename, remove]] + +The following I have been looking at, but are on the back-burner: + * an alternative approach to [[plugins/comments]] (see [[todo/more flexible inline postform]] for one piece of the puzzle; <http://dev.jmtd.net/comments/> for some investigation into making the post @@ -25,9 +40,21 @@ I am also working on some ikiwiki hacks: * a system for [[forum/managing_todo_lists]] (see also [[todo/interactive todo lists]] and <http://dev.jmtd.net/outliner/> for the current WIP). 
+* a `tag2` plugin, which does the same thing as [[plugins/tag]], but + does not sit on top of [[ikiwiki/wikilink]]s, so does not result in + bugs such as [[bugs/tagged() matching wikilinks]]. Code for this lives + in my github `tag2` branch: <http://github.com/jmtd/ikiwiki> -I am currently mostly interested in ikiwiki usability issues: +Penultimately, the following are merely half-formed thoughts: - * [[bugs/the login page is unclear when multiple methods exist]] - * [[bugs/backlinks onhover thing can go weird]] - * [[todo/CSS classes for links]] + * adding and removing tags to pages via the edit form by ticking and + unticking checkboxes next to a tag name (rather than entering the + directive into the text of the page directly) + * perhaps the same for meta + * I'd like to make profiling ikiwiki in action very easy for newcomers. + Perhaps even a plugin that created a file /profile or similar on build. + +Finally, backlinks (since I have issues with the current backlinks +implementation, see [[bugs/backlinks onhover thing can go weird]]): + +[[!inline pages="link(users/Jon)" archive="yes" feeds="no"]] diff --git a/doc/users/joshtriplett.mdwn b/doc/users/joshtriplett.mdwn index f85c068c3..29178057c 100644 --- a/doc/users/joshtriplett.mdwn +++ b/doc/users/joshtriplett.mdwn @@ -6,9 +6,10 @@ Email: `josh@{joshtriplett.org,freedesktop.org,kernel.org,psas.pdx.edu}`. Proud user of ikiwiki. -Currently working on scripts to convert MoinMoin and TWiki wikis to -ikiwikis backed by a git repository, including full history. -Available from the following repositories, though not well-documented: +Worked on scripts to convert MoinMoin and TWiki wikis to ikiwikis backed by a +git repository, including full history. Used for a couple of wikis, and now no +longer maintained, but potentially still useful. Available from the following +repositories, though not well-documented: git clone git://svcs.cs.pdx.edu/git/wiki2iki/moin2iki git clone git://svcs.cs.pdx.edu/git/wiki2iki/html-wikiconverter diff --git a/doc/users/joshtriplett/discussion.mdwn b/doc/users/joshtriplett/discussion.mdwn new file mode 100644 index 000000000..16e9be057 --- /dev/null +++ b/doc/users/joshtriplett/discussion.mdwn @@ -0,0 +1,66 @@ +Can we please have a very brief HOWTO? + +I have a Moin wiki in /var/www/wiki and want to create an IkIwiki clone of it in /var/www/ikiwiki backed by a git repos in /data/ikiwiki. + +I tried: + + mkdir /var/www/ikiwiki + mkdir /data/ikiwiki + PATH=.:/usr/lib/git-core:$PATH ./moin2iki /data/ikiwiki http://localhost/wiki + +Help please!but this failed. (BTW, I don't usually put . in my PATH). The failure appears to be that the converter doesn't actually create an ikiwiki instance, but appears to want to update one: + + fatal: ambiguous argument 'master': unknown revision or path not in the working tree. + Use '--' to separate paths from revisions + fatal: ambiguous argument 'master': unknown revision or path not in the working tree. 
+ Use '--' to separate paths from revisions + fatal: Not a valid object name master + Traceback (most recent call last): + File "/home/peterc/src/moin2iki/git-map", line 125, in <module> + if __name__ == "__main__": sys.exit(main(sys.argv[1:])) + File "/home/peterc/src/moin2iki/git-map", line 117, in main + print git_map_file('commit', new_head) + File "/home/peterc/src/moin2iki/git-map", line 33, in git_map_file + f(inproc.stdout, outproc.stdin, sha, arg) + File "/home/peterc/src/moin2iki/git-map", line 64, in handle_commit + string, tree = lines.pop(0).split() + IndexError: pop from empty list + +OK, so I created one: + + ikiwiki --setup /etc/ikiwiki/auto.setup + ..... +This process created several files and directories in my home directory: + + wiki.git/ + public_html/wiki/ + wiki.setup + .ikiwiki/ + +Following the instructions on the setup page, I did: + mv wiki.git /data/ikiwiki + ( cd /data/ikiwiki; git clone -l wiki.git wiki; ) + mv .ikiwiki /data/ikiwiki/ikiwiki + mv ~/public_html/wiki /var/ikiwiki/ + +then did again + + PATH=.:/usr/lib/git-core:$PATH ./moin2iki /data/ikiwiki/wiki http://www/wiki + +and saw no output, and no change to the filesystem. + +I'm totally confused. It looks as though the script calls moin2git iff the target directory isn't there, but the script fails in interesting ways if it is. + +The other thing I saw was: + + 2009-12-04 09:00:31,542 WARNING MoinMoin.log:139 using logging configuration read from built-in fallback in MoinMoin.log module! + Traceback (most recent call last): + File "./moin2git", line 128, in <module> + if __name__ == '__main__': main(*sys.argv[1:]) + File "./moin2git", line 43, in main + r = request.RequestCLI() + AttributeError: 'module' object has no attribute 'RequestCLI' + +Moin version is 1.8.5 + +Help please! diff --git a/doc/users/justint.mdwn b/doc/users/justint.mdwn new file mode 100644 index 000000000..23db51566 --- /dev/null +++ b/doc/users/justint.mdwn @@ -0,0 +1 @@ +Casual ikiwiki user. diff --git a/doc/users/nil.mdwn b/doc/users/nil.mdwn new file mode 100644 index 000000000..e1826cec6 --- /dev/null +++ b/doc/users/nil.mdwn @@ -0,0 +1,8 @@ +nil first used ikiwiki on a site/wiki/blog/something... and felt this approach much more comfortable than the usual web-only ones. +Since then, ikiwiki is a kind of swiss army knife when it comes to build anything for the web. + +Can be reached at nicolas at limare.net + +The current big ikiwiki-powered project is <http://www.ipol.im> + +TODO: document "how to split public/edition interfaces" diff --git a/doc/users/rubykat.mdwn b/doc/users/rubykat.mdwn new file mode 100644 index 000000000..f37d13306 --- /dev/null +++ b/doc/users/rubykat.mdwn @@ -0,0 +1 @@ +See [[KathrynAndersen]]. diff --git a/doc/users/schmonz.mdwn b/doc/users/schmonz.mdwn index 7ebd8311c..ec282c990 100644 --- a/doc/users/schmonz.mdwn +++ b/doc/users/schmonz.mdwn @@ -1,3 +1,5 @@ -[Amitai Schlair](http://www.columbia.edu/~ays2105/) recently discovered ikiwiki and finds himself using it for all sorts of things. His attempts at contributing: +[Amitai Schlair](http://www.netbsd.org/~schmonz/) recently discovered ikiwiki and finds himself using it for all sorts of things. His attempts at contributing: [[!map pages="!*/Discussion and ((link(users/schmonz) and plugins/*) or rcs/cvs)"]] + +I've also written a plugin for [WIND authentication](http://www.columbia.edu/acis/rad/authmethods/wind/), which may or may not be of general utility. 
diff --git a/doc/users/simonraven.mdwn b/doc/users/simonraven.mdwn index 5fc24711e..13681a674 100644 --- a/doc/users/simonraven.mdwn +++ b/doc/users/simonraven.mdwn @@ -1,8 +1,7 @@ ## personal/site info -New ikiwiki site at my web site, blog, kisikew.org home site, for indigenews, and our indigenous-centric wiki (mostly East Coast/Woodlands area). Mediawiki stuff was imported successfully (as noted on this web site). - +Have several ikiwiki-based sites at my web site, blog, kisikew.org home site, for indigenews, and our indigenous-centric wiki (mostly East Coast/Woodlands area). ## ikiwiki branch at github -Maintain my own branch, partly to learn about VCS, git, ikiwiki, Debian packaging, and Perl. I don't recommend anyone pull from it, as I use third-party plugins included on this site that people may not want in a default installation of ikiwiki. This is why I don't push to Joey's -- so it's nothing personal, I just don't want to mess things up for other people, from my mistakes and stumbles. +Maintain my own branch, partly to learn about VCS, git, ikiwiki, Debian packaging, and Perl. Thinking of removing most 3rd-party plugins (found in contrib/). Have some custom plugins to support dual bottom-of-the-page "sidebars" and an attempt at supporting HTTPBL (see projecthoneypot.org). diff --git a/doc/users/sunny256.mdwn b/doc/users/sunny256.mdwn new file mode 100644 index 000000000..faf829358 --- /dev/null +++ b/doc/users/sunny256.mdwn @@ -0,0 +1,15 @@ +I'm Øyvind A. Holm, a Norwegian guy who's been in love with \*NIX-like operating systems since I first tried [QNX](http://www.qnx.com/) in 1987. +Then, after playing around with [Coherent](http://en.wikipedia.org/wiki/Coherent_%28operating_system%29) for a while, I finally got on the Linux bandwagon at kernel 1.2.8 in 1995. + +I live in Bergen, Norway, at [N 60.37436° E 5.3471°](http://www.openstreetmap.org/?mlat=60.374252&mlon=5.34722&zoom=16&layers=M), to be specific. +I'm quite passionate about Open Source in general, freedom of speech, music and science. +I'm a photo enthusiast, musician now and then, atheist and some kind of anarchist. + +Most of the places on the Net I hang around are listed on my [Google profile](http://www.google.com/profiles/sunny256). + +I discovered ikiwiki on 2011-02-15, and immediately clicked with it. +One week later it had replaced everything on [my web server](http://www.sunbase.org), after using some homegrown CMS written in Perl for years, [Mediawiki](http://www.mediawiki.org), [Drupal](http://drupal.org) and whatnot. +Seems as I've found the perfect system at last. +Thanks for creating it, Joey. + +I have a clone of the ikiwiki repository at <https://github.com/sunny256/ikiwiki> where patches go. 
diff --git a/doc/users/svend.mdwn b/doc/users/svend.mdwn index 69d83584f..712a0d3e7 100644 --- a/doc/users/svend.mdwn +++ b/doc/users/svend.mdwn @@ -1,4 +1,4 @@ [[!meta title="Svend Sorensen"]] -* [website](http://www.ciffer.net/~svend/) -* [blog](http://www.ciffer.net/~svend/blog/) +* [website](http://ciffer.net/~svend/) +* [blog](http://ciffer.net/~svend/blog/) diff --git a/doc/users/tschwinge.mdwn b/doc/users/tschwinge.mdwn index bb5cef6a6..435208a71 100644 --- a/doc/users/tschwinge.mdwn +++ b/doc/users/tschwinge.mdwn @@ -1,11 +1,151 @@ [[!meta title="Thomas Schwinge"]] # Thomas Schwinge -<tschwinge@gnu.org> -<http://www.thomas.schwinge.homeip.net/> +<thomas@schwinge.name> +<http://schwinge.homeip.net/~thomas/> I have converted the [GNU Hurd](http://www.gnu.org/software/hurd/)'s previous web pages and previous wiki pages to a *[[ikiwiki]]* system; and all that while preserving the previous content's history, which was stored in a CVS repository for the HTML web pages and a TWiki RCS repository for the wiki; see <http://www.gnu.org/software/hurd/colophon.html>. + +# Issues to Work On + +## Stability of Separate Builds + +The goal is that separate builds of the same source files should yield the +exactly same HTML code (of course, except for changes due to differences in +Markdown rendering, for example). + + * Timestamps -- [[forum/ikiwiki__39__s_notion_of_time]], [[forum/How_does_ikiwiki_remember_times__63__]] + + Git set's the current *mtime* when checking out files. The result is that + <http://www.gnu.org/software/hurd/contact_us.html> and + <http://www.bddebian.com:8888/~hurd-web/contact_us/> show different *Last + edited* timestamps. + + This can either be solved by adding a facility to Git to set the + checked-out files' *mtime* according to the *AuthorDate* / *CommitDate* + (which one...), or doing that retroactively with the + <http://www.gnu.org/software/hurd/set_mtimes> script before building, or + with a ikiwiki-internal solution. + + * HTML character entities + + <http://www.gnu.org/software/hurd/purify_html> + +### \[[!map]] behavior + +The \[[!map]] on, for example, +<http://www.gnu.org/software/hurd/tag/open_issue_hurd.html>, should not show +the complete hierarchy of pages, but instead just the pages that actually *do* +contain the \[[!tag open_issue_hurd]]. + +> `tagged(open_issue_hurd)` in its pagespec should do that. --[[Joey]] + +>> Well, that's exactly what this page contains: \[[!map +>> pages="tagged(open_issue_hurd) and !open_issues and !*/discussion" +>> show=title]] +>> +>> This is currently rendered as can be seen on +>> <http://www.gnu.org/software/hurd/tag/open_issue_hurd.html>, but I'd imagine +>> it to be rendered by **only** linking to the pages that actually do contain +>> the tag, (**only** the outer leaf ones, which are *capturing stdout and +>> stderr*, *ramdisk*, *syncfs*, ...; but **not** to *hurd*, *debugging*, +>> *translator*, *libstore*, *examples*, ...). Otherwise, the way it's being +>> rendered at the moment, it appears to the reader that *hurd*, *debugging*, +>> *translator*, *libstore*, *examples*, ... were all tagged, too, and not only +>> the outer ones. + +## Anchors -- [[ikiwiki/wikilink/discussion]] + +## Default Content for Meta Values -- [[plugins/contrib/default_content_for___42__copyright__42___and___42__license__42__]] + +This will decrease to be relevant, as we're going to add copyright and +licensing headers to every single file. + +## [[bugs/img vs align]] + +## Texinfo -- [[plugins/contrib/texinfo]] + +Not very important. 
Have to consider external commands / files / security (see +[[plugins/teximg]] source code)? + +## Shortcuts -- [[plugins/shortcut/discussion]] + +## \[[!meta redir]] -- [[todo/__42__forward__42__ing_functionality_for_the_meta_plugin]] + +Implement a checker that makes sure that no pages that use \[[!meta redir]] +redirect to another page (and are thus considered legacy pages for providing +stable URLs, for example) are linked to from other wiki pages. This is useful +w.r.t. backlinks. Alternative, the backlinks to the \[[!meta redir]]-using +pages could perhaps be passed on to the referred-to page? + +> I found that backlinks was an easy way to find such links to such pages. +> (Although the redirection made it hard to see the backlinks!) --[[Joey]] + +## \[[!meta redir]] -- tell what's going on + +Add functionality that a text like *this page's content has moved to [new +page]; in a few seconds you'll be redirected thither* is displayed on every +page that uses \[[!meta redir]]. + +## Sendmail -- [[todo/passwordauth:_sendmail_interface]] + +## [[bugs/Broken Parentlinks]] + +## Modifying [[plugins/inline]] for showing only an *appetizer* + +Currently ikiwiki's inline plugin will either show the full page or nothing of +it. Often that's too much. One can manually use the [[plugins/toggle]] plugin +-- see the *News* section on <http://www.gnu.org/software/hurd/>. Adding a new +mode to the inline plugin to only show an *appetizer* ending with *... (read +on)* after a customizable amount of characters (or lines) would be a another +possibility. The *... (read on)* would then either toggle the full content +being displayed or link to the complete page. + +> You're looking for [[plugins/more]] (or possibly a way to do that automatically, +> I suppose. --[[Joey]] + +## Prefix For the HTML Title + +The title of each page (as in `<html><head><title>`...) should be prefixed with +*GNU Project - GNU Hurd -*. We can either do this directly in `page.tmpl`, or +create a way to modify the `TITLE` template variable suitably. + +## [[plugins/inline]] feedfile option + +Not that important. Git commit b67632cdcdd333cf0a88d03c0f7e6e62921f32c3. This +would be nice to have even when *not* using *usedirs*. Might involve issues as +discussed in *N-to-M Mapping of Input and Output Files* on +[[plugins/contrib/texinfo]]. + +## Unverified -- these may be bugs, but have yet to be verified + + * ikiwiki doesn't change its internal database when \[[!meta date]] / + \[[!meta updated]] are added / removed, and thusly these meta values are + not promulgated in RSS / Atom feeds. + + > I would rather see this filed as a bug, but FWIW, the problem + > is probably that meta does not override the mdate_3339 + > template variable used by the atom and rss templates. + > (Meta does store ctime directly in the ikiwiki database, but cannot + > store mtime in \%pagemtime because it would mess up detection of when + > actual file mtimes change.) --[[Joey]] + + * Complicated issue w.r.t. *no text was copied in this page* + ([[plugins/cutpaste]]) in RSS feed (only; not Atom?) under some conditions + (refresh only, but not rebuild?). Perhaps missing to read in / parse some + files? + [[Reported|bugs/Error:_no_text_was_copied_in_this_page_--_missing_page_dependencies]]. + + * [[plugins/recentchanges]] + + * Creates non-existing links to changes. + + * Invalid *directory link* with `--usedirs`. + + * Doesn't honor `$timeformat`. + + * Does create `recentchangees.*` files even if that is overridden. 
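A hypothetical sketch of the *set the checked-out files' mtime from the revision history* idea mentioned above under *Stability of Separate Builds*; this is not the referenced `set_mtimes` script, merely an illustration that, in a git checkout, gives each tracked file the timestamp of the last commit touching it.

    use strict;
    use warnings;

    # Set each tracked file's mtime to the time of the last commit touching
    # it, so independent checkouts show the same "Last edited" timestamps.
    open(my $files, '-|', 'git', 'ls-files', '-z') or die "git ls-files: $!";
    {
        local $/ = "\0";
        while (my $file = <$files>) {
            chomp $file;
            open(my $log, '-|', 'git', 'log', '-1', '--format=%ct', '--', $file)
                or next;
            my $ts = <$log>;
            close $log;
            next unless defined $ts && $ts =~ /^(\d+)/;
            utime($1, $1, $file);    # sets both atime and mtime
        }
    }
    close $files;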
diff --git a/doc/users/tupyakov_vladimir.mdwn b/doc/users/tupyakov_vladimir.mdwn new file mode 100644 index 000000000..95f85adc2 --- /dev/null +++ b/doc/users/tupyakov_vladimir.mdwn @@ -0,0 +1 @@ +всем привет! diff --git a/doc/users/weakish.mdwn b/doc/users/weakish.mdwn index 30a14d303..a5f252e75 100644 --- a/doc/users/weakish.mdwn +++ b/doc/users/weakish.mdwn @@ -1,3 +1,3 @@ email: weakish@gmail.com -openid: <http://weakish.pigro.net> +website: <http://weakish.github.com> diff --git a/doc/users/wentasah.mdwn b/doc/users/wentasah.mdwn new file mode 100644 index 000000000..3363c1d8f --- /dev/null +++ b/doc/users/wentasah.mdwn @@ -0,0 +1,9 @@ +My homepage: <http://rtime.felk.cvut.cz/~sojka/> + +My other ikiwikis: + +- <http://support.dce.felk.cvut.cz/osp/> +- <http://support.dce.felk.cvut.cz/psr/> +- <http://frsh-forb.sourceforge.net/> +- <http://orte.sourceforge.net/> +- <http://ortcan.sourceforge.net/> diff --git a/doc/users/wtk.mdwn b/doc/users/wtk.mdwn new file mode 100644 index 000000000..a34473577 --- /dev/null +++ b/doc/users/wtk.mdwn @@ -0,0 +1,6 @@ +[[!meta title="W. Trevor King"]] + +* Git branch: `wtk`. +* [Ikiwiki-based blog][blog] + +[blog]: http://www.physics.drexel.edu/~wking/unfolding-disasters/ diff --git a/doc/users/xma.mdwn b/doc/users/xma.mdwn deleted file mode 100644 index 89f2ff74c..000000000 --- a/doc/users/xma.mdwn +++ /dev/null @@ -1,28 +0,0 @@ -[[!meta title="Xavier Maillard"]] -# Xavier Maillard - -I just started using [[ikiwiki]] for my own webspace at http://maillard.mobi/~xma/wiki - -I am learning how to effectively use it. - -Anyway, [[ikiwiki]] is really *awesome* ! - -## More about me - -I am CLI user living in the linux console. More precisely, I live in an [[GNU_Emacs]] frame all day long. My main computer is an EeePC 901 running Slackware GNU/Linux 12.1. I do not have X installed (too lazy) but when in X, I am running an instance of [[CLFSWM]]. - -## Contacting me - -Various channels to contact me: - -- mail: xma@gnu.org -- jabber: xma01@jabber.fr -- mobile: +33 621-964-362 (I only anwser to people I know though) - -Voila. - -## Plans - -I am planning to make a presentation of [[ikiwiki]]to my [local LUG](http://lolica.org) for our next montly meeting. Any help would be greatly appreciated. - -We are discussing to replace our old unmaintained (and unmaintainable) [SPIP](http://spip.net) website with a wiki. This is why I would like using ikiwiki ;) diff --git a/doc/w3mmode.mdwn b/doc/w3mmode.mdwn index 3afee5c9b..04e37ba04 100644 --- a/doc/w3mmode.mdwn +++ b/doc/w3mmode.mdwn @@ -1,5 +1,5 @@ It's possible to use all of ikiwiki's web features (page editing, etc) in -the `w3m` web browser without using a web server. `w3m` supports local CGI +the [`w3m`](http://w3m.sourceforge.net/) web browser without using a web server. `w3m` supports local CGI scripts, and ikiwiki can be set up to run that way. This requires some special configuration: diff --git a/doc/wikiicons/revert.png b/doc/wikiicons/revert.png Binary files differnew file mode 100644 index 000000000..c39e65c33 --- /dev/null +++ b/doc/wikiicons/revert.png diff --git a/doc/wikitemplates.mdwn b/doc/wikitemplates.mdwn deleted file mode 100644 index 6c0480cea..000000000 --- a/doc/wikitemplates.mdwn +++ /dev/null @@ -1,50 +0,0 @@ -ikiwiki uses the HTML::Template module as its template engine. This -supports things like conditionals and loops in templates and is pretty easy -to learn. - -The aim is to keep almost all html out of ikiwiki and in the templates. 
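For illustration, a tiny self-contained example of the conditionals and loops that HTML::Template supports (plain HTML::Template usage, not code taken from ikiwiki or its shipped templates):

    use strict;
    use warnings;
    use HTML::Template;

    # A conditional heading plus a loop over a list of pages.
    my $tmpl = <<'EOT';
    <TMPL_IF NAME=TITLE><h1><TMPL_VAR NAME=TITLE></h1></TMPL_IF>
    <ul>
    <TMPL_LOOP NAME=PAGES><li><TMPL_VAR NAME=PAGE></li>
    </TMPL_LOOP></ul>
    EOT

    my $template = HTML::Template->new(scalarref => \$tmpl);
    $template->param(
        TITLE => 'Recent pages',
        PAGES => [ map +{ PAGE => $_ }, qw(index sandbox news) ],
    );
    print $template->output;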
-
-It ships with some basic templates which can be customised. These are
-located in /usr/share/ikiwiki/templates by default.
-
-* `page.tmpl` - Used for displaying all regular wiki pages.
-* `misc.tmpl` - Generic template used for any page that doesn't
-  have a custom template.
-* `editpage.tmpl` - Create/edit page.
-* `change.tmpl` - Used to create a page describing a change made to the wiki.
-* `passwordmail.tmpl` - Not an html template; this is used to
-  generate a mail with a url the user can use to reset their password.
-* `rsspage.tmpl` - Used for generating rss feeds for [[blogs|blog]].
-* `rssitem.tmpl` - Used for generating individual items on rss feeds.
-* `atompage.tmpl` - Used for generating atom feeds for blogs.
-* `atomitem.tmpl` - Used for generating individual items on atom feeds.
-* `inlinepage.tmpl` - Used for adding a page inline in a blog
-  page.
-* `archivepage.tmpl` - Used for listing a page in a blog archive page.
-* `microblog.tmpl` - Used for showing a microblogging post inline.
-* `blogpost.tmpl` - Used for a form to add a post to a blog (and rss/atom links).
-* `feedlink.tmpl` - Used to add rss/atom links if blogpost.tmpl is not used.
-* `aggregatepost.tmpl` - Used by the [[plugins/aggregate]] plugin to create
-  a page for a post.
-* `searchform.tmpl` - Used by the [[plugins/search]] plugin to add a search
-  form to wiki pages.
-* `searchquery.tmpl` - This is an omega template, used by the
-  [[plugins/search]] plugin.
-* `comment.tmpl` - This template is used to display a comment
-  by the [[plugins/comments]] plugin.
-* `editcomment.tmpl` - This template is the comment post form for the
-  [[plugins/comments]] plugin.
-* `commentmoderation.tmpl` - This template is used to produce the comment
-  moderation form.
-* `recentchanges.tmpl` - This template is used for listing a change
-  on the RecentChanges page.
-
-The [[plugins/pagetemplate]] plugin can allow individual pages to use a
-different template than `page.tmpl`.
-
-The [[plugins/template]] plugin also uses templates, though those
-[[templates]] are stored in the wiki and inserted into pages.
-
-The [[plugins/edittemplate]] plugin is used to make new pages default to
-containing text from a template, which can be filled out as the page is
-edited.
diff --git a/doc/wikitemplates/discussion.mdwn b/doc/wikitemplates/discussion.mdwn
deleted file mode 100644
index f97444e5f..000000000
--- a/doc/wikitemplates/discussion.mdwn
+++ /dev/null
@@ -1,46 +0,0 @@
-## Place for local templates
-Where does one put any locally modified templates for an individual ikiwiki? --Ivan Z.
-
-> You can put them wherever you like; the `templatedir` controls
-> where ikiwiki looks for them. --[[Joey]]
-
-Thank you for your response! My question arose out of my intention to make
-custom templates for a wiki--specifically suited to the kind of content
-it will have--which means I would want to distribute them through
-git together with the other content of the wiki. So, in this case the
-separation of conceptually ONE thing (the content, the templates, and the
-config option which says to use these templates) into THREE separate
-files/repos (the main content repo, the repo with templates, and the config
-file) is not convenient: instead of distributing a single repo, I have to
-tell people to take three things if they want to replicate this wiki. How
-would you solve this inconvenience? Perhaps a default location for the
-templates *inside* the source repo would do? --Ivan Z.
-
-> I would avoid putting the templates in a subdirectory of the ikiwiki srcdir.
-> (I'd also avoid putting the ikiwiki setup file there.)
-> While it's safe to do either in some cases, there are configurations where
-> it's unsafe. For example, a malicious user could use attachment handling to
-> replace those files with their own, bad versions.
->
-> So, two ideas for where to put the templatedir and ikiwiki setup.
-
-> * The easiest option is to put your wiki content in a subdirectory
->   ("wiki", say) and point `srcdir` at that.
->   Then you can have another subdirectory for the wikitemplates,
->   and put the setup file at the top.
-> * Another option if using git would be to have a separate branch,
->   in the same git repository, that holds wikitemplates and the setup file.
->   Then you check out the repository once to make the `srcdir` available,
->   and have a second checkout, of the other branch, to make the other stuff
->   available.
->
-> Note that with either of these methods, you have to watch out if
-> giving others direct commit access to the repository. They could
-> still edit the setup file and templates, so only trusted users should
-> be given access. (It is, however, perfectly safe to let people edit
-> the wiki via the web, and it is even safe to configure
-> [[tips/untrusted_git_push]] to such a repository.) --[[Joey]]
-
-Thanks, that's a nice and simple idea: to have a subdirectory! I'll try it. --Ivan Z.
-
-A [[!taglink wish|wishlist]]: the ikiwiki program could be improved so that it follows the same logic as git in looking for its config: it could ascend directories until it finds an `.ikiwiki/` directory with `.ikiwiki/setup`, and then use that configuration. Now I'm tired of always typing `ikiwiki --setup path/to/the/setup --refresh` when working in my working clone of the sources; I'd like to simply type `ikiwiki` instead, and let it find the setup file. The default location to look for templates could also be made a sibling of the setup file: `.ikiwiki/templates/`. --Ivan Z.
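Such an upward search is not something ikiwiki currently does, but a rough
sketch of the wished-for behaviour might look like the following. The
`find_setup` helper, and the assumption that the configuration lives in a
`.ikiwiki/setup` file somewhere above the working directory, are this
wishlist item's conventions rather than anything ikiwiki provides:

    #!/usr/bin/perl
    # Sketch: walk up from the current directory until a .ikiwiki/setup file
    # is found, the way git searches upward for its .git directory, then run
    # ikiwiki with it.
    use strict;
    use warnings;
    use Cwd qw(getcwd abs_path);
    use File::Spec;

    sub find_setup {
        my $dir = getcwd();
        while (1) {
            my $candidate = File::Spec->catfile($dir, ".ikiwiki", "setup");
            return $candidate if -f $candidate;
            my $parent = abs_path(File::Spec->catdir($dir, File::Spec->updir()));
            last if !defined $parent || $parent eq $dir;  # reached the filesystem root
            $dir = $parent;
        }
        return;
    }

    my $setup = find_setup();
    defined $setup or die "no .ikiwiki/setup found above " . getcwd() . "\n";
    exec "ikiwiki", "--setup", $setup, "--refresh";

Run from anywhere inside a checkout, this would behave much like git does
when invoked from a subdirectory.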