author     Amitai Schlair <schmonz@magnetic-babysitter.(none)>   2009-08-30 03:02:15 -0400
committer  Amitai Schlair <schmonz@magnetic-babysitter.(none)>   2009-08-30 03:02:15 -0400
commit     c36d2fa896e9ea42c0b6b0135ba04b4f4f60950f (patch)
tree       22f1314d8e974c73bde4970c97d497628f2a1465 /doc/todo
parent     5e94e973eeb4ba75d9c37bd801278f686f0977c3 (diff)
parent     517432273b96fc9e6bad9b7667ef6d1b04c699ee (diff)
Merge branch 'master' of git://github.com/joeyh/ikiwiki
Diffstat (limited to 'doc/todo')
-rw-r--r--  doc/todo/Raw_view_link.mdwn                                          |   4
-rw-r--r--  doc/todo/Restrict_page_viewing.mdwn                                  |  36
-rw-r--r--  doc/todo/Suggested_location_should_be_subpage_if_siblings_exist.mdwn |   2
-rw-r--r--  doc/todo/l10n.mdwn                                                   |  61
-rw-r--r--  doc/todo/language_definition_for_the_meta_plugin.mdwn                |  21
-rw-r--r--  doc/todo/mirrorlist_with_per-mirror_usedirs_settings.mdwn            |  24
-rw-r--r--  doc/todo/optimize_simple_dependencies.mdwn                           |  97
-rw-r--r--  doc/todo/should_optimise_pagespecs.mdwn                              | 209
-rw-r--r--  doc/todo/tmplvars_plugin/discussion.mdwn                             |   1
-rw-r--r--  doc/todo/tracking_bugs_with_dependencies.mdwn                        |   4
10 files changed, 408 insertions(+), 51 deletions(-)
diff --git a/doc/todo/Raw_view_link.mdwn b/doc/todo/Raw_view_link.mdwn
index fd64074c2..b62a9022b 100644
--- a/doc/todo/Raw_view_link.mdwn
+++ b/doc/todo/Raw_view_link.mdwn
@@ -4,7 +4,7 @@ The configuration setting for Mercurial could be something like this:
rawurl => "http://localhost:8000//raw-file/tip/\[[file]]",
-> What I want to do when I want to see if the raw source is either
+> What I do when I want to see the raw source is either
> click on the edit link, or click on history and navigate to it in the
> history browser (easier to do in viewvc than in gitweb, IIRC).
> Not that I'm opposed to the idea of a plugin that adds a Raw link
@@ -14,4 +14,6 @@ The configuration setting for Mercurial could be something like this:
>> to gitweb/etc. I think Will's patch is a good approach, and have improved
>> on it a bit in a git branch.
+>>> Since that is merged in now, I'm marking this [[done]] --[[Joey]]
+
[[!tag wishlist]]
diff --git a/doc/todo/Restrict_page_viewing.mdwn b/doc/todo/Restrict_page_viewing.mdwn
new file mode 100644
index 000000000..089d27fff
--- /dev/null
+++ b/doc/todo/Restrict_page_viewing.mdwn
@@ -0,0 +1,36 @@
+I'd like some pages of my wiki to be viewable only by some users.
+
+I could use htaccess for that, but it would force the users to have
+two authentication mechanisms, so I'd prefer to use openID for that too.
+
+* I'm thinking of adding a "show" parameter to the cgi script, via
+  a plugin similar to goto.
+* When called, it would check the user's credentials using the session
+  machinery (which I don't understand yet).
+* If they aren't sufficient, it would of course serve a 403 error.
+* If they are, it would read the rendered file locally on the server
+  side and return that as the content.
+
+Then I'd generate the private pages the regular way with ikiwiki,
+and use an appropriate, much more maintainable, htaccess file to
+prevent Apache from serving them directly.
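+
+A very rough sketch of what such a goto-like plugin might look like (purely
+an illustration: the `sessioncgi` hook choice, the `show` parameter handling,
+and the login-only access check are guesses, not an existing ikiwiki plugin):
+
+    #!/usr/bin/perl
+    # hypothetical plugin sketch, not part of ikiwiki
+    package IkiWiki::Plugin::showprivate;
+
+    use warnings;
+    use strict;
+    use IkiWiki 3.00;
+
+    sub import {
+        hook(type => "sessioncgi", id => "showprivate", call => \&sessioncgi);
+    }
+
+    sub sessioncgi ($$) {
+        my ($cgi, $session) = @_;
+        my $page = $cgi->param('show');
+        return unless defined $page;
+        # a real plugin would also need to untaint/validate $page here
+        if (! defined $session->param("name")) {
+            # no authenticated (e.g. openID) user: refuse
+            print $cgi->header(-status => '403 Forbidden');
+        }
+        else {
+            # serve the page that was compiled the regular way
+            print $cgi->header(-type => 'text/html');
+            print readfile($config{destdir}."/".htmlpage($page));
+        }
+        exit;
+    }
+
+    1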
+
+-- [[users/emptty]]
+
+> While I'm sure a plugin could do this, it adds so much scalability cost
+> and is so counter to ikiwiki's design.. Have you considered using the
+> [[plugins/httpauth]] plugin to unify around htaccess auth? --[[Joey]]
+
+>> I'm not speaking of rendering the pages on demand, but of serving them on demand.
+>> They would still be compiled the regular way;
+>> I'll have another look at [[plugins/httpauth]], but I really like the whole openID idea.
+>> --[[emptty]]
+
+>>> How about
+>>> [mod_auth_openid](http://trac.butterfat.net/public/mod_auth_openid), then?
+>>> A plugin for ikiwiki to serve its own pages is far afield from ikiwiki's roots,
+>>> as Joey pointed out, but might be a neat option to have anyway -- for unifying
+>>> authentication across views and edits, for systems not otherwise running
+>>> web servers, for systems with web servers you don't have access to, and
+>>> doubtless for other purposes. Such a plugin would add quite a bit of flexibility,
+>>> and in that sense (IMO, of course) it'd be in the spirit of ikiwiki. --[[schmonz]]
diff --git a/doc/todo/Suggested_location_should_be_subpage_if_siblings_exist.mdwn b/doc/todo/Suggested_location_should_be_subpage_if_siblings_exist.mdwn
index 0fb14bafe..4489dd5d2 100644
--- a/doc/todo/Suggested_location_should_be_subpage_if_siblings_exist.mdwn
+++ b/doc/todo/Suggested_location_should_be_subpage_if_siblings_exist.mdwn
@@ -23,4 +23,4 @@ that we're at the root of a (sub-)hierarchy.
>> This sounds like WONTFIX to me? --[[smcv]]
-[[!tag wishlist]]
+[[!tag wishlist done]]
diff --git a/doc/todo/l10n.mdwn b/doc/todo/l10n.mdwn
index bba103494..fed3f63b6 100644
--- a/doc/todo/l10n.mdwn
+++ b/doc/todo/l10n.mdwn
@@ -1,3 +1,19 @@
+ikiwiki should be fully internationalized.
+
+----
+
+As to the hardcoded strings in ikiwiki, I've internationalized the program,
+and there is a po/ikiwiki.pot in the source that can be translated.
+--[[Joey]]
+
+----
+
+> The now-merged po plugin handles l10n of wiki pages. The only missing
+> piece is l10n of the templates.
+> --[[Joey]]
+
+----
+
From [[Recai]]:
> Here is my initial work on ikiwiki l10n infrastructure (I'm sending it
> before finalizing, there may be errors).
@@ -54,48 +70,3 @@ kullanıcının ait IP adresi: <TMPL_VAR REMOTE_ADDR>[2] <TMPL_VAR WIKIURL>
This could be easily worked around in tmpl_process3, but I wouldn't like to
maintain a separate utility.
-----
-
-As to the hardcoded strings in ikiwiki, I've internationalized the program,
-and there is a po/ikiwiki.pot in the source that can be translated.
---[[Joey]]
-
-----
-
-Danish l10n of templates and basewiki is available with the following commands:
-
- git clone http://source.jones.dk/ikiwiki.git newsite
- cd newsite
- make
-
-Updates are retrieved with this single command:
-
- make
-
-l10n is maintained using po4a for basewiki, smiley and templates - please send me PO files if you
-translate to other languagess than the few I know about myself: <dr@jones.dk>
-
-As upstream ikiwiki is now maintained in GIT too, keeping the master mirror in sync with upstream
-could probably be automated even more - but an obstacle seems to be that content is not maintained
-separately but as an integral part of upstream source (GIT seems to not support subscribing to
-only parts of a repository).
-
-For example use, here's how to roll out a clone of the [Redpill support site](http://support.redpill.dk/):
-
- mkdir -p ~/public_cgi/support.redpill.dk
- git clone git://source.jones.dk/bin
- bin/localikiwikicreatesite -o git://source.redpill.dk/support rpdemo
-
-(Redpill support is inspired by <http://help.riseup.net> but needs to be reusable for several similarly configured networks)
-
---[[JonasSmedegaard]]
-
-> I don't understand at all why you're using git the way you are.
->
-> I think that this needs to be reworked into a patch against the regular
-> ikiwiki tree, that adds the po4a stuff needed to generate the pot files for the
-> basewiki and template content, as well as the stuff that generates the
-> translated versions of those from the po files.
->
-> The now merged po plugin also handles some of this.
-> --[[Joey]]
diff --git a/doc/todo/language_definition_for_the_meta_plugin.mdwn b/doc/todo/language_definition_for_the_meta_plugin.mdwn
index 90bfbef3b..44fcf32bb 100644
--- a/doc/todo/language_definition_for_the_meta_plugin.mdwn
+++ b/doc/todo/language_definition_for_the_meta_plugin.mdwn
@@ -95,7 +95,24 @@ This may be useful for sites with a few pages in different languages, but no ful
>>> Now that po has been merged, this patch should probably also be adapted
>>> so that the po plugin forces the meta::lang of every page to what po
->>> thinks it should be. Perhaps [[the_special_po_pagespecs|ikiwiki/pagespec/po]]
->>> should also work with meta-assigned languages? --[[smcv]]
+>>> thinks it should be. --[[smcv]]
+
+>>>> Agreed, users of the po plugin would greatly benefit from it.
+>>>> Seems doable. --[[intrigeri]]
+
+>>> Perhaps [[the_special_po_pagespecs|ikiwiki/pagespec/po]] should
+>>> also work with meta-assigned languages? --[[smcv]]
+
+>>>> Yes. But then, these special pagespecs should be moved outside of
+>>>> [[plugins/po]], as they could be useful to anyone using the
+>>>> currently discussed patch even when not using the po plugin.
+>>>>
+>>>> We could add these pagespecs to the core and make them use
+>>>> a simple language-guessing system based on a new hook. Any plugin
+>>>> that implements such a hook could decide whether it should
+>>>> override the language guessed by another one, and optionally use
+>>>> the `first`/`last` options (e.g. the po plugin will want to be
+>>>> authoritative on the pages of type po, and will then use
+>>>> `last`). --[[intrigeri]]
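+
+To make the hook idea above concrete, registration could look something like
+this (the `pagelang` hook name and the `po_to_lang` helper are purely
+hypothetical; `first`/`last` are the existing hook options mentioned above):
+
+    # in a language-aware plugin such as po:
+    hook(type => "pagelang", id => "po", last => 1, call => sub {
+        my $page=shift;
+        my $type=pagetype($pagesources{$page});
+        return undef unless defined $type && $type eq 'po';
+        return po_to_lang($page);   # hypothetical helper
+    });
+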
[[!tag wishlist patch plugins/meta translation]]
diff --git a/doc/todo/mirrorlist_with_per-mirror_usedirs_settings.mdwn b/doc/todo/mirrorlist_with_per-mirror_usedirs_settings.mdwn
new file mode 100644
index 000000000..6ca9962ba
--- /dev/null
+++ b/doc/todo/mirrorlist_with_per-mirror_usedirs_settings.mdwn
@@ -0,0 +1,24 @@
+I've got a wiki that is built in two places:
+
+* a static copy, aimed at being viewed without any web server, using
+ a web browser's `file:///` urls => usedirs is disabled to get nice
+ and working links
+* an online copy, with usedirs enabled in order to benefit from the
+ language negotiation using the po plugin
+
+I need to use mirrorlist on the static copy, so that one can easily
+reach the online, possibly updated, pages. But as documented, "pages are
+assumed to exist in the same location under the specified url on each
+mirror", so the generated urls are wrong.
+
+My `mirrorlist` branch contains a patch that allows one to configure usedirs
+per-mirror. Note: the old configuration format is still supported, so this should
+not break existing wikis.
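+
+For illustration, the kind of setup-file change this enables might look
+something like this (the key names for the new format are only indicative;
+see the branch for the real ones):
+
+    # old format, still supported: mirror name => base url
+    mirrorlist => {
+        online => 'http://example.com/wiki',
+    },
+
+    # per-mirror settings, e.g. to mark a mirror built with usedirs enabled
+    mirrorlist => {
+        online => {
+            url => 'http://example.com/wiki',
+            usedirs => 1,
+        },
+    },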
+
+OT: as a bonus, this branch contains a patch to support {hashes,arrays} of
+{hashes,arrays} in `$config`, something I missed a bit when writing the po
+plugin; this time I decided it was really needed to implement this feature.
+
+--[[intrigeri]]
+
+[[!tag patch]]
diff --git a/doc/todo/optimize_simple_dependencies.mdwn b/doc/todo/optimize_simple_dependencies.mdwn
new file mode 100644
index 000000000..91e184c29
--- /dev/null
+++ b/doc/todo/optimize_simple_dependencies.mdwn
@@ -0,0 +1,97 @@
+[[!template id=gitbranch branch=smcv/ready/depends-exact author="[[smcv]]"]]
+
+I'm still trying to optimize ikiwiki for a site using
+[[plugins/contrib/album]], and checking which pages depend on which pages
+is still taking too long. Here's another go at fixing that, using [[Will]]'s
+suggestion from [[todo/should_optimise_pagespecs]]:
+
+> A hash, by itself, is not optimal because
+> the dependency list holds two things: page names and page specs. The hash would
+> work well for the page names, but you'll still need to iterate through the page specs.
+> I was thinking of keeping a list and a hash. You use the list for pagespecs
+> and the hash for individual page names. To make this work you need to adjust the
+> API so it knows which you're adding. -- [[Will]]
+
+If you have P pages and refresh after changing C of them, where an average
+page has E dependencies on exact page names and D other dependencies, this
+branch should drop the complexity of checking dependencies from
+O(P * (D+E) * C) to O(C + P*E + P*D*C). Pages that use inline or map have
+a large value for E (e.g. one per inlined page) and a small value for D (e.g.
+one per inline).
+
+Benchmarking:
+
+Test 1: a wiki with about 3500 pages and 3500 photos, and a change that
+touches about 350 pages and 350 photos
+
+Test 2: the docwiki (about 700 objects not excluded by docwiki.setup, mostly
+pages), docwiki.setup modified to turn off verbose, and a change that touches
+the 98 pages of plugins/*.mdwn
+
+In both tests I rebuilt the wiki with the target ikiwiki version, then touched
+the appropriate pages and refreshed.
+
+Results of test 1: without this branch it took around 5:45 to rebuild and
+around 5:45 again to refresh (so rebuilding 10% of the pages, then deciding
+that most of the remaining 90% didn't need to change, took about as long as
+rebuilding everything). With this branch it took 5:47 to rebuild and 1:16
+to refresh.
+
+Results of test 2: rebuilding took 14.11s without, 13.96s with; refreshing
+three times took 7.29/7.40/7.37s without, 6.62/6.56/6.63s with.
+
+(This benchmarking was actually done with my [[plugins/contrib/album]] branch,
+since that's what the huge wiki needs; that branch doesn't alter core code
+beyond the ready/depends-exact branch point, so the results should be
+equally valid.)
+
+--[[smcv]]
+
+> Now [[merged|done]] --[[smcv]]
+
+----
+
+> We discussed this on irc; I had some worries that things may have been
+> switched to `add_depends_exact` that were not pure page names. My current
+> feeling is it's all safe, but who knows. It's easy to miss something.
+> Which makes me think this is not a good interface.
+>
+> Why not, instead, make `add_depends` smart. If it's passed something
+> that is clearly a raw page name, it can add it to the exact depends hash.
+> Else, add it to the pagespec hash. You can tell if it's a pure page name
+> by matching on `$config{wiki_file_regexp}`.
+
+>> Good thinking. Done in commit 68ce514a 'Auto-detect "simple dependencies"',
+>> with a related bugfix in e8b43825 "Force %depends_exact to lower case".
+>>
+>> Performance impact: Test 2 above takes 0.2s longer to rebuild (probably
+>> from all the calls to lc, which are, however, necessary for correctness)
+>> and has indistinguishable performance for a refresh.
+>>
+>> Test 1 took about 6 minutes to rebuild and 1:25 to refresh; those are
+>> pessimistic figures, since I've added 90 more photos and 90 more pages
+>> (both to the wiki as a whole, and the number touched before refreshing)
+>> since testing the previous version of this branch. --[[smcv]]
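+
+(For concreteness, the auto-detection suggested above amounts to something
+like the following sketch. It is only an illustration: the merged commit
+68ce514a differs in details, and `%depends_exact` follows the naming used
+in this discussion.)
+
+    sub add_depends ($$) {
+        my $page=shift;
+        my $pagespec=shift;
+        if ($pagespec =~ /$config{wiki_file_regexp}/) {
+            # looks like a plain page name: cheap exact-match hash
+            $depends_exact{$page}{lc $pagespec}=1;
+        }
+        else {
+            # a full pagespec: must be matched against changed pages later
+            $depends{$page}{$pagespec}=1;
+        }
+    }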
+
+> Also I think there may be little optimisation value left in
+> 7227c2debfeef94b35f7d81f42900aa01820caa3, since the "regular" dependency
+> lists will be much shorter.
+
+>> You're probably right, but IMO it's not worth reverting it - a set (hash with
+>> dummy values) is still the right data structure. --[[smcv]]
+
+> Sounds like inline pagenames has an already-extant bug WRT
+> pages moving, which this should not make worse. Would be good to verify.
+
+>> If you mean the standard "add a better match for a link-like construct" bug
+>> that also affects sidebar, then yes, it does have the bug, but I'm pretty
+>> sure this branch doesn't make it any worse. I could solve this at the cost
+>> of making pagenames less useful for interactive use, by making it not
+>> respect [[ikiwiki/subpage/LinkingRules]], but instead always interpret
+>> its paths as relative to the top of the wiki - that's actually all that
+>> [[plugins/contrib/album]] needs. --[[smcv]]
+
+> Re coding, it would be nice if `refresh()` could avoid duplicating
+> the debug message, etc in the two cases. --[[Joey]]
+
+>> Fixed in commit f805d566 "Avoid duplicating debug message..." --[[smcv]]
diff --git a/doc/todo/should_optimise_pagespecs.mdwn b/doc/todo/should_optimise_pagespecs.mdwn
index 3ccef62fe..4b4e267f0 100644
--- a/doc/todo/should_optimise_pagespecs.mdwn
+++ b/doc/todo/should_optimise_pagespecs.mdwn
@@ -90,6 +90,8 @@ I can think about reducung the size of my wiki source and making it available on
>> rather than a single pagespec. This does turn out to be faster, although
>> not as much as I'd like. --[[smcv]]
+>>> [[Merged|done]] --[[smcv]]
+
>>> I just wanted to note that there is a whole long discussion of dependencies and pagespecs on the [[todo/tracking_bugs_with_dependencies]] page. -- [[Will]]
>>>> Yeah, I had a look at that (as the only other mention of `pagespec_merge`).
@@ -105,4 +107,211 @@ I can think about reducung the size of my wiki source and making it available on
>>>>> ikiwiki-transition, but that copy doesn't have to be optimal or support
>>>>> future features like [[tracking_bugs_with_dependencies]]). --[[smcv]]
+---
+
+Some questions on your optimize-depends branch. --[[Joey]]
+
+In saveindex it still or'd together the depends list, but the `{depends}`
+field seems only useful for backwards compatibility (ie, ikiwiki-transition
+uses it still), and otherwise just bloats the index.
+
+> If it's acceptable to declare that downgrading IkiWiki requires a complete
+> rebuild, I'm happy with that. I'd prefer to keep the (simple form of the)
+> transition done automatically during a load/save cycle, rather than
+> requiring ikiwiki-transition to be run; we should probably say in NEWS
+> that the performance increase won't fully apply until the next
+> rebuild. --[[smcv]]
+
+>> It is acceptable not to support downgrades.
+>> I don't think we need a NEWS file update since any sort of refresh,
+>> not just a full rebuild, will cause the indexdb to be loaded and saved,
+>> enabling the optimisation. --[[Joey]]
+
+>>> A refresh will load the current dependencies from `{depends}` and save
+>>> them as-is as a one-element `{dependslist}`; only a rebuild will replace
+>>> the single complex pagespec with a long list of simpler pagespecs.
+>>> --[[smcv]]
+
+Is an array the right data structure? `add_depends` has to loop through the
+array to avoid dups, it would be better if a hash were used there. Since
+inline (and other plugins) explicitly add all linked pages, each as a
+separate item, the list can get rather long, and that single add_depends
+loop has suddenly become O(N^2) in the number of pages, which is something
+to avoid..
+
+> I was also thinking about this (I've been playing with some stuff based on the
+> `remove-pagespec-merge` branch). A hash, by itself, is not optimal because
+> the dependency list holds two things: page names and page specs. The hash would
+> work well for the page names, but you'll still need to iterate through the page specs.
+> I was thinking of keeping a list and a hash. You use the list for pagespecs
+> and the hash for individual page names. To make this work you need to adjust the
+> API so it knows which you're adding. -- [[Will]]
+
+> I wasn't thinking about a lookup hash, just a dedup hash, FWIW.
+> --[[Joey]]
+
+>> I was under the impression from previous code review that you preferred
+>> to represent unordered sets as lists, rather than hashes with dummy
+>> values. If I was wrong, great, I'll fix that and it'll probably go
+>> a bit faster. --[[smcv]]
+
+>>> It depends, really. And it'd certainly make sense to benchmark such a
+>>> change. --[[Joey]]
+
+>>>> Benchmarked, below. --[[smcv]]
+
+Also, since a lot of places are calling add_depends in a loop, it probably
+makes sense to just make it accept a list of dependencies to add. It'll be
+marginally faster, probably, and should allow for better optimisation
+when adding a lot of depends at once.
+
+> That'd be an API change; perhaps marginally faster, but I don't
+> see how it would allow better optimisation if we're de-duplicating
+> anyway? --[[smcv]]
+
+>> Well, I was thinking that it might be sufficient to build a `%seen`
+>> hash of dependencies inside `add_depends`, if the places that call
+>> it lots were changed to just call it once. Of course the only way to
+>> tell is benchmarking. --[[Joey]]
+
+>>> It doesn't seem that it significantly affects performance either way.
+>>> --[[smcv]]
+
+In Render.pm, we now have a triply nested loop, which is a bit
+scary for efficiency. It seems there should be a way to
+rework this code so it can use the optimised `pagespec_match_list`,
+and/or hoist some of the inner loop calculations (like the `pagename`)
+out.
+
+> I don't think the complexity is any greater than it was: I've just
+> moved one level of "loop" out of the generated Perl, to be
+> in visible code. I'll see whether some of it can be hoisted, though.
+> --[[smcv]]
+
+>> The call to `pagename` is the only part I can see that's clearly
+>> run more often than before. That function is pretty inexpensive, but..
+>> --[[Joey]]
+
+>>> I don't see anything that can be hoisted without significant refactoring,
+>>> actually. Beware that there are two pagename calls in the loop: one for
+>>> `$f` (which is the page we might want to rebuild), and one for `$file`
+>>> (which is the changed page that it might depend on). Note that I didn't
+>>> choose those names!
+>>>
+>>> The three loops are over source files, their lists of dependency pagespecs,
+>>> and files that might have changed. I see the following things we might be
+>>> doing redundantly:
+>>>
+>>> * If `$file` is considered as a potential dependency for more than
+>>> one `$f`, we evaluate `pagename($file)` more than once. Potential fix:
+>>> cache them (this turns out to save about half a second on the docwiki,
+>>> see below).
+>>> * If several pages depend on the same pagespec, we evaluate whether each
+>>> changed page matches that pagespec more than once: however, we do so
+>>> with a different location parameter every time, so repeated calls are,
+>>> in the general case, the only correct thing to do. Potential fix:
+>>> perhaps special-case "page x depends on page y and nothing else"
+>>> (i.e. globs that have no wildcards) into a separate hash? I haven't
+>>> done anything in this direction.
+>>> * Any preparatory work done by pagespec_match (converting the pagespec
+>>> into Perl, mostly?) is done in the inner loop; switching to
+>>> pagespec_match_list (significant refactoring) saves more than half a
+>>> second on the docwiki.
+>>>
+>>> --[[smcv]]
+
+Very good catch on img/meta using the wrong dependency; verified in the wild!
+(I've cherry-picked those bug fixes.)
+
+----
+
+Benchmarking results: I benchmarked by altering docwiki.setup to switch off
+verbose, running "make clean && ./Makefile.PL && make", and timing one rebuild
+of the docwiki followed by three refreshes. Before each refresh I used
+`touch plugins/*.mdwn` to have something significant to refresh.
+
+I'm assuming that "user" CPU time is the important thing here (system time was
+relatively small in all cases, up to 0.35 seconds per run).
+
+master at the time of rebasing: 14.20s to rebuild, 10.04/12.07/14.01s to
+refresh. I think you can see the bug clearly here - the pagespecs are getting
+more complicated every time!
+
+> I can totally see a bug here, and it's one I didn't think existed. Ie,
+> I thought that after the first refresh, the pagespec should stabilize,
+> and what it stabilized to was probably unnecessarily long, but not
+> growing w/o bounds!
+>
+> a) Explains why ikiwiki.info has been so slow lately. Well that and some
+> other things that overloaded the system.
+> b) Suggests to me we will probably want to force a rebuild on upgrade
+> when fixing this (via the mechanism in the postinst).
+>
+> I've investigated why the pagespecs keep growing: When page A changes,
+> its old depends are cleared. Then
+> page B that inlines A gets rebuilt, and its old depends are also cleared.
+> But page B also inlines page C, which means C gets re-rendered. And this
+> happens w/o its old depends being cleared, so C's depends are doubled.
+> --[[Joey]]
+
+After the initial optimization: 14.27s to rebuild, 8.26/8.33/8.26 to refresh.
+Success!
+
+Not pre-joining dependencies actually took about 0.2s more; I don't know why.
+I'm worried that duplicates will just build up (again) in less simple cases,
+though, so 0.2s is probably a small price to pay for that not happening (it
+might well be experimental error, for that matter).
+
+> It's weird that the suggested optimisations to
+> `add_depends` had no effect. So, the commit message to
+> b6fcb1cb0ef27e5a63184440675d465fad652acf is actually wrong.. ? --[[Joey]]
+
+>> I'll try benchmarking again on the non-public wiki where I had the 4%
+>> speedup. The docwiki is so small that 4% is hard to measure... --[[smcv]]
+
+Not saving {depends} to the index, using a hash instead of a list to
+de-duplicate, and allowing add_depends to take an arrayref instead of a single
+pagespec had no noticeable positive or negative effect on this test.
+
+> I see e4cd168ebedd95585290c97ff42234344bfed46c is still in your branch
+> though. I don't like using an arrayref; it could just take `($page, @depends)`,
+> and I don't see the need to keep it if it doesn't currently help.
+
+>> I'll drop it. --[[smcv]]
+
+> Is there any reason to keep 7227c2debfeef94b35f7d81f42900aa01820caa3
+> if it doesn't improve speed?
+> --[[Joey]]
+
+>> I'll try benchmarking on a more complex wiki and see whether it has a
+>> positive or negative effect. It does avoid being O(n**2) in number of
+>> dependencies. --[[smcv]]
+
+Memoizing the results of pagename brought the rebuild time down to 14.06s
+and the refresh time down to 7.96/7.92/7.92, a significant win.
+
+> Ok, that seems safe to memoize. (It's a real function and it isn't
+> called with a great many inputs.) Why did you choose to memoize it
+> explicitly rather than adding it to the memoize list at the top?
+
+>> It does depend on global variables, so using Memoize seemed like asking for
+>> trouble. I suppose what I did is equivalent to Memoize though... --[[smcv]]
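+
+(Hand-rolling it amounts to something like the following; the names here are
+illustrative, not the actual code:)
+
+    my %pagename_cache;
+    sub cached_pagename ($) {
+        my $file=shift;
+        $pagename_cache{$file}=pagename($file)
+            unless exists $pagename_cache{$file};
+        return $pagename_cache{$file};
+    }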
+
+Refactoring to use pagespec_match_list looks more risky from a code churn
+point of view; rebuild now takes 14.35s, but refresh is only 7.30/7.29/7.28,
+another significant win.
+
+--[[smcv]]
+
+> I had mostly convinced myself that
+> `pagespec_match_list` would not lead to a speed gain here. My reasoning
+> was that you want to stop after finding one match, while `pagespec_match_list`
+> checks all pages for matches. So what we're seeing is that
+> on a rebuild, `@changed` is all pages, and not short-circuiting leads
+> to unnecessary work. OTOH, on refresh, `@changed` is small and I suppose
+> `pagespec_match_list`'s other slight efficiencies win out somehow.
+>
+> Welcome to the "I made ikiwiki twice as fast
+> and all I got was this lousy git sha1sum" club BTW :-) --[[Joey]]
+
[[!tag wishlist patch patch/core]]
diff --git a/doc/todo/tmplvars_plugin/discussion.mdwn b/doc/todo/tmplvars_plugin/discussion.mdwn
new file mode 100644
index 000000000..93cb9b414
--- /dev/null
+++ b/doc/todo/tmplvars_plugin/discussion.mdwn
@@ -0,0 +1 @@
+I find this plugin quite useful. One thing I would like to be able to do is set a tmplvar, e.g. in a sidebar, so that it affects all pages the sidebar is used on. --martin
diff --git a/doc/todo/tracking_bugs_with_dependencies.mdwn b/doc/todo/tracking_bugs_with_dependencies.mdwn
index a198530fc..bfdbf0875 100644
--- a/doc/todo/tracking_bugs_with_dependencies.mdwn
+++ b/doc/todo/tracking_bugs_with_dependencies.mdwn
@@ -410,8 +410,8 @@ account all comments above (which doesn't mean it is above reproach :) ). --[[W
>>>>> then the last definition (baz) takes precedence.
>>>>> In the process of writing this I think I've come up with a way to change this back the way it was, still using closures. -- [[Will]]
->>> Alternatively, my [[remove-pagespec-merge|should_optimise_pagespecs]]
->>> branch solves this, in a Gordian knot sort of way :-) --[[smcv]]
+>>> My [[remove-pagespec-merge|should_optimise_pagespecs]] branch has now
+>>> solved all this by deleting the offending function :-) --[[smcv]]
>> Secondly, it seems that there are two types of dependency, and ikiwiki
>> currently only handles one of them. The first type is "Rebuild this