Diffstat (limited to 'doc/tips')
-rw-r--r--  doc/tips/Importing_posts_from_Wordpress.mdwn | 91
-rw-r--r--  doc/tips/add_chatterbox_to_blog.mdwn | 3
-rw-r--r--  doc/tips/comments_feed.mdwn | 11
-rw-r--r--  doc/tips/convert_mediawiki_to_ikiwiki.mdwn | 163
-rw-r--r--  doc/tips/convert_mediawiki_to_ikiwiki/discussion.mdwn | 55
-rw-r--r--  doc/tips/dot_cgi.mdwn | 2
-rw-r--r--  doc/tips/dot_cgi/discussion.mdwn | 10
-rw-r--r--  doc/tips/follow_wikilinks_from_inside_vim.mdwn | 47
-rw-r--r--  doc/tips/github.mdwn | 2
-rw-r--r--  doc/tips/howto_limit_to_admin_users.mdwn | 9
-rw-r--r--  doc/tips/htaccess_file.mdwn | 27
-rw-r--r--  doc/tips/html5.mdwn | 27
-rw-r--r--  doc/tips/ikiwiki_on_Mac_OS_X_Snow_Leopard.mdwn | 184
-rw-r--r--  doc/tips/ikiwiki_on_Mac_OS_X_Snow_Leopard/discussion.mdwn | 1
-rw-r--r--  doc/tips/inside_dot_ikiwiki.mdwn | 7
-rw-r--r--  doc/tips/inside_dot_ikiwiki/discussion.mdwn | 7
-rw-r--r--  doc/tips/integrated_issue_tracking_with_ikiwiki.mdwn | 2
-rw-r--r--  doc/tips/laptop_wiki_with_git.mdwn | 20
-rw-r--r--  doc/tips/nearlyfreespeech.mdwn | 3
-rw-r--r--  doc/tips/nearlyfreespeech/discussion.mdwn | 11
-rw-r--r--  doc/tips/optimising_ikiwiki.mdwn | 188
-rw-r--r--  doc/tips/parentlinks_style.mdwn | 21
-rw-r--r--  doc/tips/psgi.mdwn | 21
-rw-r--r--  doc/tips/spam_and_softwaresites.mdwn | 87
-rw-r--r--  doc/tips/spam_and_softwaresites/discussion.mdwn | 8
-rw-r--r--  doc/tips/switching_to_usedirs.mdwn | 4
-rw-r--r--  doc/tips/untrusted_git_push.mdwn | 10
-rw-r--r--  doc/tips/vim_and_ikiwiki.mdwn | 28
-rw-r--r--  doc/tips/vim_syntax_highlighting.mdwn | 22
-rw-r--r--  doc/tips/yaml_setup_files.mdwn | 12
30 files changed, 1051 insertions, 32 deletions
diff --git a/doc/tips/Importing_posts_from_Wordpress.mdwn b/doc/tips/Importing_posts_from_Wordpress.mdwn
index 59330caa4..1ea82b862 100644
--- a/doc/tips/Importing_posts_from_Wordpress.mdwn
+++ b/doc/tips/Importing_posts_from_Wordpress.mdwn
@@ -1,9 +1,13 @@
Use case: You want to move away from Wordpress to Ikiwiki as your blogging/website platform, but you want to retain your old posts.
-[This](http://git.chris-lamb.co.uk/?p=ikiwiki-wordpress-import.git) is a simple tool that generates [git-fast-import](http://www.kernel.org/pub/software/scm/git/docs/git-fast-import.html)-compatible data from a WordPress export XML file. It retains creation time of each post, so you can use Ikiwiki's <tt>--getctime</tt> to get the preserve creation times on checkout.
+[This](http://git.chris-lamb.co.uk/?p=ikiwiki-wordpress-import.git) is a simple tool that generates [git-fast-import](http://www.kernel.org/pub/software/scm/git/docs/git-fast-import.html)-compatible data from a WordPress export XML file.
WordPress categories are mapped onto Ikiwiki tags. The ability to import comments is planned.
+The script uses the [BeautifulSoup][] module.
+
+[BeautifulSoup]: http://www.crummy.com/software/BeautifulSoup/
+
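+A rough sketch of how such a tool is typically driven (the script name, its
+arguments, and whether it reads the export file from stdin are assumptions
+here -- check the script's own documentation for the real invocation):
+
+    mkdir blog && cd blog && git init
+    # hypothetical invocation; adjust to match the actual script
+    python ../ikiwiki-wordpress-import.py < ../wordpress-export.xml | git fast-import
+    git checkout -f master
+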
-----
I include a modified version of this script. This version includes the ability to write \[[!tag foo]] directives, which the original intended, but didn't actually do.
@@ -11,3 +15,88 @@ I include a modified version of this script. This version includes the ability t
-- [[users/simonraven]]
[[ikiwiki-wordpress-import]]
+
+-----
+
+Perhaps slightly insane, but here's an XSLT style sheet that handles my pages. It's basic, but sufficient to get started.
+Note that I had to break up the ikiwiki meta strings to post this.
+
+-- JasonRiedy
+
+ <?xml version="1.0" encoding="UTF-8"?>
+ <xsl:stylesheet version="2.0"
+ xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
+ xmlns:content="http://purl.org/rss/1.0/modules/content/"
+ xmlns:wp="http://wordpress.org/export/1.0/">
+
+ <xsl:output method="text"/>
+ <xsl:output method="text" name="txt"/>
+
+ <xsl:variable name='newline'><xsl:text>
+ </xsl:text></xsl:variable>
+
+ <xsl:template match="channel">
+ <xsl:apply-templates select="item[wp:post_type = 'post']"/>
+ </xsl:template>
+
+ <xsl:template match="item">
+ <xsl:variable name="idnum" select="format-number(wp:post_id,'0000')" />
+ <xsl:variable name="basename"
+ select="concat('wp-posts/post-',$idnum)" />
+ <xsl:variable name="filename"
+ select="concat($basename, '.html')" />
+ <xsl:text>Creating </xsl:text>
+ <xsl:value-of select="concat($filename, $newline)" />
+ <xsl:result-document href="{$filename}" format="txt">
+ <xsl:text>[[</xsl:text><xsl:text>meta title="</xsl:text>
+ <xsl:value-of select="replace(title, '&quot;', '&amp;ldquo;')"/>
+ <xsl:text>"]]</xsl:text><xsl:value-of select="$newline"/>
+ <xsl:text>[[</xsl:text><xsl:text>meta date="</xsl:text>
+ <xsl:value-of select="pubDate"/>
+ <xsl:text>"]]</xsl:text><xsl:value-of select="$newline"/>
+ <xsl:text>[[</xsl:text><xsl:text>meta updated="</xsl:text>
+ <xsl:value-of select="pubDate"/>
+ <xsl:text>"]]</xsl:text> <xsl:value-of select="$newline"/>
+ <xsl:value-of select="$newline"/>
+ <xsl:value-of select="content:encoded"/>
+ <xsl:text>
+
+ </xsl:text>
+ <xsl:apply-templates select="category[@domain='tag' and not(@nicename)]">
+ <xsl:sort select="name()"/>
+ </xsl:apply-templates>
+ </xsl:result-document>
+ <xsl:apply-templates select="wp:comment">
+ <xsl:sort select="date"/>
+    <xsl:with-param name="basename" select="$basename"/>
+ </xsl:apply-templates>
+ </xsl:template>
+
+ <xsl:template match="wp:comment">
+ <xsl:param name="basename"/>
+ <xsl:variable name="cnum" select="format-number(wp:comment_id, '000')" />
+ <xsl:variable name="filename" select="concat($basename, '/comment_', $cnum, '._comment')"/>
+ <xsl:variable name="nickname" select="concat(' nickname=&quot;', wp:comment_author, '&quot;')" />
+ <xsl:variable name="username" select="concat(' username=&quot;', wp:comment_author_url, '&quot;')" />
+ <xsl:variable name="ip" select="concat(' ip=&quot;', wp:comment_author_IP, '&quot;')" />
+ <xsl:variable name="date" select="concat(' date=&quot;', wp:comment_date_gmt, '&quot;')" />
+ <xsl:result-document href="{$filename}" format="txt">
+ <xsl:text>[[</xsl:text><xsl:text>comment format=html</xsl:text><xsl:value-of select="$newline"/>
+ <xsl:value-of select="$nickname"/>
+ <xsl:value-of select="$username"/>
+ <xsl:value-of select="$ip"/>
+ <xsl:value-of select="$date"/>
+ <xsl:text>subject=""</xsl:text><xsl:value-of select="$newline"/>
+ <xsl:text>content="""</xsl:text><xsl:value-of select="$newline"/>
+ <xsl:value-of select="wp:comment_content"/>
+ <xsl:value-of select="$newline"/>
+ <xsl:text>"""]]</xsl:text><xsl:value-of select="$newline"/>
+ </xsl:result-document>
+ </xsl:template>
+
+ <xsl:template match="category">
+ <xsl:text>[</xsl:text><xsl:text>[</xsl:text><xsl:text>!tag "</xsl:text><xsl:value-of select="."/><xsl:text>"]]</xsl:text>
+ <xsl:value-of select="$newline"/>
+ </xsl:template>
+
+ </xsl:stylesheet>
diff --git a/doc/tips/add_chatterbox_to_blog.mdwn b/doc/tips/add_chatterbox_to_blog.mdwn
index aa35b9331..e07e36b07 100644
--- a/doc/tips/add_chatterbox_to_blog.mdwn
+++ b/doc/tips/add_chatterbox_to_blog.mdwn
@@ -18,4 +18,7 @@ from there, like I have on [my blog](http://kitenet.net/~joey/blog/)
show=5 feeds=no]]
"""]]
+* To filter out `@-replies`, append "and !*@*" to the [[ikiwiki/PageSpec]].
+ The same technique can be used for other filtering.
+
Note: Works best with ikiwiki 3.10 or better.
diff --git a/doc/tips/comments_feed.mdwn b/doc/tips/comments_feed.mdwn
index 6f8137256..3d6a8c449 100644
--- a/doc/tips/comments_feed.mdwn
+++ b/doc/tips/comments_feed.mdwn
@@ -3,8 +3,15 @@ blog can have comments added to them. Pages with comments even have special
feeds that can be used to subscribe to those comments. But you'd like to
add a feed that contains all the comments posted to any page. Here's how:
- \[[!inline pages="internal(*/comment_*)" template=comment]]
+ \[[!inline pages="comment(*)" template=comment]]
The special [[ikiwiki/PageSpec]] matches all comments. The
-[[template|wikitemplates]] causes the comments to be displayed formatted
+[[template|templates]] causes the comments to be displayed formatted
nicely.
+
+---
+
+It's also possible to make a feed of comments that are held pending
+moderation.
+
+ \[[!inline pages="comment_pending(*)" template=comment]]
diff --git a/doc/tips/convert_mediawiki_to_ikiwiki.mdwn b/doc/tips/convert_mediawiki_to_ikiwiki.mdwn
index f03703b46..38de01109 100644
--- a/doc/tips/convert_mediawiki_to_ikiwiki.mdwn
+++ b/doc/tips/convert_mediawiki_to_ikiwiki.mdwn
@@ -1,4 +1,159 @@
-[[sabr]] explains how to [import MediaWiki content into
-git](http://u32.net/Mediawiki_Conversion/index.html?updated), including
-full edit hostory. The [[plugins/contrib/mediawiki]] plugin can then be
-used by ikiwiki to build the wiki.
+[[!toc levels=2]]
+
+Mediawiki is a dynamically-generated wiki which stores its data in a
+relational database. Pages are marked up using a proprietary markup. It is
+possible to import the contents of a Mediawiki site into an ikiwiki,
+converting some of the Mediawiki conventions into Ikiwiki ones.
+
+The following instructions describe ways of obtaining the current version of
+the wiki. We do not yet cover importing the history of edits.
+
+Another set of instructions and conversion tools (which imports the full history)
+can be found at <http://github.com/mithro/media2iki>
+
+## Step 1: Getting a list of pages
+
+The first bit of information you require is a list of pages in the Mediawiki.
+There are several different ways of obtaining these.
+
+### Parsing the output of `Special:Allpages`
+
+Mediawikis have a special page called `Special:Allpages` which lists all the
+pages for a given namespace on the wiki.
+
+If you fetch the output of this page to a local file with something like
+
+ wget -q -O tmpfile 'http://your-mediawiki/wiki/Special:Allpages'
+
+you can extract the list of page names using the following Python script. Note
+that this script is sensitive to the specific markup used on the page, so if
+you have tweaked your mediawiki theme a lot from the original, you will need
+to adjust this script too:
+
+    import sys
+    from xml.dom.minidom import parse, parseString
+
+    dom = parse(sys.argv[1])
+    tables = dom.getElementsByTagName("table")
+    pagetable = tables[-1]
+    anchors = pagetable.getElementsByTagName("a")
+    for a in anchors:
+        print a.firstChild.toxml().\
+            replace('&amp;','&').\
+            replace('&lt;','<').\
+            replace('&gt;','>')
+
+Also, if you have pages with titles that need to be encoded to be represented
+in HTML, you may need to add further processing to the last line.
+
+Note that by default, `Special:Allpages` will only list pages in the main
+namespace. You need to add a `&namespace=XX` argument to get pages in a
+different namespace. (See below for the default list of namespaces)
+
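+For example, to also grab the list of category pages (namespace 14, see the
+table below), something like this should work (a sketch; the query-string form
+may differ depending on your wiki's URL rewriting):
+
+    wget -q -O tmpfile-categories 'http://your-mediawiki/wiki/Special:Allpages?namespace=14'
+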
+Note that the page names obtained this way will not include any namespace
+specific prefix: e.g. `Category:` will be stripped off.
+
+### Querying the database
+
+If you have access to the relational database in which your mediawiki data is
+stored, it is possible to derive a list of page names from this. With mediawiki's
+MySQL backend, the page table is, appropriately enough, called `page`:
+
+ SELECT page_namespace, page_title FROM page;
+
+As with the previous method, you will need to do some filtering based on the
+namespace.
+
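+For example, to dump the titles of all pages in the main namespace to a file
+(a sketch -- the database name, user and any table prefix are assumptions):
+
+    mysql -u wikiuser -p wikidb --batch --skip-column-names \
+      -e 'SELECT page_title FROM page WHERE page_namespace = 0;' > titles.txt
+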
+### namespaces
+
+The list of default namespaces in mediawiki is available from <http://www.mediawiki.org/wiki/Manual:Namespace#Built-in_namespaces>. Here are reproduced the ones you are most likely to encounter if you are running a small mediawiki install for your own purposes:
+
+[[!table data="""
+Index | Name | Example
+0 | Main | Foo
+1 | Talk | Talk:Foo
+2 | User | User:Jon
+3 | User talk | User_talk:Jon
+6 | File | File:Barack_Obama_signature.svg
+10 | Template | Template:Prettytable
+14 | Category | Category:Pages_needing_review
+"""]]
+
+## Step 2: fetching the page data
+
+Once you have a list of page names, you can fetch the data for each page.
+
+### Method 1: via HTTP and `action=raw`
+
+You need to create two derived strings from the page titles: the
+destination path for the page and the source URL. Assuming `$pagename`
+contains a pagename obtained above, and `$wiki` contains the URL to your
+mediawiki's `index.php` file:
+
+    src=`echo "$pagename" | tr ' ' _ | sed 's,&,&amp;,g'`
+    dest=`echo "$pagename" | tr ' ' _ | sed 's,&,__38__,g'`
+
+    mkdir -p `dirname "$dest"`
+    wget -q "$wiki?title=$src&action=raw" -O "$dest"
+
+You may need to add more conversions here depending on the precise page titles
+used in your wiki.
+
+If you are trying to fetch pages from a different namespace to the default,
+you will need to prefix the page title with the relevant prefix, e.g.
+`Category:` for category pages. You probably don't want to prefix it to the
+output page, but you may want to vary the destination path (i.e. insert an
+extra directory component corresponding to your ikiwiki's `tagbase`).
+
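+Putting that together into a loop over a file of page titles (one per line, as
+produced in step 1) might look something like this sketch:
+
+    # URL of your mediawiki's index.php
+    wiki="http://your-mediawiki/w/index.php"
+    while read -r pagename; do
+        src=`echo "$pagename" | tr ' ' _ | sed 's,&,&amp;,g'`
+        dest=`echo "$pagename" | tr ' ' _ | sed 's,&,__38__,g'`
+        mkdir -p `dirname "$dest"`
+        wget -q "$wiki?title=$src&action=raw" -O "$dest"
+    done < titles.txt
+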
+### Method 2: via HTTP and `Special:Export`
+
+Mediawiki also has a special page `Special:Export` which can be used to obtain
+the source of the page and other metadata such as the last contributor, or the
+full history, etc.
+
+You need to send a `POST` request to the `Special:Export` page. See the source
+of the page fetched via `GET` to determine the correct arguments.
+
+You will then need to write an XML parser to extract the data you need from
+the result.
+
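+For example, with `curl` (a sketch; the `pages` and `curonly` field names are
+what the stock export form submits, but verify them against your wiki's form):
+
+    curl -s -d 'pages=Main_Page' -d 'curonly=1' \
+      'http://your-mediawiki/wiki/Special:Export' -o Main_Page.xml
+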
+### Method 3: via the database
+
+It is possible to extract the page data from the database with some
+well-crafted queries.
+
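+For instance, with the classic MySQL schema the current text of each page can
+be pulled out with a join along these lines (a sketch; table names may carry a
+prefix, and newer MediaWiki releases have changed this schema):
+
+    mysql -u wikiuser -p wikidb --batch -e '
+      SELECT page_title, old_text
+      FROM page
+      JOIN revision ON page_latest = rev_id
+      JOIN text ON rev_text_id = old_id
+      WHERE page_namespace = 0;'
+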
+## Step 3: format conversion
+
+The next step is to convert Mediawiki conventions into Ikiwiki ones.
+
+### categories
+
+Mediawiki uses a special page name prefix to define "Categories", which
+otherwise behave like ikiwiki tags. You can convert every Mediawiki category
+into an ikiwiki tag name using a script such as
+
+    import sys, re
+    pattern = r'\[\[Category:([^\]]+)\]\]'
+
+    def manglecat(mo):
+        return '\[[!tag %s]]' % mo.group(1).strip().replace(' ','_')
+
+    for line in sys.stdin.readlines():
+        res = re.match(pattern, line)
+        if res:
+            sys.stdout.write(re.sub(pattern, manglecat, line))
+        else: sys.stdout.write(line)
+
+## Step 4: Mediawiki plugin
+
+The [[plugins/contrib/mediawiki]] plugin can be used by ikiwiki to interpret
+most of the Mediawiki syntax.
+
+## External links
+
+[[sabr]] used to explain how to [import MediaWiki content into
+git](http://u32.net/Mediawiki_Conversion/index.html?updated), including full
+edit history, but as of 2009/10/16 that site is not available. A copy of the
+information found on this website is stored at <http://github.com/mithro/media2iki>
+
+
diff --git a/doc/tips/convert_mediawiki_to_ikiwiki/discussion.mdwn b/doc/tips/convert_mediawiki_to_ikiwiki/discussion.mdwn
index e2eb56d47..4a7163eae 100644
--- a/doc/tips/convert_mediawiki_to_ikiwiki/discussion.mdwn
+++ b/doc/tips/convert_mediawiki_to_ikiwiki/discussion.mdwn
@@ -1,3 +1,15 @@
+20100428 - I just wrote a simple ruby script which will connect to a mysql server and then recreate the pages and their revision histories with Grit. It also does one simple conversion of equals titles to pounds. Enjoy!
+
+<http://github.com/docunext/mediawiki2gitikiwiki>
+
+-- [[users/Albert]]
+
+----
+
+I wrote a script that will download all the latest revisions of a mediawiki site. In short, it does a good part of the stuff required for the migration: it downloads the goods (i.e. the latest version of every page, automatically) and commits the resulting structure. There are still a good few pieces missing for an actual complete conversion to ikiwiki, but it's a pretty good start. It only talks with mediawiki through HTTP, so no special access is necessary. The downside of that is that it will not attempt to download every revision, for performance reasons. The code is here: <http://anarcat.ath.cx/software/mediawikigitdump.git/>. See the header of the file for more details and todos. -- [[users/Anarcat]] 2010-10-15
+
+----
+
The u32 page is excellent, but I wonder if documenting the procedure here
would be worthwhile. Who knows, the remote site might disappear. But also
there are some variations on the approach that might be useful:
@@ -13,9 +25,31 @@ Also, some detail on converting mediawiki transclusion to ikiwiki inlines...
-- [[users/Jon]]
+----
+
> "Who knows, the remote site might disappear.". Right now, it appears to
> have done just that. -- [[users/Jon]]
+I have managed to recover most of the site using the Internet Archive. What
+I was unable to retrieve I have rewritten. You can find a copy of the code
+at <http://github.com/mithro/media2iki>
+
+> This is excellent news. However, I'm still keen on there being a
+> comprehensive and up-to-date set of instructions on *this* site. I wouldn't
+> suggest importing that material into ikiwiki like-for-like (not least for
+> [[licensing|freesoftware]] reasons), but it's excellent to have it available
+> for reference, especially since it (currently) is the only set of
+> instructions that gives you the whole history.
+>
+> The `mediawiki.pm` that was at u32.net is licensed GPL-2. I'd like to see it
+> cleaned up and added to IkiWiki proper (although I haven't requested this
+> yet, I suspect the way it (ab)uses linkify would disqualify it at present).
+>
+> I've imported Scott's initial `mediawiki.pm` into a repository at
+> <http://github.com/jmtd/mediawiki.pm> as a start.
+> -- [[Jon]]
+
+----
The iki-fast-load ruby script from the u32 page is given below:
@@ -612,3 +646,24 @@ Mediawiki.pm - A plugin which supports mediawiki format.
}
1
+
+----
+
+Hello. Got ikiwiki running and I'm planning to convert my personal
+Mediawiki wiki to ikiwiki so I can take offline copies around. If anyone
+has an old copy of the instructions, or any advice on where to start I'd be
+glad to hear it. Otherwise I'm just going to chronicle my journey on the
+page.--[[users/Chadius]]
+
+> Today I saw that someone is working to import wikipedia into git.
+> <http://www.gossamer-threads.com/lists/wiki/foundation/181163>
+> Since wikipedia uses mediawiki, perhaps his importer will work
+> on mediawiki in general. It seems to produce output that could be
+> used by the [[plugins/contrib/mediawiki]] plugin, if the filenames
+> were fixed to use the right extension. --[[Joey]]
+
+>> Here's another I found while browsing around starting from the link you gave Joey<br />
+>> <http://github.com/scy/levitation><br />
+>> As I don't run mediawiki anymore, but I still have my xz/gzip-compressed XML dumps,
+>> it's certainly easier for me to do it this way; also a file or a set of files is easier to lug
+>> around on some medium than a full mysqld or postgres master and relevant databases.
diff --git a/doc/tips/dot_cgi.mdwn b/doc/tips/dot_cgi.mdwn
index da55c1f1c..42a0aa7bf 100644
--- a/doc/tips/dot_cgi.mdwn
+++ b/doc/tips/dot_cgi.mdwn
@@ -26,6 +26,8 @@ configuration changes should work anywhere.
Or, if you've put it in a `~/public_html`, edit
`/etc/apache2/mods-available/userdir.conf`.
+  You may also need to install some dependencies for ikiwiki's CGI to work under apache2: `libcgi-formbuilder-perl` and `libcgi-session-perl`.
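+
+  On a Debian or Ubuntu system, for example:
+
+        apt-get install libcgi-formbuilder-perl libcgi-session-perl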
+
* You may also want to enable the [[plugins/404]] plugin.
To make apache use it, the apache config file will need a further
modification to make it use ikiwiki's CGI as the apache 404 handler.
diff --git a/doc/tips/dot_cgi/discussion.mdwn b/doc/tips/dot_cgi/discussion.mdwn
index 124b9edff..a8854565c 100644
--- a/doc/tips/dot_cgi/discussion.mdwn
+++ b/doc/tips/dot_cgi/discussion.mdwn
@@ -34,3 +34,13 @@ there), and so I need to choose the more secure solution. --Ivan Z.
>> The easiest way though is probably
>> to add your ssh key to the special user's `.ssh/authorized_keys`
>> and push that way. --[[Joey]]
+
+## apache2 - run from userdir
+
+I followed the instructions but couldn't get it to run from a user dir (running Ubuntu Jaunty).
+I finally got it working once I had symlinked as follows (and restarted apache):
+
+    # cd /etc/apache2/mods-enabled
+    # ln -s ../mods-available/userdir.load .
+    # ln -s ../mods-available/userdir.conf .
+
+
diff --git a/doc/tips/follow_wikilinks_from_inside_vim.mdwn b/doc/tips/follow_wikilinks_from_inside_vim.mdwn
new file mode 100644
index 000000000..015a4ecee
--- /dev/null
+++ b/doc/tips/follow_wikilinks_from_inside_vim.mdwn
@@ -0,0 +1,47 @@
+The [ikiwiki-nav](http://www.vim.org/scripts/script.php?script_id=2968) plugin
+for vim eases the editing of IkiWiki wikis by letting you "follow" the
+wikilinks in your file (page), loading the file associated with a given
+wikilink in vim. The plugin takes care of following the ikiwiki linking rules
+to figure out which file a wikilink points to.
+
+The plugin also includes commands (and mappings) to make the cursor jump to the
+previous/next wikilink in the current file.
+
+## Jumping to pages
+
+To open the file associated with a wikilink, place the cursor over its text and
+hit Enter (`<CR>`). This functionality is also available through the
+`:IkiJumpToPage` command.
+
+## Moving to next/previous wikilink in current file
+
+`Ctrl-j` will move the cursor to the next wikilink. `Ctrl-k` will move it to the
+previous wikilink. This functionality is also available through the
+`:IkiNextWikiLink` command. This command takes one argument, the direction to
+move in:
+
+ * `:IkiNextWikiLink 0` will look forward for the wikilink
+ * `:IkiNextWikiLink 1` will look backwards for the wikilink
+
+## Installation
+
+Copy the `ikiwiki_nav.vim` file to your `.vim/ftplugin` directory.
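+
+For example (assuming the file is in the current directory):
+
+    mkdir -p ~/.vim/ftplugin
+    cp ikiwiki_nav.vim ~/.vim/ftplugin/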
+
+## Current issues:
+
+ * The plugin only works for wikilinks contained in a single text line;
+   multiline wikilinks are not (yet) seen as such.
+
+## Notes
+
+The official releases of the plugin are on the
+[vim.org script page](http://www.vim.org/scripts/script.php?script_id=2968).
+
+The latest version of this script can be found at the following location:
+
+<http://git.devnull.li/cgi-bin/gitweb.cgi?p=ikiwiki-nav.git;a=blob;f=ftplugin/ikiwiki_nav.vim;hb=HEAD>
+
+Any feedback you can provide is appreciated; the contact details can be found
+inside the plugin.
+
+[[!tag vim]]
diff --git a/doc/tips/github.mdwn b/doc/tips/github.mdwn
index 9bdf15751..d745bfcc5 100644
--- a/doc/tips/github.mdwn
+++ b/doc/tips/github.mdwn
@@ -5,7 +5,7 @@ site. Your laptop is used to generate and publish changes to it.
This is possible because github now supports
[github pages](http://github.com/blog/272-github-pages).
-Note that github limits free accounts to 100 mb of git storage. It's
+Note that github limits free accounts to 100 MB of git storage. It's
unlikely that a small wiki or blog will outgrow this, but we are keeping
two copies of the website in git (source and the compiled site), and all
historical versions too. So it could happen. If it does, you can pay github
diff --git a/doc/tips/howto_limit_to_admin_users.mdwn b/doc/tips/howto_limit_to_admin_users.mdwn
new file mode 100644
index 000000000..4d579327a
--- /dev/null
+++ b/doc/tips/howto_limit_to_admin_users.mdwn
@@ -0,0 +1,9 @@
+Enable [[plugins/lockedit]] in your setup file.
+
+For example:
+
+ add_plugins => [qw{goodstuff table rawhtml template embed typography sidebar img remove lockedit}],
+
+Then, to allow only admin users to edit pages, simply specify a pagespec matching all pages in the `.setup` file:
+
+ locked_pages => '*',
diff --git a/doc/tips/htaccess_file.mdwn b/doc/tips/htaccess_file.mdwn
new file mode 100644
index 000000000..6964cf24e
--- /dev/null
+++ b/doc/tips/htaccess_file.mdwn
@@ -0,0 +1,27 @@
+If you try to include a `.htaccess` file in your wiki's source, in order to
+configure the web server, you'll find that ikiwiki excludes it from
+processing. In fact, ikiwiki excludes any file starting with a dot, as well
+as a lot of other files, for good security reasons.
+
+You can tell ikiwiki not to exclude the .htaccess file by adding this to
+your setup file:
+
+ include => '^\.htaccess$',
+
+Caution! Before you do that, please think for a minute about who can edit
+your wiki. Are attachment uploads enabled? Can users commit changes
+directly to the version control system? Do you trust everyone who can
+make a change to not do Bad Things with the htaccess file? Do you trust
+everyone who *might* be able to make a change in the future? Note that a
+determined attacker who can write to the htaccess file can probably get a
+shell on your web server.
+
+If any of these questions have given you pause, I suggest you find a
+different way to configure the web server. One way is to not put the
+`.htaccess` file under ikiwiki's control, and just manually install it
+in the destdir. --[[Joey]]
+
+[Apache's documentation](http://httpd.apache.org/docs/2.2/howto/htaccess.html)
+says:
+> In general, you should never use .htaccess files unless you don't have
+> access to the main server configuration file.
diff --git a/doc/tips/html5.mdwn b/doc/tips/html5.mdwn
new file mode 100644
index 000000000..cb71c0887
--- /dev/null
+++ b/doc/tips/html5.mdwn
@@ -0,0 +1,27 @@
+First, if you just want to embed videos using the html5 `<video>` tag,
+you can do that without switching anything else to html5.
+However, if you want to fully enter the brave new world of html5, read on..
+
+Currently, ikiwiki does not use html5 by default. There is a `html5`
+setting that can be turned on, in your setup file. Rebuild with it set, and
+lots of fancy new semantic tags will be used all over the place.
+
+You may need to adapt your CSS for html5. While all the class and id names
+are the same, some of the `div` elements are changed to other things.
+Ikiwiki's default CSS will work in both modes.
+
+The html5 support is still experimental, and may break in some browsers.
+No care is taken to add backwards compatibility hacks for browsers that
+are not html5 aware (like MSIE). If you want to include the javascript with
+those hacks, you can edit `page.tmpl` to do so.
+[Dive Into HTML5](http://diveintohtml5.org/) is a good reference for
+current compatibility issues and workarounds with html5. And a remotely-loadable
+JS shiv for enabling HTML5 elements in IE is available through [html5shiv at Google Code](http://code.google.com/p/html5shiv/).
+
+---
+
+Known ikiwiki-specific issues:
+
+* [[plugins/htmltidy]] uses `tidy`, which is not html5 aware, so if you
+ have that enabled, it will mangle it back to html4.
+* [[plugins/toc]] does not understand the html5 outline algorithm.
diff --git a/doc/tips/ikiwiki_on_Mac_OS_X_Snow_Leopard.mdwn b/doc/tips/ikiwiki_on_Mac_OS_X_Snow_Leopard.mdwn
new file mode 100644
index 000000000..bb1db0cbb
--- /dev/null
+++ b/doc/tips/ikiwiki_on_Mac_OS_X_Snow_Leopard.mdwn
@@ -0,0 +1,184 @@
+These are some notes on installing ikiwiki on Mac OS X Snow Leopard. I have a three-year-old machine with a lot of stuff on it, so it took quite a while; YMMV.
+
+The best part of installing ikiwiki was learning how to use git. I had never used source control before, but it's pretty slick.
+
+
+## installing git:
+
+    cd /opt/ikiwiki/install
+
+    curl http://kernel.org/pub/software/scm/git/git-(latest version).tar.gz -O
+    tar xzvf git-(latest version).tar.gz
+    cd git-(latest version)
+
+    ./configure --prefix=/usr/local
+    make prefix=/usr/local all
+    sudo make install
+
+    git config --global user.name "firstname lastname"
+    git config --global user.email "email here"
+    git config --global color.ui "auto"
+
+    curl http://www.kernel.org/pub/software/scm/git/git-manpages-1.7.3.1.tar.gz | sudo tar -xzC /usr/local/share/man/
+
+
+## installing ikiwiki:
+I had terrible trouble installing ikiwiki. It turned out I had accidentally installed Perl through ports. Uninstalling that made everything install nicely.
+I got an error on msgfmt. Turns out this is a program in gettext. I installed that and it fixed the error.
+
+    cd ..
+
+    git clone git://git.ikiwiki.info/
+    cd git.ikiwiki.info/
+
+    perl Makefile.PL LIB=/Library/Perl/5.10.0
+    make
+    sudo make install
+
+When you make ikiwiki, it gives you a `.git` folder with the ikiwiki files. Stay out of this folder: you want to learn how to create a clone and make all your changes in the clone, and when you push the changes, ikiwiki will update. I moved a file in this folder by accident (because I named my working file the same) and I couldn't get into the setup page; I had apparently messed up my ikiwiki git repository. I did a pull into my clone, deleted the repository and webserver/cgi folders, and ran a new setup. Then I did a git clone, dragged all my old files into the new clone, did the git dance, and did git push. Then the angels sang.
+
+
+## using git from inside a git folder:
+
+Start with git clone, then learn to do the git dance like this:
+
+    git pull
+
+    # make your changes to your clone, then:
+
+    git commit -a -m "message here"
+    git push
+
+
+When you can't get into the setup page, or you get strange behavior after a setup update, the Utilities > Console app is your friend.
+
+## installing gitweb
+
+    cd ../git-1.7.3.1/gitweb
+
+    make GITWEB_PROJECTROOT="/opt/ikiwiki/" GITWEB_CSS="/gitweb.css" GITWEB_LOGO="/git-logo.png" GITWEB_FAVICON="/git-favicon.png" GITWEB_JS="/gitweb.js"
+
+    cp gitweb.cgi /Library/WebServer/CGI-Executables/
+    cp /usr/local/share/gitweb/static/git-favicon.png /Library/WebServer/Documents/
+    cp /usr/local/share/gitweb/static/git-logo.png /Library/WebServer/Documents/
+    cp /usr/local/share/gitweb/static/gitweb.css /Library/WebServer/Documents/
+    cp /usr/local/share/gitweb/static/gitweb.js /Library/WebServer/Documents/
+
+    sudo chmod 2755 /Library/WebServer/CGI-Executables/gitweb.cgi
+    sudo chmod 2755 /Library/WebServer/Documents/git-favicon.png
+    sudo chmod 2755 /Library/WebServer/Documents/git-logo.png
+    sudo chmod 2755 /Library/WebServer/Documents/gitweb.css
+    sudo chmod 2755 /Library/WebServer/Documents/gitweb.js
+
+
+## installing xapian:
+
+Download xapian and omega.
+
+I needed pcre first:
+
+    sudo port install pcre
+
+Then, in the xapian source tree:
+
+    ./configure
+    make
+    sudo make install
+
+
+## installing omega:
+
+I had a build error due to libiconv undefined symbols; `sudo port deactivate libiconv` took care of it. After the install I had trouble with ikiwiki, so I did a `sudo port install libiconv` and ikiwiki came back.
+
+    ./configure
+    make
+    sudo make install
+
+
+## installing Search::Xapian from CPAN
+
+For some reason this wouldn't install using the CPAN console, so I went to CPAN online and downloaded the source.
+
+    perl Makefile.PL
+    make
+    make test
+    sudo make install
+
+It installed without issue, so I'm baffled why it didn't install from the command line.
+
+
+## setup file
+
+    #!/usr/bin/perl
+    # Ikiwiki setup automator.
+
+    # This setup file causes ikiwiki to create a wiki, check it into revision
+    # control, generate a setup file for the new wiki, and set everything up.
+
+    # Just run: ikiwiki -setup /etc/ikiwiki/auto.setup
+
+    # By default, it asks a few questions, and confines itself to the user's home
+    # directory. You can edit it to change what it asks questions about, or to
+    # modify the values to use site-specific settings.
+    require IkiWiki::Setup::Automator;
+
+    our $wikiname="your wiki";
+    our $wikiname_short="yourwiki";
+    our $rcs="git";
+    our $admin="your name";
+    use Net::Domain q{hostfqdn};
+    our $domain="your.domain";
+
+    IkiWiki::Setup::Automator->import(
+        wikiname => $wikiname,
+        adminuser => [$admin],
+        rcs => $rcs,
+        srcdir => "/opt/ikiwiki/$wikiname_short",
+        destdir => "/Library/WebServer/Documents/$wikiname_short",
+        repository => "/opt/ikiwiki/$wikiname_short.".($rcs eq "monotone" ? "mtn" : $rcs),
+        dumpsetup => "/opt/ikiwiki/$wikiname_short.setup",
+        url => "http://$domain/$wikiname_short",
+        cgiurl => "http://$domain/cgi-bin/$wikiname_short/ikiwiki.cgi",
+        cgi_wrapper => "/Library/WebServer/CGI-Executables/$wikiname_short/ikiwiki.cgi",
+        adminemail => "your\@email.com",
+        add_plugins => [qw{goodstuff websetup}],
+        disable_plugins => [qw{}],
+        libdir => "/opt/ikiwiki/.ikiwiki",
+        rss => 1,
+        atom => 1,
+        syslog => 1,
+    )
+
+
+## turning on search plugin:
+
+I turned on the plugin from the setup page in ikiwiki, but it gave an error when I went to search: "Error: /usr/lib/cgi-bin/omega/omega failed: No such file or directory".
+I did a `find / -name omega -print` and found the omega program at `/usr/local/lib/xapian-omega/bin/omega`.
+
+Then I went into the 2wiki.setup file, replaced the bad path, updated, and badda-boom badda-bing.
+
+
+
diff --git a/doc/tips/ikiwiki_on_Mac_OS_X_Snow_Leopard/discussion.mdwn b/doc/tips/ikiwiki_on_Mac_OS_X_Snow_Leopard/discussion.mdwn
new file mode 100644
index 000000000..ae3969879
--- /dev/null
+++ b/doc/tips/ikiwiki_on_Mac_OS_X_Snow_Leopard/discussion.mdwn
@@ -0,0 +1 @@
+If you want to do a bunch of manual labor, this is good, but most people probably want to get ikiwiki via a package system. My Mac laptop's ikiwiki is installed from pkgsrc. --[[schmonz]]
diff --git a/doc/tips/inside_dot_ikiwiki.mdwn b/doc/tips/inside_dot_ikiwiki.mdwn
index b81ffae8d..a74d00f47 100644
--- a/doc/tips/inside_dot_ikiwiki.mdwn
+++ b/doc/tips/inside_dot_ikiwiki.mdwn
@@ -6,9 +6,10 @@ you need/want to.
## the index
-`.ikiwiki/indexdb` contains a cache of information about pages, as well
-as all persisitant state about pages. It used to be a (semi) human-readable
-text file, but is not anymore.
+`.ikiwiki/indexdb` contains a cache of information about pages.
+This information can always be recalculated by rebuilding the wiki.
+(So the file is safe to delete and need not be backed up.)
+It used to be a (semi) human-readable text file, but is not anymore.
To dump the contents of the file, enter a perl command like this.
diff --git a/doc/tips/inside_dot_ikiwiki/discussion.mdwn b/doc/tips/inside_dot_ikiwiki/discussion.mdwn
index 34d5b9252..69df369ec 100644
--- a/doc/tips/inside_dot_ikiwiki/discussion.mdwn
+++ b/doc/tips/inside_dot_ikiwiki/discussion.mdwn
@@ -6,14 +6,15 @@ My database appears corrupted:
No idea how this happened. I've blown it away and recreated it but, for future reference, is there any less violent way to recover from this situation? I miss having the correct created and last edited times. --[[sabr]]
> update: fixed ctimes and mtimes using [these instructions](http://u32.net/Mediawiki_Conversion/Git_Import/#Correct%20Creation%20and%20Last%20Edited%20time) --[[sabr]]
-> That's overly complex. Just run `ikiwiki -setup your.setup -getctime`.
+> That's overly complex. Just run `ikiwiki -setup your.setup -gettime`.
> BTW, I'd be interested in examining such a corrupt storable file to try
> to see what happened to it. --[[Joey]]
->> --getctime appears to only set the last edited date. It's not supposed to set the creation date, is it? The only place that info is stored is in the git repo.
+>> --gettime appears to only set the last edited date. It's not supposed to set the creation date, is it? The only place that info is stored is in the git repo.
>>> Pulling the page creation date out of the git history is exactly what
->>> --getctime does. --[[Joey]]
+>>> --gettime does. (It used to be called --getctime, and only do that; now
+>>> it also pulls out the last modified date). --[[Joey]]
>> Alas, I seem to have lost the bad index file to periodic /tmp wiping; I'll send it to you if it happens again. --[[sabr]]
diff --git a/doc/tips/integrated_issue_tracking_with_ikiwiki.mdwn b/doc/tips/integrated_issue_tracking_with_ikiwiki.mdwn
index ea7835b33..0c871d6c0 100644
--- a/doc/tips/integrated_issue_tracking_with_ikiwiki.mdwn
+++ b/doc/tips/integrated_issue_tracking_with_ikiwiki.mdwn
@@ -155,7 +155,7 @@ be inlined into a given page. A few examples:
* A typical list of all open bugs, with their full text, and a form to post new
bugs.
- \[[!inline pages="bugs/* and !link(done) and !*/Discussion" actions=yes postform=yes show=0]]
+ \[[!inline pages="bugs/* and !link(done) and !*/Discussion" actions=yes postform=yes show=0 rootpage="bugs"]]
* Index of the 30 most recently fixed bugs.
diff --git a/doc/tips/laptop_wiki_with_git.mdwn b/doc/tips/laptop_wiki_with_git.mdwn
index 9758beb80..cfa565d1a 100644
--- a/doc/tips/laptop_wiki_with_git.mdwn
+++ b/doc/tips/laptop_wiki_with_git.mdwn
@@ -1,3 +1,5 @@
+[[!toc]]
+
Using ikiwiki with the [[rcs/git]] backend, some interesting things can be done
with creating mirrors (or, really, branches) of a wiki. In this tip, I'll
assume your wiki is located on a server, and you want to take a copy with
@@ -8,6 +10,8 @@ version on the laptop, perhaps while offline. You can browse and edit the
wiki using a local web server. When you're ready, you can manually push the
changes to the main wiki on the server.
+## simple clone approach
+
First, set up the wiki on the server, if it isn't already. Nothing special
needs to be done here, just follow the regular instructions in [[setup]]
for setting up ikiwiki with git.
@@ -49,3 +53,19 @@ update the wiki, with a command such as `ikiwiki -setup wiki.setup -refresh`.
If you'd like it to automatically update when changes are merged in, you
can simply make a symlink `post-merge` hook pointing at the `post-update`
hook ikiwiki created.
+
+## bare mirror approach
+
+As above, set up a normal ikiwiki on the server, with the usual bare repository.
+
+Next, `git clone --mirror server:/path/to/bare/repository`
+
+This will be used as the $REPOSITORY on the laptop. Then you can follow
+the instructions in [[setup by hand|/setup/byhand]] as per a normal ikiwiki
+installation. This means that you can clone from the local bare repository
+as many times as you want (thus being able to have a repository which is
+used by the ikiwiki CGI, and another which you can use for updating via
+git).
+
+Use standard git commands, run in the laptop's bare git repository
+to handle pulling from and pushing to the server.
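+
+For example (a sketch; the path is whatever you used for $REPOSITORY above):
+
+    cd /path/to/laptop/bare/repository
+    git fetch origin        # pull down changes made on the server
+    git push origin         # send the changes committed via your local clones back up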
diff --git a/doc/tips/nearlyfreespeech.mdwn b/doc/tips/nearlyfreespeech.mdwn
index 4b3b02eac..a3d1ec678 100644
--- a/doc/tips/nearlyfreespeech.mdwn
+++ b/doc/tips/nearlyfreespeech.mdwn
@@ -14,7 +14,8 @@ After you [get an account](https://www.nearlyfreespeech.net/about/start.php),
create a site using their web interface.
Mine is named `ikiwiki-test` and I used their DNS instead of getting my
-own, resulting in <http://ikiwiki-test.nfshost.com/>
+own, resulting in <http://ikiwiki-test.nfshost.com/>. (Not being kept up
+anymore.)
They gave me 2 cents free funding for signing up, which is enough to pay
for 10 megabytes of bandwidth, or about a thousand typical page views, at
diff --git a/doc/tips/nearlyfreespeech/discussion.mdwn b/doc/tips/nearlyfreespeech/discussion.mdwn
index a003760b9..b76432566 100644
--- a/doc/tips/nearlyfreespeech/discussion.mdwn
+++ b/doc/tips/nearlyfreespeech/discussion.mdwn
@@ -9,3 +9,14 @@ BEGIN failed--compilation aborted at (eval 19) line 2.
perl is 5.8.9
> This is fixed in 3.1415926. --[[Joey]]
+
+
+Hi!<br />
+How can I upgrade my nearlyfreespeech installation of ikiwiki? To install it I followed your instructions here.<br>
+But what if I now want to upgrade it to a newer version?<br />
+Thanks for your incredible work!
+
+> You can move `~/ikiwiki` out of the way and then re-download and install
+> ikiwiki. --[[Joey]]
+
+Thanks a lot Joey. :-)
diff --git a/doc/tips/optimising_ikiwiki.mdwn b/doc/tips/optimising_ikiwiki.mdwn
new file mode 100644
index 000000000..d66ee9343
--- /dev/null
+++ b/doc/tips/optimising_ikiwiki.mdwn
@@ -0,0 +1,188 @@
+Ikiwiki is a wiki compiler, which means that, unlike a traditional wiki,
+all the work needed to display your wiki is done up front. Where you can
+see it and get annoyed at it. In some ways, this is better than a wiki
+where a page view means running a program to generate the page on the fly.
+
+But enough excuses. If ikiwiki is taking too long to build your wiki,
+let's fix that. Read on for some common problems that can be avoided to
+make ikiwiki run quick.
+
+[[!toc]]
+
+(And if none of that helps, file a [[bug|bugs]]. One other great thing about
+ikiwiki being a wiki compiler is that it's easy to provide a test case when
+it's slow, and get the problem fixed!)
+
+## rebuild vs refresh
+
+Are you building your wiki by running a command like this?
+
+ ikiwiki -setup my.setup
+
+If so, you're always telling ikiwiki to rebuild the entire site, from
+scratch. But, ikiwiki is smart, it can incrementally update a site,
+building only things affected by the changes you make. You just have to let
+it do so:
+
+ ikiwiki -setup my.setup -refresh
+
+Ikiwiki automatically uses an incremental refresh like this when handling
+a web edit, or when run from a [[rcs]] post-commit hook. (If you've
+configured the hook in the usual way.) Most people who have run into this
+problem got in the habit of running `ikiwiki -setup my.setup` by hand
+when their wiki was small, and found it got slower as they added pages.
+
+## use the latest version
+
+If your version of ikiwiki is not [[!version]], try upgrading. New
+optimisations are frequently added to ikiwiki, some of them yielding
+*enormous* speed increases.
+
+## run ikiwiki in verbose mode
+
+Try changing a page, and run ikiwiki with `-v` so it will tell you
+everything it does to deal with that changed page. Take note of
+which other pages are rebuilt, and which parts of the build take a long
+time. This can help you zero in on individual pages that contain some of
+the expensive things listed below.
+
+## expensive inlines
+
+Do you have an archive page for your blog that shows all posts,
+using an inline that looks like this?
+
+ \[[!inline pages="blog/*" show=0]]
+
+Or maybe you have some tag pages for your blog that show all tagged posts,
+something like this?
+
+ \[[!inline pages="blog/* and tagged(foo)" show=0]]
+
+These are expensive, because they have to be updated whenever you modify a
+matching page. And, if there are a lot of pages, it generates a large html
+file, which is a lot of work. And also large RSS/Atom files, which is even
+more work!
+
+To optimise the inline, consider enabling quick archive mode. Then the
+inline will only need to be updated when new pages are added; no RSS
+or Atom feeds will be built, and the generated html file will be much
+smaller.
+
+ \[[!inline pages="blog/*" show=0 archive=yes quick=yes]]
+
+ \[[!inline pages="blog/* and link(tag)" show=0 archive=yes quick=yes]]
+
+Only downsides: This won't show titles set by the [[ikiwiki/directive/meta]]
+directive. And there's no RSS feed for users to use -- but if this page
+is only for the archives or tag for your blog, users should be subscribing
+to the blog's main page's RSS feed instead.
+
+For the main blog page, the inline should only show the latest N posts,
+which won't be a performance problem:
+
+ \[[!inline pages="blog/*" show=30]]
+
+## expensive maps
+
+Do you have a sitemap type page, that uses a map directive like this?
+
+ \[[!map pages="*" show=title]]
+
+This is expensive because it has to be updated whenever a page is modified.
+The resulting html file might get big and expensive to generate as you
+keep adding pages.
+
+First, consider removing the "show=title". Then the map will not show page
+titles set by the [[ikiwiki/directive/meta]] directive -- but will also
+only need to be generated when pages are added or removed, not for every
+page change.
+
+Consider limiting the map to only show the toplevel pages of your site,
+like this:
+
+ \[[!map pages="* and !*/*" show=title]]
+
+Or, alternatively, to drop from the map parts of the site that accumulate
+lots of pages, like individual blog posts:
+
+ \[[!map pages="* and !blog/*" show=title]]
+
+## sidebar issues
+
+If you enable the [[plugins/sidebar]] plugin, be careful of what you put in
+your sidebar. Any change that affects what is displayed by the sidebar
+will require an update of *every* page in the wiki, since all pages include
+the sidebar.
+
+Putting an expensive map or inline in the sidebar is the most common cause
+of problems. At its worst, it can result in any change to any page in the
+wiki requiring every page to be rebuilt.
+
+## avoid htmltidy
+
+A few plugins do neat stuff, but slowly. Such plugins are tagged
+[[plugins/type/slow]].
+
+The worst offender is possibly [[plugins/htmltidy]]. This runs an external
+`tidy` program on each page that is built, which is necessarily slow. So don't
+use it unless you really need it; consider using the faster
+[[plugins/htmlbalance]] instead.
+
+## be careful of large linkmaps
+
+[[plugins/Linkmap]] generates a cool map of links between pages, but
+it does it using the `graphviz` program. And any changes to links between
+pages on the map require an update. So, avoid using this to map a large number
+of pages with frequently changing links. For example, using it to map
+all the pages on a traditional, highly WikiLinked wiki, is asking for things
+to be slow. But using it to map a few related pages is probably fine.
+
+This site's own [[plugins/linkmap]] rarely slows it down, because it
+only shows the index page, and the small set of pages that link to it.
+That is accomplished as follows:
+
+    \[[!linkmap pages="index or backlink(index)"]]
+
+## overhead of the search plugin
+
+Be aware that the [[plugins/search]] plugin has to update the search index
+whenever any page is changed. This can slow things down somewhat.
+
+## profiling
+
+If you have a repeatable change that ikiwiki takes a long time to build,
+and none of the above help, the next thing to consider is profiling
+ikiwiki.
+
+The best way to do it is:
+
+* Install [[!cpan Devel::NYTProf]]
+* `PERL5OPT=-d:NYTProf`
+* `export PERL5OPT`
+* Now run ikiwiki as usual, and it will generate a `nytprof.out` file.
+* Run `nytprofhtml` to generate html files.
+* Those can be examined to see what parts of ikiwiki are being slow.
+
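+Putting those steps together into a shell session (a sketch; `my.setup` stands
+in for your real setup file):
+
+    PERL5OPT=-d:NYTProf
+    export PERL5OPT
+    ikiwiki -setup my.setup -refresh
+    nytprofhtml
+    # then open the report nytprofhtml generates (nytprof/index.html by default)
+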
+## scaling to large numbers of pages
+
+Finally, let's think about how a huge number of pages can affect ikiwiki.
+
+* Every time it's run, ikiwiki has to scan your `srcdir` to find
+ new and changed pages. This is similar in speed to running the `find`
+ command. Obviously, more files will make it take longer.
+
+* Also, to see what pages match a [[ikiwiki/PageSpec]] like "blog/*", it has
+ to check if every page in the wiki matches. These checks are done quite
+ quickly, but still, lots more pages will make PageSpecs more expensive.
+
+* The backlinks calculation has to consider every link on every page
+ in the wiki. (In practice, most pages only link to at most a few dozen
+ other pages, so this is not a `O(N^2)`, but closer to `O(N)`.)
+
+* Ikiwiki also reads and writes an `index` file, which contains information
+ about each page, and so if you have a lot of pages, this file gets large,
+ and more time is spent on it. For a wiki with 2000 pages, this file
+ will run about 500 kb.
+
+If your wiki will have 100 thousand files in it, you might start seeing
+the above contribute to ikiwiki running slowly.
diff --git a/doc/tips/parentlinks_style.mdwn b/doc/tips/parentlinks_style.mdwn
index 5294e5452..f9dfa8f55 100644
--- a/doc/tips/parentlinks_style.mdwn
+++ b/doc/tips/parentlinks_style.mdwn
@@ -6,7 +6,7 @@ a subset of a page's parents. It also provides a few bonus
possibilities, such as styling the parent links depending on their
place in the path.
-[[!toc ]]
+[[!toc levels=2]]
Content
=======
@@ -77,10 +77,29 @@ following lines in `page.tmpl`:
<a href="<TMPL_VAR NAME="URL">" class="height<TMPL_VAR NAME="HEIGHT">">
<TMPL_VAR NAME="PAGE">
</a> /
+ </TMPL_IF>
</TMPL_LOOP>
Then write the appropriate CSS bits for `a.height1`, etc.
+Avoid showing title of toplevel index page
+------------------------------------------
+
+If you don't like having "index" appear on the top page of the wiki,
+but you do want to see the name of the page otherwise, you can use a
+special `HAS_PARENTLINKS` template variable that the plugin provides.
+It is true for every page *except* the toplevel index.
+
+Here is an example of using it to hide the title of the toplevel index
+page:
+
+ <TMPL_LOOP NAME="PARENTLINKS">
+ <a href="<TMPL_VAR NAME=URL>"><TMPL_VAR NAME=PAGE></a>/
+ </TMPL_LOOP>
+ <TMPL_IF HAS_PARENTLINKS>
+ <TMPL_VAR TITLE>
+ </TMPL_IF>
+
Full-blown example
------------------
diff --git a/doc/tips/psgi.mdwn b/doc/tips/psgi.mdwn
new file mode 100644
index 000000000..0d2eeefc8
--- /dev/null
+++ b/doc/tips/psgi.mdwn
@@ -0,0 +1,21 @@
+Here's the app.psgi file if you want to run ikiwiki with [PSGI](http://plackperl.org) instead of apache or other web servers:
+
+ use Plack::App::CGIBin;
+ use Plack::Builder;
+ use Plack::App::File;
+
+ builder {
+ mount '/ikiwiki.cgi' => Plack::App::CGIBin->new(file => './ikiwiki.cgi')->to_app;
+ enable "Plack::Middleware::Static",
+ path => sub { s!(^(?:/[^.]*)?/?$)!${1}/index.html! },
+ root => '.';
+ mount '/' => Plack::App::File->new(root => ".")->to_app;
+ };
+
+Put it in your destdir and now you can run `plackup -p <port>`.
+
+Note that you should configure your `url` and `cgiurl` to point to the listening address of plackup.
+
+Also, the app.psgi residing in the destdir means that /app.psgi is accessible from the web server.
+
+Hopefully some day the ikiwiki web UI will speak PSGI natively.
diff --git a/doc/tips/spam_and_softwaresites.mdwn b/doc/tips/spam_and_softwaresites.mdwn
new file mode 100644
index 000000000..a07889e6b
--- /dev/null
+++ b/doc/tips/spam_and_softwaresites.mdwn
@@ -0,0 +1,87 @@
+Any wiki with a form of web-editing enabled will have to deal with
+spam. (See the [[plugins/blogspam]] plugin for one defensive tool you
+can deploy).
+
+If:
+
+ * you are using ikiwiki to manage the website for a [[examples/softwaresite]]
+ * you allow web-based commits, to let people correct documentation, or report
+ bugs, etc.
+ * the documentation is stored in the same revision control repository as your
+ software
+
+then it is undesirable to have your software's VCS history tainted by spam and spam
+clean-up commits. Here is one approach you can use to prevent this. This
+example is for the [[git]] version control system, but the principles should
+apply to others.
+
+## Isolate web commits to a specific branch
+
+Create a separate branch to contain web-originated edits (named `doc` in this
+example):
+
+ $ git checkout -b doc
+
+Adjust your setup file accordingly:
+
+ gitmaster_branch => 'doc',
+
+## merging good web commits into the master branch
+
+You will want to periodically merge legitimate web-based commits back into
+your master branch. Ensure that there is no spam in the documentation
+branch. If there is, see 'erase spam from the commit history', below, first.
+
+Once you are confident it's clean:
+
+ # ensure you are on the master branch
+ $ git branch
+ doc
+ * master
+ $ git merge --ff doc
+
+## removing spam
+
+### short term
+
+In the short term, just revert the spammy commit.
+
+If the spammy commit was the top-most:
+
+ $ git revert HEAD
+
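+If the spammy commit is further down in the history, name it explicitly
+(sketch; the hash is a placeholder):
+
+    $ git revert 0123abcd
+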
+This will clean the spam out of the files, but it will leave both the spam
+commit and the revert commit in the history.
+
+### erase spam from the commit history
+
+Git allows you to rewrite your commit history. We will take advantage of this
+to eradicate spam from the history of the doc branch.
+
+This is a useful tool, but it is considered bad practise to rewrite the
+history of public repositories. If your software's repository is public, you
+should make it clear that the history of the `doc` branch in your repository
+is unstable.
+
+Once you have been spammed, use `git rebase` to remove the spam commits from
+the history. Assuming that your `doc` branch was split off from a branch
+called `master`:
+
+ # ensure you are on the doc branch
+ $ git branch
+ * doc
+ master
+ $ git rebase --interactive master
+
+In your editor session, you will see a series of lines for each commit made to
+the `doc` branch since it was branched from `master` (or since the last merge
+back into `master`). Delete the lines corresponding to spammy commits, then
+save and exit your editor.
+
+Caveat: if there are no commits you want to keep (i.e. all the commits since
+the last merge into master are either spam or spam reverts) then `git rebase`
+will abort. Therefore, this approach only works if you have at least one
+non-spam commit to the documentation since the last merge into `master`. For
+this reason, it's best to wait until you have at least one
+commit you want merged back into the main history before doing a rebase,
+and until then, tackle spam with reverts.
diff --git a/doc/tips/spam_and_softwaresites/discussion.mdwn b/doc/tips/spam_and_softwaresites/discussion.mdwn
new file mode 100644
index 000000000..21f0a5d7e
--- /dev/null
+++ b/doc/tips/spam_and_softwaresites/discussion.mdwn
@@ -0,0 +1,8 @@
+In the cleanup spam section:
+
+> Caveat: if there are no commits you want to keep (i.e. all the commits since the last merge into master are either spam or spam reverts) then git rebase will abort.
+
+Wouldn't it be enough then to use `git reset --hard` to the desired last good commit?
+
+regards,
+iustin
diff --git a/doc/tips/switching_to_usedirs.mdwn b/doc/tips/switching_to_usedirs.mdwn
index 183ce00ac..92871439f 100644
--- a/doc/tips/switching_to_usedirs.mdwn
+++ b/doc/tips/switching_to_usedirs.mdwn
@@ -8,9 +8,7 @@ to usedirs, or edit your setup file and turn usedirs back off.
or manually.
* Since usedirs is enabled, ikiwiki will have created a bunch of new
html files. Where before ikiwiki generated a `dest/foo.html`, now it will
- generate `dest/foo/index.html`. But, the old html files will still be
- present too. Remove them:
- find dest -name \*.html -not -name index.html -exec rm {} \;
+ generate `dest/foo/index.html`. The old html files will be removed.
* If you have a blog that is aggregated on a Planet or similar, all the
items in the RSS or atom feed will seem like new posts, since their URLs
have changed. See [[howto_avoid_flooding_aggregators]] for a workaround.
diff --git a/doc/tips/untrusted_git_push.mdwn b/doc/tips/untrusted_git_push.mdwn
index 3573a0ddf..948a55063 100644
--- a/doc/tips/untrusted_git_push.mdwn
+++ b/doc/tips/untrusted_git_push.mdwn
@@ -68,7 +68,7 @@ untrusted changes. It should *not* include the user that ikiwiki normally runs
as.
Once you're done modifying the setup file, don't forget to run
-`ikiwiki -setup --refresh --wrappers` on it.
+`ikiwiki --setup ikiwiki.setup --refresh --wrappers` on it.
## git setup
@@ -112,11 +112,3 @@ abort the push before refs are updated. However, the changeset will still
be present in your repository, wasting space. Since nothing refers to it,
it will be expired eventually. You can speed up the expiry by running `git
prune`.
-
-When aborting a push, ikiwiki displays an error message about why it didn't
-accept it. If using git over ssh, the user will see this error message,
-which is probably useful to them. But `git-daemon` is buggy, and hides this
-message from the user. This can make it hard for users to figure out why
-their push was rejected. (If this happens to you, look at "'git log --stat
-origin/master..`" and think about whether your changes would be accepted
-over the web interface.)
diff --git a/doc/tips/vim_and_ikiwiki.mdwn b/doc/tips/vim_and_ikiwiki.mdwn
new file mode 100644
index 000000000..e4136aa5d
--- /dev/null
+++ b/doc/tips/vim_and_ikiwiki.mdwn
@@ -0,0 +1,28 @@
+# Vim and ikiwiki
+
+## Syntax highlighting
+
+[ikiwiki-syntax](http://www.vim.org/scripts/script.php?script_id=3156) is a vim
+syntax highlighting file for ikiwiki [[ikiwiki/markdown]] files. It highlights
+directives and wikilinks. It only supports prefixed directives, i.e.,
+\[[!directive foo=bar baz]], not the old format with spaces.
+
+------
+
+The previous syntax definition for ikiwiki links is at [[vim_syntax_highlighting/ikiwiki.vim]]; however,
+it seems to not be [[maintained
+anymore|forum/navigation_of_wiki_pages_on_local_filesystem_with_vim#syn-maintenance]],
+and it has some [[issues|forum/ikiwiki_vim_syntaxfile]].
+
+## Page creation and navigation
+
+The [ikiwiki-nav](http://www.vim.org/scripts/script.php?script_id=2968) package
+is a vim plugin that enables you to do the following from inside vim:
+
+ * Jumping to the file corresponding to the wikilink under the cursor.
+ * Creating the file corresponding to the wikilink under the cursor (including
+ directories if necessary.)
+ * Jumping to the previous/next wikilink in the current file.
+ * Autocompleting link names.
+
+Download it from [here](http://www.vim.org/scripts/script.php?script_id=2968).
diff --git a/doc/tips/vim_syntax_highlighting.mdwn b/doc/tips/vim_syntax_highlighting.mdwn
index 172b763c3..8f2fdc1f0 100644
--- a/doc/tips/vim_syntax_highlighting.mdwn
+++ b/doc/tips/vim_syntax_highlighting.mdwn
@@ -1,4 +1,20 @@
-[[ikiwiki.vim]] is a vim syntax highlighting file for ikiwiki
-[[ikiwiki/markdown]] files.
+This page is deprecated. See [[tips/vim_and_ikiwiki]] for the most up-to-date
+content.
-Installation instructions are at the top of the file.
+--------
+
+[ikiwiki-syntax](http://www.vim.org/scripts/script.php?script_id=3156) is a vim
+syntax highlighting file for ikiwiki [[ikiwiki/markdown]] files. It highlights
+directives and wikilinks. It only supports prefixed directives, i.e.,
+\[[!directive foo=bar baz]], not the old format with spaces.
+
+See also: [[follow_wikilinks_from_inside_vim]]
+
+------
+
+The previous syntax definition for ikiwiki links is at [[ikiwiki.vim]]; however,
+it seems to not be [[maintained
+anymore|forum/navigation_of_wiki_pages_on_local_filesystem_with_vim#syn-maintenance]],
+and it has some [[issues|forum/ikiwiki_vim_syntaxfile]].
+
+[[!tag vim]]
diff --git a/doc/tips/yaml_setup_files.mdwn b/doc/tips/yaml_setup_files.mdwn
new file mode 100644
index 000000000..e8ab4f144
--- /dev/null
+++ b/doc/tips/yaml_setup_files.mdwn
@@ -0,0 +1,12 @@
+Here's how to convert your existing standard format ikiwiki setup file into
+the new YAML format recently added to ikiwiki.
+
+1. First, make sure you have the [[!cpan YAML]] perl module installed.
+ (Run: `apt-get install libyaml-perl`)
+2. Run: `ikiwiki -setup my.setup -dumpsetup my.setup --set setuptype=Yaml`
+
+The format of the YAML setup file should be fairly self-explanatory.
+
+(To convert the other way, use "setuptype=Standard" instead.)
+
+--[[Joey]]