Diffstat (limited to 'doc/forum')
-rw-r--r--  doc/forum/How_does_ikiwiki_remember_times__63__.mdwn | 25
-rw-r--r--  doc/forum/Migrating_old_repository_to_new_ikiwiki_system__63__.mdwn | 4
-rw-r--r--  doc/forum/an_alternative_approach_to_structured_data.mdwn | 34
-rw-r--r--  doc/forum/google_openid_broken__63__.mdwn | 27
-rw-r--r--  doc/forum/how_to_setup_ikiwiki_on_a_remote_host.mdwn | 35
-rw-r--r--  doc/forum/navigation_of_wiki_pages_on_local_filesystem_with_vim.mdwn | 34
-rw-r--r--  doc/forum/speeding_up_ikiwiki.mdwn | 2
-rw-r--r--  doc/forum/utf8_warnings_for___34____92__xAB__34__.mdwn | 47
-rw-r--r--  doc/forum/where_are_the_tags.mdwn | 9
-rw-r--r--  doc/forum/wishlist-discussion:_Editformular_showing_existing_tags.mdwn | 15
10 files changed, 208 insertions, 24 deletions
diff --git a/doc/forum/How_does_ikiwiki_remember_times__63__.mdwn b/doc/forum/How_does_ikiwiki_remember_times__63__.mdwn
index 6ce576db1..6b7739fd0 100644
--- a/doc/forum/How_does_ikiwiki_remember_times__63__.mdwn
+++ b/doc/forum/How_does_ikiwiki_remember_times__63__.mdwn
@@ -20,15 +20,17 @@ Do I have it right?
> Some VCS, like git, set the file mtimes to the current time
> when making a new checkout, so they will be lost if you do that.
> The creation times can be retrieved using the `--getctime` option.
-> I suppose it might be nice if there were a `--getmtime` that pulled
-> true modification times out of the VCS, but I haven't found it a big
-> deal in practice for the last modification times to be updated to the
-> current time when rebuilding a wiki like this. --[[Joey]]
+> --[[Joey]]
>
> > Thanks for the clarification. I ran some tests of my own to make sure I understand it right, and I'm satisfied
> > that the order of posts in my blog can be retrieved from the VCS using the `--getctime` option, at least if I
> > choose to order my posts by creation time rather than modification time. But I now know that I can't rely on
> > page modification times in ikiwiki as these can be lost permanently.
+>
+> > > Update: It's now renamed to `--gettime`, and pulls both the creation
+> > > and modification times. Also, per [[todo/auto_getctime_on_fresh_build]],
+> > > this is now done automatically the first time ikiwiki builds a
+> > > srcdir. So, no need to worry about this any more! --[[Joey]]
> >
> > I would suggest that there should at least be a `--getmtime` option like you describe, and perhaps that
> > `--getctime` and `--getmtime` be _on by default_. In my opinion the creation times and modification times of
@@ -91,19 +93,6 @@ Do I have it right?
> A quick workaround for me to get modification times right is the following
> little zsh script, which unfortunately only works for git:
- #!/usr/bin/env zsh
-
- set +x
-
- for FILE in **/*(.); do
- TIMES="`git log --pretty=format:%ai $FILE`"
- MTIME="`echo $TIMES | head -n1`"
-
- if [ ! -z $MTIME ]; then
- echo touch -m -d "$MTIME" $FILE
- touch -m -d "$MTIME" $FILE
- fi
-
- done
+>> Elided; no longer needed since --gettime does that, and much faster! --[[Joey]]
> --[[David_Riebenbauer]]
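
For readers on an ikiwiki too old to have `--gettime`, a minimal shell sketch of what the elided script did: restore each file's mtime from the last git commit touching it. This is an illustration only, assuming a git checkout of the srcdir; on current ikiwiki a fresh rebuild handles it automatically, as Joey notes above.

    # sketch: set file mtimes from git history (not needed with --gettime)
    git ls-files | while read -r file; do
        mtime="$(git log -1 --format=%ai -- "$file")"
        [ -n "$mtime" ] && touch -m -d "$mtime" "$file"
    done
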
diff --git a/doc/forum/Migrating_old_repository_to_new_ikiwiki_system__63__.mdwn b/doc/forum/Migrating_old_repository_to_new_ikiwiki_system__63__.mdwn
index fe67e6aba..d7a33b526 100644
--- a/doc/forum/Migrating_old_repository_to_new_ikiwiki_system__63__.mdwn
+++ b/doc/forum/Migrating_old_repository_to_new_ikiwiki_system__63__.mdwn
@@ -20,10 +20,6 @@ How do I set up an ikiwiki system using a pre-existing repository (instead of cr
> recreate the ikiwiki srcdir
> 3. `git clone` from the bare git repository a second time,
> to create a checkout you can manually edit (optional)
-> 4. run `ikiwiki --getctime --setup your.setup`
-> The getctime will ensure page creation times are accurate
-> by putting the info out of the git history,
-> and only needs to be done once.
>
> If you preserved your repository, but not the setup file,
> the easiest way to make one is probably to run
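
For concreteness, the clone steps listed above might look roughly like this in the shell; the repository path and target directories are placeholders, not taken from the thread:

    # clone the preserved bare repository to recreate the ikiwiki srcdir
    git clone /srv/git/wiki.git ~/wiki-srcdir
    # optionally clone it a second time for a checkout you edit by hand
    git clone /srv/git/wiki.git ~/wiki-checkout
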
diff --git a/doc/forum/an_alternative_approach_to_structured_data.mdwn b/doc/forum/an_alternative_approach_to_structured_data.mdwn
index eb6ee4445..6e6af8adb 100644
--- a/doc/forum/an_alternative_approach_to_structured_data.mdwn
+++ b/doc/forum/an_alternative_approach_to_structured_data.mdwn
@@ -18,6 +18,12 @@ I think it could be really powerful and useful, especially if it becomes part of
> It looks like an interesting idea, though I don't have time right now to look at it in depth. -- [[Will]]
+> I agree such a separation makes some sense. But note that the discussion on [[todo/structured_page_data]]
+> talks about associating data types with fields for a good reason: It's hard to later develop a good UI for
+> querying or modifying a page's data if all the data has an implicit type "string". --[[Joey]]
+
+>> I'm not sure that having an implicit type of "string" is really such a bad thing. After all, Perl itself manages with just string and number, and easily converts from one to the other. Strong typing is generally used to (a) restrict what can be done with the data and/or (b) restrict how the data is input. The latter could be done with some sort of validated form, but that, too, could be decoupled from looking up and returning the value of a field. --[[KathrynAndersen]]
+
## Second Pass
I have written additional plugins which integrate with the [[plugins/contrib/field]] plugin to both set and get structured page data.
@@ -27,3 +33,31 @@ I have written additional plugins which integrate with the [[plugins/contrib/fie
* [[plugins/contrib/ymlfront]] - looks for YAML-format data at the front of a page; this is just one possible back-end for the structured data
--[[KathrynAndersen]]
+
+> I'm not an IkiWiki committer ([[Joey]] is the only one I think)
+> but I really like the look of this scheme. In particular,
+> having `getfield` interop with `field` without being *part of*
+> `field` makes me happy, since I'm not very keen on `getfield`'s
+> syntax (i.e. "ugh, yet another mini-markup-language without a
+> proper escaping mechanism"), but this way people can experiment
+> with different syntaxes while keeping `field` for the
+> behind-the-scenes bits.
+>
+>> I've started using `field` on a private site and it's working
+>> well for me; I'll try to do some code review on its
+>> [[plugins/contrib/field/discussion]] page. --s
+>
+> My [[plugins/contrib/album]] plugin could benefit from
+> integration with `field` for photos' captions and so on,
+> probably... I'll try to work on that at some point.
+>
+> [[plugins/contrib/report]] may be doing too much, though:
+> it seems to be a variation on `\[[inline archive="yes"]]`,
+> with an enhanced version of sorting, a mini version of
+> [[todo/wikitrails]], and some other misc. I suspect it could
+> usefully be divided up into discrete features? One good way
+> to do that might be to shuffle bits of its functionality into
+> the IkiWiki distribution and/or separate plugins, until there's
+> nothing left in `report` itself and it can just go away.
+>
+> --[[smcv]]
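
For illustration, the kind of page [[plugins/contrib/ymlfront]] is described above as handling might look something like the sketch below; the delimiters and field names are invented for the example, and the exact syntax is defined by the plugin itself:

    ---
    BookTitle: Example Title
    BookAuthor: A. N. Author
    ---
    The normal markdown body of the page continues here.
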
diff --git a/doc/forum/google_openid_broken__63__.mdwn b/doc/forum/google_openid_broken__63__.mdwn
index 68b44f2c1..96ba2d791 100644
--- a/doc/forum/google_openid_broken__63__.mdwn
+++ b/doc/forum/google_openid_broken__63__.mdwn
@@ -50,3 +50,30 @@ The openid is
<https://www.google.com/accounts/o8/id?id=AItOawltlTwUCL_Fr1siQn94GV65-XwQH5XSku4>
(what a mouthful!), and I don't know who that is or how to use it since it
points to a fairly useless xml document, rather than a web page. --[[Joey]]
+
+> That string is what's received via the discovery protocol. The user logging in with a Google account is not supposed to write that when logging in, but rather <https://www.google.com/accounts/o8/id>. The OpenID client library will accept that and redirect the user to a sign in page, which will return that string as the OpenID. It's not really usable as an identifier for edits and whatnots, but an alternative would be to use the attribute exchange extension to get the email address and display that. See <http://code.google.com/apis/accounts/docs/OpenID.html#Parameters>.
+
+> Yahoo's OpenID implementation works similarly, but I haven't looked at it as much. It uses <https://me.yahoo.com/> to discover the endpoint.
+
+> I've added buttons that submit the two above URLs for logging in with a Google and Yahoo OpenID, respectively, to my locally changed OpenID login plugin.
+
+> Using the Google profile page as the OpenID is really orthogonal to the above. --[[kaol]]
+
+>> First, I don't accept that the openid google returns from their
+>> generic signin url *has* to be so freaking ugly. For contrast,
+>> look at the openid you log in as if you use the yahoo url.
+>> <https://me.yahoo.com/joeyhess#35f22>. Nice and clean, now
+>> munged by ikiwiki to "joeyhess [me.yahoo.com]".
+>>
+>> Displaying email addresses is not really an option, because ikiwiki
+>> can't leak user email addresses like that. Displaying nicknames or
+>> usernames is, see [[todo/Separate_OpenIDs_and_usernames]].
+>>
+>> It would probably be good if the openid plugin could be configured with
+>> a list of generic openid urls, so it can add quick login buttons using
+>> those urls.
+>>
+>> The ugly google url will still be exposed here and there where
+>> a unique user id is needed. That can be avoided by not using the generic
+>> <https://www.google.com/accounts/o8/id>, but instead your own profile
+>> like <http://www.google.com/profiles/joeyhess>. --[[Joey]]
diff --git a/doc/forum/how_to_setup_ikiwiki_on_a_remote_host.mdwn b/doc/forum/how_to_setup_ikiwiki_on_a_remote_host.mdwn
new file mode 100644
index 000000000..1c0f8f561
--- /dev/null
+++ b/doc/forum/how_to_setup_ikiwiki_on_a_remote_host.mdwn
@@ -0,0 +1,35 @@
+Hi all!
+I really like ikiwiki and I tested it on my local machine, but I have one question that I can't answer by reading the documentation (my fault of course)...
+I have an account and some space on a free hosting service.
+Now, I want to put my ikiwiki on this remote web space so that I can browse it from wherever I want.
+I have my source dir and my git dir on my local machine.
+How can I upload my ikiwiki to the remote host and manage it via git, as I do when I test it locally?
+Where is this specified? Where can I find documentation about it?
+
+Thanks in advance!
+
+Pab
+
+> There are several ways to accomplish this, depending on what you really
+> want to do.
+>
+> If your goal is to continue generating the site locally, but then
+> transfer it to the remote host for serving, you could use the
+> [[plugins/rsync]] plugin.
+>
+> If your goal is to install and run the ikiwiki software on the remote host,
+> then you would follow a similar path to the ones described in these tips:
+> [[tips/nearlyfreespeech]] [[tips/DreamHost]]. Or even [[install]] ikiwiki
+> from a regular package if you have that kind of access. Then you could
+> push changes from your local git to git on the remote host to update the
+> wiki. [[tips/Laptop_wiki_with_git]] explains one way to do that.
+> --[[Joey]]
+
+Thanks a lot for your answer.
+The rsync plugin would be perfect but... how would I manage blog posts?
+I mean... is it possible to manage an ikiwiki blog too with the rsync plugin, in the way you told me? --Pab
+
+> If you want to allow people to make comments on your blog, no, the rsync plugin will not help, since it will upload a completely static site where nobody can make comments. Comments require a full IkiWiki setup with CGI enabled, so that people can add content (comments) from the web. --[[KathrynAndersen]]
+
+OK, I understand, thanks.
+Is there any hosting service that permits a full installation of ikiwiki, or am I forced to get a VPS or to maintain a personal server for that? --Pab
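
A minimal sketch of the two workflows suggested above; the host name, paths, and remote name are invented for the example:

    # static approach: build locally, then copy the rendered destdir
    # (roughly what the rsync plugin automates on each refresh)
    ikiwiki --setup ~/wiki.setup --refresh
    rsync -avz ~/public_html/wiki/ user@example.com:public_html/

    # full remote install: push local commits to the remote repository,
    # whose ikiwiki wrapper/hook then rebuilds the wiki
    git remote add web user@example.com:wiki.git
    git push web master
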
diff --git a/doc/forum/navigation_of_wiki_pages_on_local_filesystem_with_vim.mdwn b/doc/forum/navigation_of_wiki_pages_on_local_filesystem_with_vim.mdwn
index 1cb5ed27e..7bc032949 100644
--- a/doc/forum/navigation_of_wiki_pages_on_local_filesystem_with_vim.mdwn
+++ b/doc/forum/navigation_of_wiki_pages_on_local_filesystem_with_vim.mdwn
@@ -1,3 +1,7 @@
+**UPDATE** I have created a [[page|tips/follow_wikilinks_from_inside_vim]] in
+the tips section about the plugin: how to get it, install it, and use it. Check
+it out. --[[jerojasro]]
+
I wrote a vim function to help me navigate the wiki when I'm editing it. It extends the 'gf' (goto file) functionality. Once installed, you place the cursor on a wiki page name and press 'gf' (without the quotes); if the file exists, it gets loaded.
This function takes into account the ikiwiki linking rules when deciding which file to go to.
@@ -64,7 +68,27 @@ the plugin has, as of now, two problems:
> Seems about ready for me to think about pulling it into ikiwiki
> alongside [[tips/vim_syntax_highlighting/ikiwiki.vim]]. If you'll
> please slap a license on it. :) --[[Joey]]
->
+>
+>> GPL version 2 or later (if that doesn't cause any problems here). I'll add it
+>> to the file --[[jerojasro]]
+>>
+>>> I see you've put the plugin on vim.org. Do you think it makes sense to
+>>> also include a copy in ikiwiki? --[[Joey]]
+>>>
+>>>> mmm, no. There would be two copies of it, and the git repo. I'd rather have
+>>>> a single place for the "official" version (vim.org), and another for the dev
+>>>> version (its git repo).
+>>>>
+>>>> actually, I would also suggest uploading the [[`ikiwiki.vim`|tips/vim_syntax_highlighting]] file to vim.org --[[jerojasro]]
+>>>>>
+>>>>> If you have any interest in maintaining the syntax highlighting
+>>>>> plugin and putting it there, I'd be fine with that. I think it needs
+>>>>> some slight work to catch up with changes to ikiwiki's directives
+>>>>> (!-prefixed now), and wikilinks (able to have spaces now). --[[Joey]]
+>>>>>
+>>>>>> I don't really know too much about syntax definitions in vim, but I'll give it a stab. I know it fails when there are two \[[my text|link]] wikilinks on the same page.
+>>>>>> I'm not promising anything, though ;) --[[jerojasro]]
+>
> Also, I have a possible other approach for finding ikiwiki's root. One
> could consider that any subdirectory of an ikiwiki wiki is itself
> a standalone wiki, though probably one missing a toplevel index page.
@@ -80,6 +104,10 @@ the plugin has, as of now, two problems:
>
> And if that's the case, you can resolve an absolute link by looking for
> the page closest to the root that matches the link.
+>
+>> I like your idea; it doesn't alter the matching of the relative links, and
+>> should work fine with absolute links too. I'll implement it, though I see
+>> some potential (but small) issues with it --[[jerojasro]]
>
> It may even make sense to change ikiwiki's own handling of "absolute"
> links to work that way. But even without changing ikiwiki, I think it
@@ -93,3 +121,7 @@ the plugin has, as of now, two problems:
> and vim would go to that file.
>
> --[[Joey]]
+>
+>> Your approach will add more noise when the plugin grows the page-creation
+>> feature, since there will be no real root to limit the possible locations for
+>> the new page. But it is far better than requiring a `.ikiwiki` dir --[[jerojasro]]
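
To make the root-finding idea above concrete, here is a small shell sketch of "resolve an absolute link by taking the match closest to the filesystem root"; the link name, start directory, and `.mdwn` extension are illustrative only:

    # walk upward from the current page's directory, remembering the
    # last (i.e. closest to /) place where the linked page exists
    link=foo/bar
    dir=$PWD
    best=
    while :; do
        [ -e "$dir/$link.mdwn" ] && best="$dir/$link.mdwn"
        [ "$dir" = "/" ] && break
        dir=$(dirname "$dir")
    done
    echo "${best:-no match found}"
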
diff --git a/doc/forum/speeding_up_ikiwiki.mdwn b/doc/forum/speeding_up_ikiwiki.mdwn
index 2c2ac240e..799186cf8 100644
--- a/doc/forum/speeding_up_ikiwiki.mdwn
+++ b/doc/forum/speeding_up_ikiwiki.mdwn
@@ -56,7 +56,7 @@ number is still too large to really visualize: the graphviz PNG and PDF output
engines segfault for me, the PS one works but I can't get any PS software to
render it without exploding.
-Now, the relations in the links hash are not the same thing as IkiWiki's notion of dependencies. Can anyone point me at that data structure / where I might be able to add some debugging foo to generate a graph of it?
+Now, the relations in the links hash are not the same thing as Ikiwiki's notion of dependencies. Can anyone point me at that data structure / where I might be able to add some debugging foo to generate a graph of it?
Once I've figured that out, I might be able to optimize some pagespecs. I
understand pagespecs are essentially translated into sequential perl code. I
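
If it helps anyone locate that data structure: ikiwiki keeps its saved state (including per-page links and dependency info) in an index file under `.ikiwiki/`. Assuming it is the Storable-format `indexdb` (an assumption about this version, not stated above), it can be dumped from the shell for inspection:

    # sketch: pretty-print ikiwiki's saved state for browsing
    perl -MStorable=retrieve -MData::Dumper \
        -e 'print Dumper(retrieve(".ikiwiki/indexdb"))' | less
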
diff --git a/doc/forum/utf8_warnings_for___34____92__xAB__34__.mdwn b/doc/forum/utf8_warnings_for___34____92__xAB__34__.mdwn
new file mode 100644
index 000000000..72f2d38e0
--- /dev/null
+++ b/doc/forum/utf8_warnings_for___34____92__xAB__34__.mdwn
@@ -0,0 +1,47 @@
+# Getting warnings about UTF-8 chars
+
+I'm getting multiple warnings:
+
+ utf8 "\xAB" does not map to Unicode at /usr/share/perl5/IkiWiki.pm line 774, <$in> chunk 1.
+
+
+I'm assuming this is once per file, but even in verbose mode it doesn't tell me which file is the problem.
+It first reads all the files and only outputs the warning afterwards, when parsing/compiling them, so I can't
+deduce the offending files.
+
+Is there a way to have ikiwiki output the position where it encounters the character?
+
+Probably all this has to do with locale settings, and the use of mixed locales in a distributed setup ...
+I'd rather clean some of the unexpected characters out of the file(name)s. --[[jwalzer]]
+
+--------
+
+**Update**: So I took the chance to insert a debug call into IkiWiki.pm:
+
+ root@novalis:/usr/share/perl5# diff -p /tmp/IkiWiki.orig.pm IkiWiki.pm
+ *** /tmp/IkiWiki.orig.pm Sun Feb 14 15:16:08 2010
+ --- IkiWiki.pm Sun Feb 14 15:16:28 2010
+ *************** sub readfile ($;$$) {
+ *** 768,773 ****
+ --- 768,774 ----
+ }
+
+ local $/=undef;
+ + debug("opening File: $file:");
+ open (my $in, "<", $file) || error("failed to read $file: $!");
+ binmode($in) if ($binary);
+ return \*$in if $wantfd;
+
+
+But what I see now is not quite helpful: it seems STDERR and DEBUG are asynchronous, so they mix up in a way that I can't really see what the problem is ... Maybe I'm better off, for troubleshooting, inserting a printf to STDERR so it ends up in the same stream. --[[jwalzer]]
+
+
+----
+
+**Update:** The "print STDERR $file;"-Trick did it .. I was able to find a mdwn-file, that (was generated by a script of me) had \0xAB in it.
+
+Nevertheless I still wonder if this should be a problem. The character happened to be in a *\[\[meta title='$CHAR'\]\] tag* and a *\[$CHAR\](http://foo) link*.
+
+Should this throw a warning? Maybe the warning could be caught and reported along with the containing filename? Maybe even with an override, if one knows that it is correct that way? --[[jwalzer]]
+
+[[!tag solved]]
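
A quicker way to locate the offending files than patching IkiWiki.pm might be a recursive grep over the srcdir for the literal byte; this assumes GNU grep with `-P`, and the srcdir path is a placeholder:

    # list source files containing a raw 0xAB byte (a latin-1 «)
    grep -rlP '\xAB' ~/wiki-srcdir
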
diff --git a/doc/forum/where_are_the_tags.mdwn b/doc/forum/where_are_the_tags.mdwn
new file mode 100644
index 000000000..ecb49fe43
--- /dev/null
+++ b/doc/forum/where_are_the_tags.mdwn
@@ -0,0 +1,9 @@
+Where is the tag cloud/tag listing of all the tags used in this wiki? I know we
+have tags enabled. --[[jerojasro]]
+
+> This wiki does not use one global toplevel set of tags (`tagbase` is not
+> set).
+>
+> There are tags used for the [[plugins]], and a tag cloud of those
+> there. [[wishlist]] and [[patch]] are tags too, but I don't see the point
+> of a tag cloud for such tags. --[[Joey]]
diff --git a/doc/forum/wishlist-discussion:_Editformular_showing_existing_tags.mdwn b/doc/forum/wishlist-discussion:_Editformular_showing_existing_tags.mdwn
new file mode 100644
index 000000000..49c55e20e
--- /dev/null
+++ b/doc/forum/wishlist-discussion:_Editformular_showing_existing_tags.mdwn
@@ -0,0 +1,15 @@
+# How about:
+
+having a list of all existing tags in the edit form as a selection box?
+
+Assume I have tagbase=/tags/ and, for every tag I have given to articles, an existing page there.
+
+Would it be possible to list all these tags together with the form, as a selection box?
+Maybe even parse the content and preselect the tags that are given in the article, and vice versa: when selecting the fields, also generate the \[\[\!tag\]\] source code?
+
+This would need a bit of JS work, and at compile time we would somehow need to put the list of tags somewhere the CGI could read them from (see the sketch below).
+This way, even a pagespec would suffice to determine the usable list of tags, not only the tagbase variable.
+
+> I think this would be very hard to achieve with the current tag plugin, due to the nature of its implementation.
+>
+> I've had a "tag2" plugin on the go for a while which supports this. It's in a very rough stage but I'll try to find it and upload it somewhere. -- [[Jon]]