path: root/IkiWiki/Plugin
Age  Commit message  Author
2009-05-23  remove commas in tohighlight list  (Joey Hess)
2009-05-22  highlight: New plugin supporting syntax highlighting of pretty much anything.  (Joey Hess)
* debian/control: Add suggests for libhighlight-perl, although that package is not yet created by Debian's highlight source package. (See #529869)
2009-05-22  listdirectives: Avoid listing _comment directives and generally assume any directive starting with _ is likewise internal.  (Joey Hess)
2009-05-21  support longname for page types in commands and rename  (Joey Hess)
Also, sort the list of page types.
2009-05-16  tidy up new page_types code  (Jon Dowland)
2009-05-16  add a long name for textile/txtl ("Textile")  (Jon Dowland)
2009-05-16  add a long name for mdwn ("Markdown")  (Jon Dowland)
2009-05-16  check for longname for each syntax plugin  (Jon Dowland)
We build an array of [ plugin name, long name ] pairs, where long name is an optional argument to hook(). So, a syntax plugin could define a long "friendly" name, such as "Markdown" instead of mdwn, and we would then pass this array to formbuilder to populate the drop-down on the edit page.
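A minimal sketch of that idea, assuming the htmlize hooks are reachable through a %hooks hash and that longname is stored alongside the hook's other fields (names here are illustrative, not the exact ikiwiki internals):

    # Collect [plugin name, long name] pairs from the registered htmlize
    # hooks, falling back to the plugin name when no longname was given.
    my @page_types;
    foreach my $id (keys %{$hooks{htmlize}}) {
        push @page_types, [$id, $hooks{htmlize}{$id}{longname} || $id];
    }
    # @page_types is then handed to formbuilder to build the drop-down.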
2009-05-06  remove pagespec_match_list override for external  (Joey Hess)
Not needed since it returns a list of pages, not a fail/success object.
2009-05-06  external: Fix pagespec_match and pagespec_match_list. Closes: #527281  (Joey Hess)
2009-05-06  Avoid %links accumulating duplicates. (For TOVA)  (Joey Hess)
This is sorta an optimisation, and sorta a bug fix. In one test case I have available, it can speed a page build up from 3 minutes to 3 seconds.

The root of the problem is that $links{$page} contains arrays of links, rather than hashes of links. When a link is found, it is just pushed onto the array, without checking for dups. Now, the array is emptied before scanning a page, so there should not be a lot of opportunity for lots of duplicate links to pile up in it. But in some cases they can, and if there are hundreds of duplicate links in the array, then scanning it for matching links, as match_link and some other code does, becomes much more expensive than it needs to be.

Perhaps the real right fix would be to change the data structure to a hash. But the list of links is never accessed like that; you always want to iterate through it. I also looked at deduping the list in saveindex, but that does a lot of unnecessary work, and doesn't completely solve the problem. So, finally, I decided to add an add_link function that handles deduping, and make ikiwiki-transition remove the old dup links.
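A hedged sketch of such an add_link helper, assuming %links maps a page name to an array reference of link targets as described above:

    sub add_link ($$) {
        my ($page, $link) = @_;
        # Only record the link if it is not already present for this page,
        # so %links never accumulates duplicates.
        push @{$links{$page}}, $link
            unless grep { $_ eq $link } @{$links{$page}};
    }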
2009-05-05  inline: Minor optimisation.  (Joey Hess)
When finding the pageurl, it was calling bestlink unnecessarily. Since at this point $page contains the full name of the page that is being inlined, there is no need to do bestlink's scan for it. This is only a minor optimisation, since bestlink is only called once per displayed, inlined page.
2009-04-23  simplify  (Joey Hess)
2009-04-23  Revert "pagespec_match_list * optimisation"  (Joey Hess)
This reverts commit 2f96c49bd1826ecb213ae025ad456a714aa04863. I forgot about internal pages. We don't want * matching them! I left the optimisation in pagecount, where it used to live. Internal pages probably don't matter when they're just being counted.
2009-04-23  avoid using pagespec_match_list here  (Joey Hess)
I forgot to check if it was called from preprocess, and it is not; it's called by a format hook. If an error is thrown from a format hook, the wiki build fails, so we don't want that.
2009-04-23  simplify  (Joey Hess)
2009-04-23  pagespec_match_list * optimisation  (Joey Hess)
Add an optimisation for the semi-common case of a "*" pagespec. Can avoid doing any real processing in this case.
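A hedged sketch of the shortcut (variable names are illustrative; as the revert above notes, this breaks down because "*" must not match internal pages):

    # A bare "*" pagespec matches every candidate page, so skip compiling
    # and evaluating the pagespec entirely.
    if ($spec eq '*') {
        return @candidates;
    }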
2009-04-23  formatting  (Joey Hess)
2009-04-23  typo  (Joey Hess)
2009-04-23  pagespec_match_list added and used in most appropriate places  (Joey Hess)
* pagespec_match_list: New API function, matches pages in a list and throws an error if the pagespec is bad.
* inline, brokenlinks, calendar, linkmap, map, orphans, pagecount, pagestate, postsparkline: Display a handy error message if the pagespec is erroneous.
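A hedged usage sketch from a plugin's point of view; the argument layout is an assumption based on the description above, not a verified copy of the 2009 signature:

    # Match candidate pages against the user-supplied pagespec. A bad
    # pagespec makes pagespec_match_list throw an error, which surfaces
    # as a directive error message instead of a silently empty result.
    my @matched = pagespec_match_list(\@candidates, $params{pages},
        location => $params{page});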
2009-04-23  comments: Add link to comment post form to allow user to sign in if they wish to, if the configuration makes signin optional for commenting.  (Joey Hess)
2009-04-23  pagespec error/failure distinction and error display by inline  (Joey Hess)
* Add IkiWiki::ErrorReason objects, and modify pagespecs to return them in cases where they fail to match due to a configuration or syntax error.
* inline: Display a handy error message if the inline cannot display any pages due to such an error.
This is perhaps somewhat incomplete, as other users of pagespecs do not display the error and will eventually need modifications similar to inline's. I should probably factor out a pagespec_match_all function and make it throw ErrorReasons.
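A hedged sketch of how such an error class can sit on top of the existing failure objects (the package layout is assumed from the description; the real definitions live in IkiWiki.pm):

    # A FailReason is an ordinary "did not match"; an ErrorReason means the
    # pagespec itself is broken (bad syntax or configuration), so callers
    # like inline can report it instead of silently showing nothing.
    package IkiWiki::ErrorReason;
    our @ISA = 'IkiWiki::FailReason';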
2009-04-22  fix id  (Joey Hess)
2009-04-22  websetup: If setup fails, restore old setup file.  (Joey Hess)
2009-04-22  blogspam: Load RPC::XML library in checkconfig, so that an error can be printed at that point if it's not available, allowing the admin to see it during wiki setup. Closes: #520015  (Joey Hess)
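A hedged sketch of the checkconfig hook described above (module names follow the commit message; error() is ikiwiki's usual way of aborting with a message):

    sub checkconfig () {
        # Loading RPC::XML here means a missing library is reported while
        # the admin is setting up the wiki, not when a comment is posted.
        eval q{
            use RPC::XML;
            use RPC::XML::Client;
        };
        error $@ if $@;
    }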
2009-04-22  websetup: Display stderr in browser if ikiwiki setup fails.  (Joey Hess)
2009-04-04  remove unnecessary variable  (Joey Hess)
2009-04-04  remove debugging  (Joey Hess)
2009-04-04  fix display of web commits in recentchanges  (Joey Hess)
The darcs backend appends @web to the names of web committers, so remove it when extracting.
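A hedged one-line sketch of that extraction fix ($author is an illustrative variable name):

    # Strip the "@web" suffix the darcs backend appends to web committers.
    $author =~ s/\@web$//;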
2009-04-04  fix bug I introduced  (Joey Hess)
2009-04-04  move comments to copyright and changelog  (Joey Hess)
2009-04-04  formatting, layout, indentation, coding style  (Joey Hess)
2009-04-04  Merge branch 'master'  (Joey Hess)
Conflicts: doc/ikiwiki-makerepo.mdwn
2009-04-01  recentchanges: change to using do=goto links.  (Joey Hess)
2009-03-27  use md5sum for page_to_id  (Joey Hess)
The munged ids were looking pretty nasty, and were not completely guaranteed to be unique. So an md5sum seems like a better approach. (Would have used sha1, but md5 is in perl core.)
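A hedged sketch of the approach, using Digest::MD5 from perl core (the function name and the "comment-" prefix follow the description above but are assumptions):

    use Digest::MD5 qw(md5_hex);
    use Encode;

    sub page_to_id ($) {
        my $page = shift;
        # md5_hex wants bytes, so encode the page name first; the hex
        # digest is stable, effectively unique, and a legal xhtml id.
        return "comment-".md5_hex(Encode::encode_utf8($page));
    }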
2009-03-26  comments: Fix anchor ids to be legal xhtml. Closes: #521339  (Joey Hess)
Well, that was a PITA. Luckily, this doesn't break guids to comments in rss feeds, though it does change the links. I haven't put in a warning about needing to rebuild to get this fix. It's probably good enough for new comments to get the fix, without a lot of mass rebuilding.
2009-03-26  comments: Fix too loose test for comments pages that matched normal pages with "comment_" in their name. Closes: #521322  (Joey Hess)
2009-03-20  fix rcs_getctime to return first, not last, change time  (Joey Hess)
This was buggy, returning the file's last change time, not its creation time. (I checked all the others, except tla, and they're ok.)
2009-03-20  fix rcs_getctime to return first, not last, change time  (Joey Hess)
This was buggy, returning the file's last change time, not its creation time.
2009-03-19  inline: Fix urls to feed when feedfile is used on an index page.  (Joey Hess)
It would be better to use urlto() here, but will_render has not yet been called on the feed files at this point, so it won't work. (And reorganizing so it can be is tricky.)
2009-03-19  avoid crashing if Sort::Naturally is not installed  (Joey Hess)
2009-03-19  implement sort=title_natural for inline  (chrysn)
Adds a new sorting order, title_natural, that uses Sort::Naturally's ncmp function to provide better sorting for inlines.
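A hedged sketch of the comparison (the list being sorted and any title munging are omitted):

    use Sort::Naturally;

    # ncmp compares digit runs numerically, so "page9" sorts before "page10".
    @pages = sort { Sort::Naturally::ncmp($a, $b) } @pages;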
2009-03-09  git: Manually decode git output from utf-8, avoids warning messages on invalidly encoded output.  (Joey Hess)
2009-03-09  git: Fix utf-8 encoding of author names.  (Joey Hess)
I guess what's happening here is that since the name is passed to git via an environment variable, perl's normal utf-8 IO layer stuff doesn't work. So we have to explicitly convert the string from perl's internal representation into utf-8.
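A hedged sketch of that conversion, assuming the author name arrives as a decoded perl string and that git picks the author up from GIT_AUTHOR_NAME in its environment:

    use Encode;

    # %ENV bypasses perl's IO layers, so turn the internal string into
    # utf-8 bytes explicitly before handing it to git.
    $ENV{GIT_AUTHOR_NAME} = Encode::encode_utf8($author);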
2009-03-09  avoid uninitialized value warnings  (Joey Hess)
2009-03-08  When loading a template in scan mode, let preprocess know it only needs to scan.  (Joey Hess)
This makes wikis such as zack's much faster in the scan pass. In that pass, when a template contains an inline, there is no reason to process the entire inline and all its pages. I'd forgotten to pass along the flag to let preprocess() know it was in scan mode, leading to much unnecessary churning.
2009-03-08  avoid potential infinite loop in smiley expansion  (Joey Hess)
- In 3.05, ikiwiki began expanding templates in scan mode, for annoying, expensive, but ultimately necessary reasons of correctness.
- Smiley processing has a bug: It inserts a span for the smiley, and then continues searching forward in the content for more, starting at $end_of_smiley+1. Which means it searches for smilies in the span too! And if it somehow finds one, we get an infinite loop here.
- This bug can, probably, only be tickled if a htmllink to show the smiley fails, because the smiley file doesn't exist, or because ikiwiki doesn't know about it. In that case, a link will be inserted to _create_ the missing page, and that link will include the smiley inside the <a></a>.
- When a template is expanded in scan mode, and it contains an inline, the sanitize hook is run during scan mode, which never happened before. That causes the smiley processor to run, before ikiwiki is, necessarily, aware that all the smiley files exist (depending on scan order). So it inserts creation links for them, and triggers the bug.

I've put in the simple fix of jumping forward past the inserted span, and it does fix the problem. I will need to look in a bit more detail into why an inline nested inside a template is fully expanded during the scan pass -- that really shouldn't be necessary, and it makes things much slower than they need to be.
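A hedged sketch of the loop fix (variable names are illustrative): after splicing in the replacement html, move the match position past it so the scan never re-enters its own output.

    # Replace the smiley text with the generated html, then resume the
    # /g search after the inserted html rather than inside it.
    substr($content, $start, $len) = $html;
    pos($content) = $start + length($html);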
2009-03-08  configure wmd to leave text in markdown  (Joey Hess)
2009-03-07  look for wmd/wmd.js  (Joey Hess)
This means that the underlay needs to have a wmd/wmd/wmd.js, which is a trifle weird, but it isolates all the wmd stuff in a single wmd subdirectory of the built wiki. Having wmd/images create a toplevel images directory was particularly bad.
2009-03-07  make wmd comment support comment editing (I think)  (Joey Hess)