plovs reported a crash when templates were not installed properly,
with a non-useful error about the template object not being defined.
I've audited all uses of template_depends() and template(), and it makes
sense for them to throw an error if the template cannot be found. All code
with a user-supplied template catches errors already, to handle template
parse failures.
It did not make sense for template_file to throw errors, as some code uses
it to probe if a template file is available.
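
A minimal sketch of the calling conventions this describes, assuming the usual
IkiWiki plugin context (template(), template_file() and error() available);
the error-handling code here is illustrative, not the actual ikiwiki source:

    # template() and template_depends() now die if the template cannot be
    # found, so code handling a user-supplied template wraps the call in
    # eval, as it already does for template parse failures:
    my $user_supplied_name = "page.tmpl";   # e.g. from a directive parameter
    my $tmpl = eval { template($user_supplied_name) };
    if ($@ || ! defined $tmpl) {
        error("failed to process template: $@");
    }

    # template_file() stays non-fatal, so it can still be used to probe
    # whether a template file is available at all:
    if (defined template_file("comment.tmpl")) {
        # the template exists; safe to go ahead and use it
    }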
|
|
That template is user-controlled.
|
|
The HTML::Tree changelog says:

    [THINGS THAT MAY BREAK YOUR CODE OR TESTS]
    ...
    * Attribute names are now validated in as_XML and invalid names will
      cause an error.

and indeed the regression tests do get an error.
|
|
With a relative xrds-location, the openid perl client module will fail.
I haven't checked the specs to see if it needs to be absolute, but all
examples I've seen are absolute, so making it absolute seems a very good idea.
|
|
pasted on a page before being cut.
|
|
errors from conflicting obsolete remote branches.
|
|
that contained only a number, fixing a longstanding crash of the rst plugin.
|
|
I also tried setting RPC::XML::ENCODING, but that did not prevent the crash,
and it seems that blogspam.net doesn't like getting xml encoded in unicode,
since it then mis-flagged comments as spam that are normally allowed
through.
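
A rough sketch of the approach this implies: hand RPC::XML data that is
already UTF-8 encoded bytes instead of relying on $RPC::XML::ENCODING. The
server URL, method name and parameter layout below are assumptions for
illustration, not taken from the plugin:

    use Encode;
    use RPC::XML::Client;

    # Encode the comment text to UTF-8 bytes before it reaches RPC::XML,
    # rather than setting $RPC::XML::ENCODING.
    my $comment = "ein Kommentar mit Umlauten";
    my $client  = RPC::XML::Client->new('http://xmlrpc.server.example/');
    my $result  = $client->send_request('testComment',
        { comment => Encode::encode_utf8($comment) });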
|
|
second parameter, to allow for plugins that need access to this information earlier than the delete hook.
|
|
This could happen if checkconfig was run twice, I think.
|
|
be configured via the web.
|
|
The only unsafe thing should be that enabling it with some languages will
generate po files.
|
|
If I am not mistaken, all source files in ikiwiki are encoded in UTF-8.
Adding `\usepackage[utf8]{inputenc}` enables LaTeX to deal with that encoding.
As a consequence, some special characters like umlauts can be used in the
source code, which is useful for foreign languages. For example:

    [[!teximg code="a = b \text{ für alle } b \neq 2"]]

But some characters, for example »≠«, still cannot be used; one has to use
TeX engines with native UTF-8 support, such as XeTeX or LuaTeX, or additional
nonstandard packages like uniinput [1].
I used the package `inputenc` (`texdoc inputenc`) and not `inputenx` (`texdoc
inputenx`), because I have not used `inputenx` much, and its `math` option is
not supported in Debian (and, I guess, in other distributions too), since
`inpmath` is not included in CTAN.
[1] http://wiki.neo-layout.org/browser/latex/Standard-LaTeX
Signed-off-by: Paul Menzel <paulepanter@users.sourceforge.net>
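
As an illustration, a minimal standalone document showing the effect of the
package (the surrounding preamble is a sketch, not the teximg plugin's actual
template):

    \documentclass{article}
    \usepackage[utf8]{inputenc} % lets LaTeX read UTF-8 encoded source directly
    \usepackage{amsmath}        % provides \text for the example below
    \begin{document}
    $a = b \text{ für alle } b \neq 2$
    \end{document}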
|
|
array of things that need to be built. (Backwards compatibility code keeps plugins using the old interface working.)
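
A sketch of what a plugin hook following this convention might look like,
assuming the hook in question is needsbuild (the truncated summary above does
not say); the plugin id and the filtering logic are made up for illustration:

    # Assumed interface: the hook receives a reference to the array of
    # files that need to be built and returns it, possibly modified.
    hook(type => "needsbuild", id => "exampleplugin", call => sub {
        my $needsbuild = shift;
        @$needsbuild = grep { $_ ne "sandbox.mdwn" } @$needsbuild;
        return $needsbuild;   # new interface: return the array
    });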
|
|
These return codes are not currently used, but might be later.
|
|
a login session.
|
|
Following along with the change in Render.pm
|
|
allow for nonstandard installations.
|
|
A missing smileys.mdwn caused the plugin to error out, interrupting the
build process. Instead, we now check whether the file is present and warn
without erroring out when it is missing, in a similar fashion to what is
currently done for the shortcut plugin.
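
A minimal sketch of that behaviour; the hook type, the nonfatal flag to
srcfile() and the warning text are assumptions, not the plugin's actual code:

    # Hook type and wording are assumed for illustration only.
    hook(type => "checkconfig", id => "smiley", call => sub {
        # Locate the smileys page without dying if it is absent.
        my $srcfile = srcfile("smileys.mdwn", 1);   # assumed: 1 = do not die
        if (! defined $srcfile) {
            print STDERR "smiley plugin will not work without smileys.mdwn\n";
            return;
        }
        # build the smiley map from $srcfile as before
    });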
|
|
This reverts commit 3ef8864122c2e665d41ed4d45baa50d4a5d21873.
Most aggregators block javascript, so it would display poorly.
Need to find a way to fall back to static buttons instead.
|
|
This makes the javascript get added to rss feeds, which allows the buttons
to be displayed by aggregators, at least if the aggregator does not
sanitize javascript.
|
|
Thanks to jaywalk for the initial implementation of a flattr plugin!
This one is less configurable, but simpler.
|
|
cannot identify a file.
|
|
This reverts commit 25447bccae0439ea56da7a788482a4807c7c459d.
|
|
simplify dependencies. Closes: #591040
|
|
The po rescan hook re-runs the scan hooks, and runs the preprocess ones in scan
mode, both on the po-to-markup converted content. This way, plugins such as meta
are given a chance to gather correct information, rather than the ugly/buggy
escaped data they would otherwise gather from the unconverted PO files.
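
A rough sketch of the shape of such a hook, assuming it receives the page name
and content as named parameters; the helpers and the exact interface are
assumptions here, not the actual po plugin code:

    hook(type => "rescan", id => "po", call => sub {
        my %params = @_;
        my $page = $params{page};
        return unless istranslation($page);                   # assumed helper
        my $content = po_to_markup($page, $params{content});  # assumed helper
        # Re-run the scan hooks, and the preprocess hooks in scan mode,
        # on the converted content instead of on the raw PO file.
        run_hooks(scan => sub {
            shift->(page => $page, content => $content);
        });
        IkiWiki::preprocess($page, $page, $content, 1);       # scan mode
    });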
|
|
This is needed for the po plugin vs. e.g. meta titles.
In order to get rid of the ugly "rebuilding all pages to fix meta titles" thing,
Joey suggested to make "po, at scan time, re-run the scan hooks, passing them
modified content (either converted from po to mdwn or with the escaped stuff
cheaply de-escaped)". This would unfortunately not work, as the meta plugin
gathers its data using the preprocess hook in scan mode: it would overwrite
the correct data we had forced it to gather in po's scan hook with buggy data.
We therefore need a hook that runs *after* the preprocess hook has been run in
scan mode, but *before* any page rendering is started. Hence this one.
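
A rough sketch of the ordering this implies in the scanning code; the function
name and the hook parameters are assumptions, and only the relative order of
the three steps matters:

    # Assumed shape of the per-page scanning step:
    sub scan_page {
        my ($page, $content) = @_;
        # 1. the ordinary scan hooks
        run_hooks(scan => sub { shift->(page => $page, content => $content) });
        # 2. the preprocess hooks, in scan mode (meta gathers its data here)
        IkiWiki::preprocess($page, $page, $content, 1);
        # 3. the new hook: after scanning, but before any rendering starts
        run_hooks(rescan => sub { shift->(page => $page, content => $content) });
    }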