Age | Commit message | Author |
|
If a page is taken from the underlay and one of the specified languages
has no po file in the underlay, a broken link to the translated version
of the page was created for that language.
With this change, no broken link is created.
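For illustration, a minimal sketch of the guard this implies, assuming
IkiWiki's srcfile() helper (which can search underlays and return undef
instead of dying) and hypothetical names for the language list and link
helper:

    foreach my $lang (@slavelanguages) {               # hypothetical list
        # Only link to a translation whose po file actually exists
        # in the srcdir or in some underlay.
        next unless defined srcfile("$page.$lang.po", 1);
        add_translation_link($page, $lang);            # hypothetical helper
    }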
|
I think the N/A was not intended to be visible, but it can show up as
the percent translated for a language. This happens if the page is
located in an underlay and is not translated into that language in any
other underlay.
|
This was tricky: $links{"$page/discussion"} must be checked, with the
"discussion" part in lowercase.
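For illustration, a sketch of the check, assuming IkiWiki's convention
that every known source page has an entry in %links:

    # The key must use "discussion" in lowercase, whatever the
    # capitalization of the visible link text.
    if (exists $links{"$page/discussion"}) {
        # the discussion page exists; render a normal link to it
    }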
|
version, but continue. Closes: #541205
|
When first editing a page that was in the underlay, avoid losing
the translation by copying the po file over from the underlay.
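A sketch of such a copy, assuming srcfile() with a true second argument
searches the underlays and returns undef when nothing is found ($lang is
a placeholder here):

    use File::Copy;

    my $file = srcfile("$page.$lang.po", 1);
    if (defined $file && $file !~ /^\Q$config{srcdir}\E\//) {
        # The po file only exists in an underlay; copy it into the
        # srcdir so the first edit does not lose the translation.
        copy($file, "$config{srcdir}/$page.$lang.po")
            or error("failed to copy po file: $!");
    }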
|
(and gettext it as translators will see this!)
|
In order to support translated basewiki and other underlays, we need
support for mo files in underlays.
The code did not allow this before, because if a mo file was in an
underlay, then it might try to update it, and its pot, and write to the
underlay, which is guaranteed to either fail due to permissions, or be
undesirable.
To fix, my approach is to just detect if a mo or pot file that is about to
be updated is in an underlay, and skip updating it. This seems to work
well:
- If the mo is out of date in the underlay, it won't get updated, but
  this would probably be due to a problem in the underlay, or more
  likely, the wiki is being rebuilt and so it *thinks* the mo is out of
  date, but it's really not (and it would be a waste of time to rebuild
  it anyway).
- If a page from the basewiki is edited, it is saved to the srcdir,
  which causes generation of an updated mo and pot also in the srcdir;
  the underlay stops being used for that page, and everything seems
  to work.
Note that I am not including an underlay search directory for pot files.
They *seem* to be unnecessary for the underlay, since the mo files
in there never need to be updated.
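The detection can be as simple as checking whether the file is present
in the srcdir itself; a sketch, with a hypothetical helper name:

    sub updateable {
        my $file = shift;   # path relative to the wiki source
        # Only update a mo or pot file that physically lives in the
        # srcdir; a file found only in an underlay is skipped.
        return -e "$config{srcdir}/$file";
    }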
|
It seems to make sense to remove the check for there being slave
languages as part of this, since one might want a wiki whose only
language is not English.
|
These are for use by wikis where the primary language is not English.
On such a wiki, it makes sense to use an underlay as the source for
pages in the native language.
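One possible shape for this, using IkiWiki's add_underlay() plugin API
(the directory name is made up):

    # e.g. in a checkconfig hook
    add_underlay("nativelang/basewiki");   # hypothetical underlay path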
|
It exports gettext and other symbols by default, which conflicts with
IkiWiki's exports.
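The usual Perl remedy, sketched with a placeholder module name since
the message above does not say which module is involved:

    # An empty import list stops the module from exporting anything,
    # so its gettext cannot clash with the one IkiWiki exports.
    use Some::Module ();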
|
Signed-off-by: intrigeri <intrigeri@boum.org>
|
Signed-off-by: intrigeri <intrigeri@boum.org>
|
... as Joey suggested on todo/need_global_renamepage_hook
This hook is applied recursively to the returned additional rename
hashes, so that it handles the case where two plugins use the hook:
plugin A sees when plugin B adds a new file to be renamed.
The full set of rename hashes can no longer be changed by hook
functions, which are now only allowed to return any additional rename
hashes they want to add.
Rationale: the correct behavior of the recursion would be hard, if not
impossible, to define if already-considered pages were changing along
the way.
Signed-off-by: intrigeri <intrigeri@boum.org>
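A sketch of that recursion, with the hook plumbing simplified (the real
hook arguments are richer than a single hash):

    my @torename = ($initial);   # $initial: the user-requested rename
    my $i = 0;
    while ($i < @torename) {
        # Ask every rename hook for additional renames implied by
        # $torename[$i]; whatever is returned is appended and later
        # examined itself, so plugin A sees what plugin B added.
        run_hooks(rename => sub {
            push @torename, grep { defined } shift->($torename[$i]);
        });
        $i++;
    }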
|
... as my meta branch probably won't be merged before the po plugin is,
contrary to what I originally supposed.
This implies removing the po_translation_status_in_links and
po_strictly_refresh_backlinks options.
Added a note to the TODO section to consider bringing these features
back later, as they really enhance the user experience on a translatable
wiki.
Signed-off-by: intrigeri <intrigeri@boum.org>
|
This is intended to solve Joey's concerns expressed on
http://ikiwiki.info/todo/need_global_renamepage_hook/, i.e. the need to make it
possible to use this hook from external plugins.
A plugin using this hook can still add/modify/remove elements of the
@torename array.
Signed-off-by: intrigeri <intrigeri@boum.org>
|
It can now be configured with the po_strictly_refresh_backlinks setting.
Signed-off-by: intrigeri <intrigeri@boum.org>
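For reference, enabling it would look like this in a setup file:

    # .setup file: turn on the stricter backlink refreshing
    po_strictly_refresh_backlinks => 1,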
|
... that was removed in 68869d664b978b063c9181d024edb34a63306c33
Without this scalar, a two-element array is passed to $template->param,
which builds a hash with an odd number of elements.
Signed-off-by: intrigeri <intrigeri@boum.org>
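A minimal illustration of the problem, using a hypothetical function
that returns a two-element list:

    # Without scalar, the list flattens into param's argument list:
    # ("percent", $a, $b) -- three elements, an odd-sized hash.
    $template->param(percent => two_element_list());

    # Forcing scalar context keeps it a single key => value pair:
    $template->param(percent => scalar two_element_list());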
|
Signed-off-by: intrigeri <intrigeri@boum.org>
|
Signed-off-by: intrigeri <intrigeri@boum.org>
|
After thinking about it, I can't see why the type of a page being
created in the CGI could be restricted to po. So the previous case seems
to be enough.
Signed-off-by: intrigeri <intrigeri@boum.org>
|
use is file-scoped, so warnings and strict are already enabled
inside the second package, and IkiWiki is already loaded
(though not imported into this context).
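An illustration of why the second package needs no use lines of its
own (plugin name made up):

    package IkiWiki::Plugin::example;
    use warnings;
    use strict;
    use IkiWiki 3.00;

    package IkiWiki::PageSpec;
    # No use lines needed here: use takes effect at compile time for
    # the whole file, so strict and warnings still apply, and IkiWiki
    # is loaded (though its symbols were imported into the plugin
    # package above, not into this one).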
|
scalar(undef) is undef, so using scalar here had no effect.
Instead make the function return "", probably avoiding an uninitialized
value warning.
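In other words:

    return scalar undef;   # still returns undef; scalar buys nothing
    return "";             # empty string: callers interpolating the
                           # result no longer warn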
|
if (scalar @array) is written idiomatically in Perl as
if (@array).
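That is:

    if (scalar @array) { ... }   # explicit but redundant
    if (@array)        { ... }   # idiomatic: an array in boolean
                                 # context already yields its length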
|
I'm still not happy with the clarity of this warning message.
I don't understand when it could happen or why a warning is needed.
|
I prefer to use either of the other two syntaxes Perl offers, rather
than this one.
|
Signed-off-by: intrigeri <intrigeri@boum.org>
|
too old po4a
Signed-off-by: intrigeri <intrigeri@boum.org>
|
All meta titles are first extracted at scan time, i.e. before we turn
PO files back into translated markdown; escaping of double-quotes in
PO files breaks the meta plugin's parsing enough to save ugly titles
to %pagestate at this time.
Then, at render time, every page passes in turn through the Great
Rendering Chain (filter->preprocess->linkify->htmlize), and the meta
plugin's preprocess hook is this time in a position to correctly
extract the titles from slave pages.
This is, unfortunately, too late: if page A, linking to page B, is
rendered before B, it will display the wrongly-extracted meta title
as the link text to B.
On the one hand, such a corner case only happens on rebuild: on
refresh, every rendered page is fixed to contain correct meta titles.
On the other hand, it can take some time to get every page fixed.
We therefore re-render every rendered page after a rebuild to fix them
at once. As this more or less doubles the time needed to rebuild the
wiki, we do so only when really needed.
Signed-off-by: intrigeri <intrigeri@boum.org>
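A sketch of the shape of that final pass, assuming access to IkiWiki's
%pagesources map and a render() routine that takes a source file:

    if ($config{rebuild}) {
        # Second pass: every page now picks up correctly extracted
        # meta titles, so links to slave pages get the right text.
        render($pagesources{$_}) foreach keys %pagesources;
    }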