author     Joey Hess <joey@kodama.kitenet.net>  2008-11-12 17:19:41 -0500
committer  Joey Hess <joey@kodama.kitenet.net>  2008-11-12 17:30:54 -0500
commit     716560b7f15b6e15b246c39c11eb8181d91c8662 (patch)
tree       5b7a30dd2f18e02b02f064d0a1ab59fe891b6a71 /debian
parent     2c858c9c95e287ebe3740a94f983f6ae9d6fb080 (diff)
check for invalid utf-8, and toss it back to avoid crashes
Since ikiwiki uses open :utf8, perl assumes that files contain valid utf-8. If it turns out to be malformed it may later crash while processing strings read from them, with 'Malformed UTF-8 character (fatal)'.

As at least a quick fix, use utf8::valid as soon as data is read, and if it's not valid, call encode_utf8 on the string, thus clearing the utf-8 flag. This may cause follow-on encoding problems, but will avoid this crash, and the input file was broken anyway, so GIGO is a reasonable response. (I looked at calling decode_utf8 after, but it seemed to cause more trouble than it was worth. BTW, use open ':encoding(utf8)' avoids this problem, but the corrupted data later causes Storable to crash when writing the index.)

This is a quick fix, clearly imperfect:
- It might be better to explicitly call decode_utf8 when reading files, rather than using the IO layer.
- Data read other than by readfile() can still sneak in bad utf-8. While ikiwiki does very little file input not using it, stdin for the CGI would be one way.
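
A minimal sketch of the guard described above, assuming a standalone helper rather than ikiwiki's actual readfile() (the function name and the surrounding slurp logic are illustrative, not the patch itself):

    use Encode qw(encode_utf8);

    # After reading through the :utf8 layer, check whether perl considers
    # the string well-formed; if not, re-encode it to bytes, which clears
    # the utf-8 flag and avoids the later
    # 'Malformed UTF-8 character (fatal)' crash.
    sub read_utf8_or_bytes {    # hypothetical helper for illustration
        my $file = shift;
        open(my $in, "<:utf8", $file) or die "failed to read $file: $!";
        local $/ = undef;       # slurp the whole file
        my $ret = <$in>;
        close $in;
        if (! utf8::valid($ret)) {
            # Malformed input: GIGO, hand back raw bytes instead of crashing
            $ret = encode_utf8($ret);
        }
        return $ret;
    }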
Diffstat (limited to 'debian')
-rw-r--r--  debian/changelog | 6
1 file changed, 6 insertions, 0 deletions
diff --git a/debian/changelog b/debian/changelog
index 99f35482e..3838a3e90 100644
--- a/debian/changelog
+++ b/debian/changelog
@@ -1,3 +1,9 @@
+ikiwiki (2.70) UNRELEASED; urgency=low
+
+ * Avoid crash on malformed utf-8 discovered by intrigeri.
+
+ -- Joey Hess <joeyh@debian.org> Wed, 12 Nov 2008 17:30:33 -0500
+
ikiwiki (2.69) unstable; urgency=low
* Avoid multiple ikiwiki cgi processes piling up, eating all memory,