2003-07-01 17:04:39

by Jan Hudec

Subject: [bug?] How to stuff 21 bits in __u16

Hello All,

I have a question about a definition in nls.h (in both 2.4.21 and 2.5.56):

The utf-8 decoding code seems to handle all characters up to
0x7fff_ffff, but it then stores them in wchar_t, which is defined as
__u16. To me this looks like a bug, and moreover one that should be
trivial to fix with something like:

--- linux-2.4.21/include/linux/nls.h.orig 2003-06-30 10:12:37.000000000 +0200
+++ linux-2.4.21/include/linux/nls.h 2003-07-01 19:07:17.000000000 +0200
@@ -4,7 +4,7 @@
 #include <linux/init.h>
 
 /* unicode character */
-typedef __u16 wchar_t;
+typedef __u32 wchar_t;
 
 struct nls_table {
 	char *charset;

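To illustrate what goes wrong, here is a minimal standalone sketch (not
kernel code; the type names and the hard-coded decode are mine, purely
for illustration) that decodes one 4-byte UTF-8 sequence and shows the
value being silently truncated when stored in a 16-bit type:

/* Standalone illustration: a code point above U+FFFF does not fit
 * in a 16-bit wchar_t and is silently truncated. */
#include <stdio.h>

typedef unsigned short u16;	/* stands in for the current __u16 wchar_t */
typedef unsigned int   u32;	/* what the patch would change it to */

int main(void)
{
	/* U+1D11E (MUSICAL SYMBOL G CLEF), encoded as 4 bytes of UTF-8 */
	const unsigned char s[] = { 0xF0, 0x9D, 0x84, 0x9E };

	/* decode the 4-byte sequence into the full 21-bit code point */
	u32 full = ((u32)(s[0] & 0x07) << 18) |
	           ((u32)(s[1] & 0x3F) << 12) |
	           ((u32)(s[2] & 0x3F) <<  6) |
	            (u32)(s[3] & 0x3F);

	/* what a __u16 wchar_t would actually keep: the low 16 bits */
	u16 narrow = (u16)full;

	printf("decoded: 0x%05x, stored in 16 bits: 0x%04x\n", full, narrow);
	return 0;
}

This prints 0x1d11e for the decoded value but only 0xd11e after the
store, which is exactly the kind of corruption a __u32 wchar_t would
avoid.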
-------------------------------------------------------------------------------
Jan 'Bulb' Hudec <[email protected]>