Re: unicode (char as abstract data type)

Stephen Williams (steve@icarus.icarus.com)
Fri, 17 Apr 1998 12:04:01 -0600


cgweav@eskimo.com said:
> The needed breakthrough is to forget the C language signed/unsigned
> char as having something to do with natural language, and see it as
> merely a byte-wide int, and the name of the data type as some archaic
> baggage from the pdp-11.

There is a bit more to it than that.

I find myself writing NT kernel mode drivers, and being compelled to use
UNICODE is more than just irritating. The problem is that the programming
language thinks in terms of char* text. You start using wchar_t, and before
you know it you have a huge mess and just can't seem to get the types
quite right anymore.

There are certainly better UNICODE manipulation libraries than those that
MS provides in the NT kernel, but, well, there is more to it than just
changing one's mind.

-- 
Steve Williams                "The woods are lovely, dark and deep.
steve@icarus.com              But I have promises to keep,
steve@picturel.com            and lines to code before I sleep,
http://www.picturel.com       And lines to code before I sleep."

- To unsubscribe from this list: send the line "unsubscribe linux-kernel" in the body of a message to majordomo@vger.rutgers.edu