by Michael S. Kaplan, published on 2005/10/20 03:31 -04:00, original URI: http://blogs.msdn.com/b/michkap/archive/2005/10/20/482926.aspx
The LCMapString function looks like it ought to have a clear and unambiguous meaning for its parameters. Look at the prototype:
int LCMapString(
  LCID Locale,        // locale identifier
  DWORD dwMapFlags,   // mapping transformation type
  LPCTSTR lpSrcStr,   // source string
  int cchSrc,         // number of characters in source string
  LPTSTR lpDestStr,   // destination buffer
  int cchDest         // size of destination buffer
);
And for most of LCMapString's conversion types, that reading is indeed correct.
Except when dealing with sort keys.
When you deal with sort keys (the LCMAP_SORTKEY flag), two things change: lpDestStr receives an opaque array of bytes (the sort key) rather than a character string, and cchDest and the return value become counts of bytes, not characters.
Now obviously those kinds of cch/cb confusion situations can cause problems from a security standpoint (see for example the Platform SDK topic Security Considerations: International Features).
I guess if we were writing the function today, it might have made sense to split sort key creation out into its own function.
Though in truth the nature of the confusion here (if there is any) is that buffers will be twice as large as they need to be, which may be a concern for memory consumption/fragmentation or performance. But it is not a security concern like it can be for some of the other NLS APIs.
I guess the key here is to read the documentation carefully. :-)
Internally at Microsoft it can cause (and at times has caused!) confusion as well, since the internal routines that handle sort keys naturally name the size parameter cbDest rather than cchDest. So if you are just looking at the code and see someone pass a parameter named cchDest to another function that asks for a cbDest, you will rightfully wonder whether there is a huge bug there. But that is our problem to deal with....
This post brought to you by "ޛ" (U+079b, a.k.a. THAANA LETTER THAALU)