I was afraid of that. I went fishing in case some bleeding-edge USB standard
was in the Unicode business, and there was kernel support to be exploited.
-----Original Message-----
From: xxxxx@lists.osr.com
[mailto:xxxxx@lists.osr.com] On Behalf Of Doron Holan
Sent: Monday, August 04, 2003 3:17 AM
To: Windows System Software Developers Interest List
Subject: [ntdev] RE: Fake keyboard driver (w2K and better)
Keyboard drivers report raw scan codes (which aren’t even ASCII values);
they do not report Unicode characters at all. You would basically still
need a keyboard mapper DLL to map to the real VKs you want. To have the
system find your keyboard and open it, install it under the keyboard device
class, and kbdclass will be an upper filter for your driver. Kbdclass takes
care of all the wiring up to the raw input thread.
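The install side is just the INF. A minimal, untested sketch (section
names are hypothetical; the GUID is the standard Keyboard setup class):

[Version]
Signature = "$Windows NT$"
Class     = Keyboard
ClassGUID = {4D36E96B-E325-11CE-BFC1-08002BE10318}
Provider  = %Vendor%

[FakeKbd_Inst.NT.HW]
AddReg = FakeKbd_Filter.AddReg

[FakeKbd_Filter.AddReg]
; kbdclass binds as an upper device filter above the fake port driver
HKR,,"UpperFilters",0x00010000,"kbdclass"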
In the end, I don’t think a driver will get you closer to what you want;
drivers are as complex as an IME, or more so.
d
This posting is provided “AS IS” with no warranties, and confers no rights.
-----Original Message-----
From: xxxxx@lists.osr.com
[mailto:xxxxx@lists.osr.com] On Behalf Of benson
Sent: Sunday, August 03, 2003 4:09 PM
To: Windows System Software Developers Interest List
Subject: [ntdev] Fake keyboard driver (w2K and better)
On W2K and above, I have a possible need for a purely software keyboard.
I’ll explain my lunatic scheme, so that the local experts can redirect me if
I’m on a wild-goose chase.
There’s this Win32 API, SendInput. It is supposed to allow, on W2K and
better, the injection of an arbitrary Unicode character into the input
stream as if it were keyboard input.
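For concreteness, the usage is roughly this (a minimal, untested sketch;
the INPUT structure and the KEYEVENTF_UNICODE flag are from winuser.h, and
with that flag set, wVk stays 0 and wScan carries the character):

#include <windows.h>

/* Inject one arbitrary UTF-16 code unit as if it were typed. */
static void InjectUnicodeChar(WCHAR ch)
{
    INPUT in[2];
    ZeroMemory(in, sizeof(in));
    in[0].type = INPUT_KEYBOARD;
    in[0].ki.wScan = ch;
    in[0].ki.dwFlags = KEYEVENTF_UNICODE;                    /* key down */
    in[1] = in[0];
    in[1].ki.dwFlags = KEYEVENTF_UNICODE | KEYEVENTF_KEYUP;  /* key up */
    SendInput(2, in, sizeof(INPUT));
}

int main(void)
{
    Sleep(3000);                 /* time to focus the target window */
    InjectUnicodeChar(0x20AC);   /* U+20AC EURO SIGN */
    return 0;
}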
We’ve run into two serious limitations/defects, and while we’ve opened a
support issue, we’re not confident that we will get a resolution.
I’ve already thought of one rather expensive alternative: construct a
special-purpose IME that maps, say, four hex digits to a Unicode
character. IMEs are very complex beasts.
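The mapping itself is the trivial part; all the cost is in the IME plumbing
wrapped around something like this hypothetical helper:

#include <stdlib.h>
#include <wchar.h>

/* Turn four typed hex digits, e.g. L"20AC", into the UTF-16 code
 * unit they name. HexQuadToChar(L"20AC") yields L'\x20AC'. */
static wchar_t HexQuadToChar(const wchar_t digits[4])
{
    wchar_t buf[5] = { digits[0], digits[1], digits[2], digits[3], L'\0' };
    return (wchar_t)wcstoul(buf, NULL, 16);
}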
I’m wondering if, paradoxically, it might be less work to try to do this
with a device driver.
Now, in the normal case, keyboard events are mapped to Unicode characters in
ring 3 by some piece of Win32, using the maps in the keyboard layout DLLs.
I’m sure I can’t make a map with 2^64 (more or less) entries, since I’ve
discovered by blue-screen experiment that these DLLs end up in non-paged
memory. That’s odd, since they’re not supposed to be used in the kernel, but
apparently something does use them: link one wrong so that it doesn’t have
the right sections, and a BSOD follows.
Still, I am left wondering if the input device stack in the kernel has some
provision for input devices that yield arbitrary Unicode on their own.
If so, then I’m wondering how hard it would be to concoct a WDM driver that
pretended to be one, while not having any hardware, and THEN I’m wondering
how one would get the system to decide to switch to accepting input from it.
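To make the shape of that concrete, here is a minimal, untested sketch of
what such a hardware-less driver would hand up to kbdclass. CONNECT_DATA and
the service-callback type come from kbdmou.h, KEYBOARD_INPUT_DATA and the
connect IOCTL from ntddkbd.h:

#include <ntddk.h>
#include <ntddkbd.h>
#include <kbdmou.h>

static CONNECT_DATA g_Connect;   /* captured from kbdclass at connect time */

/* IRP_MJ_INTERNAL_DEVICE_CONTROL: kbdclass hands us its callback. */
NTSTATUS FakeKbdInternalIoctl(PDEVICE_OBJECT DeviceObject, PIRP Irp)
{
    PIO_STACK_LOCATION stack = IoGetCurrentIrpStackLocation(Irp);
    NTSTATUS status = STATUS_NOT_SUPPORTED;

    UNREFERENCED_PARAMETER(DeviceObject);

    if (stack->Parameters.DeviceIoControl.IoControlCode ==
        IOCTL_INTERNAL_KEYBOARD_CONNECT) {
        /* Real code would validate InputBufferLength first. */
        g_Connect = *(PCONNECT_DATA)
            stack->Parameters.DeviceIoControl.Type3InputBuffer;
        status = STATUS_SUCCESS;
    }
    Irp->IoStatus.Status = status;
    IoCompleteRequest(Irp, IO_NO_INCREMENT);
    return status;
}

/* Called at DISPATCH_LEVEL: hand kbdclass a make/break pair for one
 * scan code. The packets carry scan codes only -- no Unicode field. */
VOID FakeKbdInject(USHORT makeCode)
{
    KEYBOARD_INPUT_DATA packets[2];
    ULONG consumed = 0;

    RtlZeroMemory(packets, sizeof(packets));
    packets[0].MakeCode = makeCode;   /* key down */
    packets[0].Flags    = KEY_MAKE;
    packets[1].MakeCode = makeCode;   /* key up */
    packets[1].Flags    = KEY_BREAK;

    ((PSERVICE_CALLBACK_ROUTINE)g_Connect.ClassService)(
        g_Connect.ClassDeviceObject, packets, packets + 2, &consumed);
}

Note that nothing in this interface carries Unicode, which is exactly
Doron’s point above: the fake device would still be stuck reporting scan
codes for some layout DLL to map.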
–benson