Hi,
I’ve managed to reduce my driver to its structural essence. It still
crashes on the thread’s termination and I just don’t see what I’m doing
wrong. The source for this reduced driver can be found here:
http://home.wi.rr.com/dlarson/bad-driver.zip
I’ve been testing on a Win2k system with Driver Verifier enabled and the
checked kernel and HAL loaded. My Win32 test program, which simply does
synchronous reads and writes, works fine. However, when I use a legacy
Win16 program (see background below) it crashes. Other recent posts have
the various debugger dumps.
Some background…
The purpose of the driver I’ve created (most of which has been ripped out
of the above version since it doesn’t affect the crash) is to implement a
legacy serial protocol that requires very tight timing that can’t be
achieved in a normal user mode program. The serial protocol is used by an
old Win16 program that can’t be ported to Win32.
Basically, the driver must react to an address byte poll within 10 ms. The
address poll is signaled by the toggling of the CTS line. I implement the
protocol’s state transitions within a driver-created thread. The actual
reaction to the address byte is performed in the I/O completion routine of
the address byte read IRP (this code is also missing in the above). Writes
and reads to the port are queued in the driver for synchronized I/O that is
managed by the thread.
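Since the crash happens at thread termination, here is roughly the create/shutdown pattern I believe I’m following (a minimal WDM sketch with my own placeholder names — `DEVICE_EXTENSION`, `KillEvent`, etc. are illustrative, not the actual driver’s; it compiles against the DDK headers, not standalone):

```c
/* Kernel-mode sketch -- builds against the DDK/WDK, not standalone. */
#include <ntddk.h>

typedef struct _DEVICE_EXTENSION {
    PVOID  ThreadObject;  /* referenced thread object for the worker */
    KEVENT KillEvent;     /* signaled to ask the thread to exit */
} DEVICE_EXTENSION, *PDEVICE_EXTENSION;

static VOID ProtocolThread(PVOID Context)
{
    PDEVICE_EXTENSION ext = (PDEVICE_EXTENSION)Context;

    for (;;) {
        /* Real code would wait on an array of events
           (kill, new-IRP-queued, CTS change, ...). */
        KeWaitForSingleObject(&ext->KillEvent, Executive,
                              KernelMode, FALSE, NULL);
        break;
    }
    /* The thread must terminate itself; never just return
       from the start routine. */
    PsTerminateSystemThread(STATUS_SUCCESS);
}

static NTSTATUS StartProtocolThread(PDEVICE_EXTENSION ext)
{
    HANDLE   threadHandle;
    NTSTATUS status;

    KeInitializeEvent(&ext->KillEvent, NotificationEvent, FALSE);
    status = PsCreateSystemThread(&threadHandle, THREAD_ALL_ACCESS,
                                  NULL, NULL, NULL,
                                  ProtocolThread, ext);
    if (!NT_SUCCESS(status))
        return status;

    /* Keep a referenced object pointer so unload can wait on the
       thread safely after the handle is closed. */
    ObReferenceObjectByHandle(threadHandle, THREAD_ALL_ACCESS, NULL,
                              KernelMode, &ext->ThreadObject, NULL);
    ZwClose(threadHandle);
    return STATUS_SUCCESS;
}

static VOID StopProtocolThread(PDEVICE_EXTENSION ext)
{
    KeSetEvent(&ext->KillEvent, IO_NO_INCREMENT, FALSE);
    /* Block until PsTerminateSystemThread has actually run. */
    KeWaitForSingleObject(ext->ThreadObject, Executive,
                          KernelMode, FALSE, NULL);
    ObDereferenceObject(ext->ThreadObject);
}
```

If my actual code deviates from this (e.g. waiting on the handle instead of a referenced object, or returning from the start routine), that may well be the bug — the zip has the real thing.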
When the serial port is in this special protocol mode (which is set by a
DeviceIoControl call on the controlling device driver, again not given
in the above stripped-down driver), all DeviceIoControl calls which could
affect the state of the serial driver are intercepted by the filter driver
and simply completed successfully. Calls that fetch information are passed
down.
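In case it helps to see the shape of the interception, it is roughly this (again a sketch with my own names; the IOCTL list shown is illustrative, not the driver’s actual set):

```c
/* Kernel-mode sketch -- WDM filter dispatch, builds against the DDK. */
#include <ntddk.h>
#include <ntddser.h>

typedef struct _DEVICE_EXTENSION {
    PDEVICE_OBJECT LowerDevice;  /* serial device we attached over */
    BOOLEAN        ProtocolMode; /* set by our private IOCTL */
} DEVICE_EXTENSION, *PDEVICE_EXTENSION;

/* Illustrative predicate: does this IOCTL change serial state? */
static BOOLEAN IsStateChangingIoctl(ULONG code)
{
    switch (code) {
    case IOCTL_SERIAL_SET_BAUD_RATE:
    case IOCTL_SERIAL_SET_LINE_CONTROL:
    case IOCTL_SERIAL_SET_HANDFLOW:
        return TRUE;
    default:
        return FALSE;
    }
}

static NTSTATUS FilterDeviceControl(PDEVICE_OBJECT DeviceObject, PIRP Irp)
{
    PDEVICE_EXTENSION  ext   = DeviceObject->DeviceExtension;
    PIO_STACK_LOCATION stack = IoGetCurrentIrpStackLocation(Irp);
    ULONG code = stack->Parameters.DeviceIoControl.IoControlCode;

    if (ext->ProtocolMode && IsStateChangingIoctl(code)) {
        /* Swallow it: complete successfully without touching the port. */
        Irp->IoStatus.Status = STATUS_SUCCESS;
        Irp->IoStatus.Information = 0;
        IoCompleteRequest(Irp, IO_NO_INCREMENT);
        return STATUS_SUCCESS;
    }

    /* Informational calls go down the stack untouched. */
    IoSkipCurrentIrpStackLocation(Irp);
    return IoCallDriver(ext->LowerDevice, Irp);
}
```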
Could someone take pity on this poor wretch of a programmer, sneak a peek
at the code, and tell me if I’m doing something wrong?
Dale