I'm developing a KMDF driver that needs to read from a target serial port. It almost works, i.e. my driver does receive bytes from the serial port as expected. The only problem is that any byte sent to the serial port with a value of 128 or above loses its most significant bit, e.g. if I send 130, my driver receives 2. Other bytes are received just fine.
The actual setup is that the driver runs in a Hyper-V virtual machine, and the COM1 port I use is a virtual serial port whose other end is a named pipe on the Hyper-V host machine. Now this is the confusing part:
- In Device Manager (inside the VM), the COM1 driver shows settings indicating 8 data bits. I checked this because I suspected the port was set to 7 data bits, which would explain the loss of the MSB.
- I wrote a separate user-mode console application, and it receives the same message from the serial port without losing the MSB, i.e. I send 130 and the application receives 130. The application runs inside the VM, just like my driver, and doesn't touch the serial port settings.
- I found the registry settings for ports at HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Ports; for COM1 they read 9600,n,8,1, so everything looks right.
In my driver, I print the received data as soon as it arrives, before any processing, just to make sure the loss of the MSB is not due to a programming error - it still shows the loss of the MSB.
I tried to find out how to read the serial port settings the way my driver would see them, but it seems the current serial driver doesn't have an IOCTL that returns that information.
Has anyone experienced something similar? How can I check and/or set comm port parameters, if that is the cause of this problem?
Regards,
Milan
Why not use the set of IOCTL_SERIAL_* IOCTLs to set the port configuration to what you want? See, for example, IOCTL_SERIAL_SET_LINE_CONTROL.
Note that this would be much easier from user mode using the COM port APIs. Are you sure you need a KMDF rather than a UMDF driver?
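For completeness, a minimal sketch of the user-mode route Mark mentions, using the classic Win32 comm API (GetCommState/SetCommState). The 9600,8,N,1 values and "COM1" come from the thread; error handling is trimmed:

```c
#include <windows.h>
#include <stdio.h>

int main(void)
{
    HANDLE h = CreateFileA("COM1", GENERIC_READ | GENERIC_WRITE,
                           0, NULL, OPEN_EXISTING, 0, NULL);
    if (h == INVALID_HANDLE_VALUE) {
        fprintf(stderr, "CreateFile failed: %lu\n", GetLastError());
        return 1;
    }

    DCB dcb;
    ZeroMemory(&dcb, sizeof(dcb));
    dcb.DCBlength = sizeof(dcb);
    GetCommState(h, &dcb);      /* read the current line settings */

    dcb.BaudRate = CBR_9600;
    dcb.ByteSize = 8;           /* 8 data bits -- the setting at issue here */
    dcb.Parity   = NOPARITY;
    dcb.StopBits = ONESTOPBIT;
    SetCommState(h, &dcb);      /* apply 9600,8,N,1 */

    CloseHandle(h);
    return 0;
}
```

This is, in effect, what applying the 9600,n,8,1 registry values by hand looks like from user mode.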
Ahh, that's what I was looking for, thanks Mark. I did look at doing this via IOCTLs, but there were so many that I only checked the names, and the one that looked like it set serial line properties turned out to deal only with driver limits.
I have to do it in the kernel because my requirement is that, when a particular sequence of input characters is detected, I need to send something back over the serial line within a few milliseconds - a response time I don't think I can achieve from user-level code.
I'm really curious what serial line settings I'll read from my driver. This 7-bit character conversion really baffles me. My understanding is that when the OS starts, the settings (baud rate, data bits, etc.) are read from the registry and given to the serial driver, which initializes the hardware accordingly. I know I'm dealing with a virtual COM port, but one would expect it to respect those settings as well. The ports were created with the Set-VMComPort PowerShell command, which has no parameters controlling baud rate and the like, so I assumed they would behave like other "normal" COM ports.
"...the kind of response time I don't think I can achieve from user level code."
It is a commonly held belief that kernel code is somehow faster than user-mode code. That belief is untrue.
I agree with that statement. In this case it's faster merely because I avoid the round trip from the kernel driver up to user level through multiple layers (the application was written in C#) and back down to the kernel to send the response; here, everything is done within the same function in the driver. Of course, perhaps it's 10 microseconds vs. 100 microseconds and the effort isn't worth it, but we took measurements when this communication was done purely at user level, and the code frequently failed to meet the deadline for sending the response.
The fact that you have actually done the instrumentation raises your standing in my opinion.
Thanks Tim
The problem is finally resolved. It turns out the serial driver ignored the registry settings and had the data bits set to 7 instead of 8. Executing IOCTL_SERIAL_SET_LINE_CONTROL with an 8-bit word setting fixed the issue (thanks Mark!).
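For later readers, the fix can be sketched from KMDF roughly as follows. This is a sketch, not the poster's actual code: it assumes the driver already has a WDFIOTARGET opened on the serial device (serialTarget is a placeholder name), and error handling is trimmed:

```c
#include <ntddk.h>
#include <wdf.h>
#include <ntddser.h>

/* Force the line to 8 data bits, no parity, 1 stop bit via
 * IOCTL_SERIAL_SET_LINE_CONTROL. 'serialTarget' is assumed to be a
 * WDFIOTARGET already opened on the serial port. */
static NTSTATUS SetLineControl8N1(WDFIOTARGET serialTarget)
{
    SERIAL_LINE_CONTROL lineControl;
    WDF_MEMORY_DESCRIPTOR inputDesc;

    lineControl.WordLength = 8;          /* 8 data bits: the fix for the MSB loss */
    lineControl.Parity     = NO_PARITY;
    lineControl.StopBits   = STOP_BIT_1;

    WDF_MEMORY_DESCRIPTOR_INIT_BUFFER(&inputDesc,
                                      &lineControl,
                                      sizeof(lineControl));

    return WdfIoTargetSendIoctlSynchronously(serialTarget,
                                             NULL,  /* use the target's internal request */
                                             IOCTL_SERIAL_SET_LINE_CONTROL,
                                             &inputDesc,
                                             NULL,  /* no output buffer */
                                             NULL,  /* default send options */
                                             NULL); /* bytes returned: not needed */
}
```

IOCTL_SERIAL_GET_LINE_CONTROL works the same way in reverse (output buffer instead of input), which is how the driver can read back what the line is actually set to.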
The registry setting is read by the very old user-mode serial APIs and applied when you open the port. IIRC, the in-box driver doesn't read this setting either.