I have developed a UART driver on Windows XP using WDF. I use the following timeouts in my driver:
ReadIntervalTimeout = 100;
ReadTotalTimeoutConstant = 100;
ReadTotalTimeoutMultiplier = 0;
WriteTotalTimeoutConstant = 0;
WriteTotalTimeoutMultiplier = 0;
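These five values correspond to the fields of the COMMTIMEOUTS structure. For reference, here is a minimal user-mode sketch that applies the same settings with SetCommTimeouts (the COM port name is just a placeholder, not my actual device name):

#include <windows.h>
#include <stdio.h>

int main(void)
{
    /* Placeholder port name; the real device name will differ. */
    HANDLE hPort = CreateFileA("\\\\.\\COM3", GENERIC_READ | GENERIC_WRITE,
                               0, NULL, OPEN_EXISTING, 0, NULL);
    if (hPort == INVALID_HANDLE_VALUE) {
        printf("CreateFile failed: %lu\n", GetLastError());
        return 1;
    }

    COMMTIMEOUTS to = {0};
    to.ReadIntervalTimeout         = 100; /* max ms allowed between two received bytes */
    to.ReadTotalTimeoutConstant    = 100; /* fixed ms added to the total read timeout */
    to.ReadTotalTimeoutMultiplier  = 0;   /* no per-byte component */
    to.WriteTotalTimeoutConstant   = 0;   /* writes never time out */
    to.WriteTotalTimeoutMultiplier = 0;

    if (!SetCommTimeouts(hPort, &to))
        printf("SetCommTimeouts failed: %lu\n", GetLastError());

    CloseHandle(hPort);
    return 0;
}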
My driver has two parts: a transmit part and a receive part. The transmit part runs on an i7 quad-core desktop and the receive part runs on a Pentium D machine. I am using an FTDI USB-to-serial converter.
When I transmit a file at 460800 baud (8 data bits, even parity, 1 stop bit, no hardware flow control), I can transfer a 5 MB file and it is received without any corruption. When I increase the baud rate to 961200 baud and run the same test, the received file is corrupt (i.e. a few bytes are missing from the received file).
Even if I increase ReadIntervalTimeout and ReadTotalTimeoutConstant to 500 ms, the corruption still happens at 961200 baud.
If I reverse the setup, i.e. put the RX part on the i7 machine and the TX part on the Pentium D machine, the same logic transfers the file and it is received without any problems.
I am trying to understand how to choose a timeout value that will work on a variety of machines. I cannot use a large timeout of 1 second, as it would degrade the data transfer performance of my driver.
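For illustration, the kind of approach I am considering is to derive the read timeouts from the configured baud rate instead of hard-coding them, something like the sketch below (the function name and the scaling factors are only placeholders, not values my driver actually uses):

#include <windows.h>

/* Sketch: with 8 data bits, even parity and 1 stop bit, each byte is
 * 11 bits on the wire, so the per-byte time in ms is 11 * 1000 / baud. */
static void SetReadTimeoutsForBaud(HANDLE hPort, DWORD baud)
{
    double msPerByte = 11.0 * 1000.0 / (double)baud;

    COMMTIMEOUTS to = {0};
    /* Allow several byte times of silence between bytes, but at least 1 ms. */
    to.ReadIntervalTimeout        = (DWORD)(msPerByte * 10.0) + 1;
    /* Per-byte budget for the whole request, plus a small fixed slack. */
    to.ReadTotalTimeoutMultiplier = (DWORD)msPerByte + 1;
    to.ReadTotalTimeoutConstant   = 50;

    SetCommTimeouts(hPort, &to);
}

Is this kind of baud-rate-dependent policy a reasonable way to choose the timeout values, or is there a better approach?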
Regards,
Raja