I am writing a WDM driver to interface with a digital camera over USB. My
implementation uses two bulk pipes, one IN and one OUT, each with a max packet
size of 64 bytes.
My question is:
Will the WDM USBDI always try to read a complete packet off a pipe, regardless
of the requested size? For example, if my IN pipe has a max packet size of 64
bytes and my driver requests only 4 bytes (which is what the first read from my
app requests), will USBDI still try to read all 64 bytes from the pipe
(assuming, of course, that my camera has sent 64 or more bytes)?

I am currently getting a BUFFER_OVERFLOW error when I request 4 bytes from
USBDI while a 64-byte packet is sitting on the IN pipe. If I request any
multiple of 64, however, all is well. What I think is happening is that USBDI
reads one packet at a time, no less. So if I request 4 bytes, my read dispatch
routine sends down a URB saying it wants 4 bytes; the buffer into which USBDI
will read the requested data is only 4 bytes, *but* USBDI must read the maximum
packet size, so it tries to put the 64 bytes into the 4-byte buffer.
Is this correct?
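If that is what is happening, the driver should never ask for less than a whole packet; every transfer length would have to be rounded up to the next multiple of the 64-byte max packet size. A quick sketch of that arithmetic (the helper name is hypothetical):

```c
/* Round a requested transfer length up to a whole number of
 * max-packet-size packets (64 bytes for these bulk pipes). */
static unsigned long RoundUpToPacket(unsigned long bytes, unsigned long maxPacket)
{
    return ((bytes + maxPacket - 1) / maxPacket) * maxPacket;
}
```

So a 4-byte request becomes a 64-byte transfer, and a 65-byte request becomes 128 bytes.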
If this is all correct, I can see two solutions:

1. Have my driver always make requests in multiples of 64 and buffer any
overflow from an operation. Thus, if I am servicing a ReadFile that only wants
4 bytes, in reality I will read 64 in the driver, return 4, and store the
remaining 60 in my device extension for the next app read (this stored data
will be all of, or part of, the next read). This solution keeps all the details
in the driver.

2. Leave the driver alone and have the interface DLL (most applications will
link against a DLL I am writing to talk to my driver) always read in multiples
of 64 from the device, taking care of the buffer overflow internally.
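Either way, the leftover-buffer bookkeeping would look about the same. Here is a minimal user-mode C sketch of the idea (all names are hypothetical, and the full-packet read callback stands in for the actual bulk-IN pipe transfer):

```c
#include <string.h>

#define MAX_PACKET 64

/* Hypothetical per-device state: bytes left over from the last packet read. */
typedef struct {
    unsigned char leftover[MAX_PACKET];
    size_t leftoverLen;
} PipeCache;

/* Stand-in for the real bulk-IN transfer; fills a full packet, returns its size. */
typedef size_t (*ReadPacketFn)(unsigned char *buf);

/* Satisfy a request of any size: serve cached bytes first, then read
 * whole 64-byte packets, caching whatever overflows for the next call. */
size_t CachedRead(PipeCache *c, ReadPacketFn readPacket,
                  unsigned char *dst, size_t want)
{
    size_t got = 0;

    /* 1. Serve as much as possible from the cache. */
    size_t fromCache = want < c->leftoverLen ? want : c->leftoverLen;
    memcpy(dst, c->leftover, fromCache);
    memmove(c->leftover, c->leftover + fromCache, c->leftoverLen - fromCache);
    c->leftoverLen -= fromCache;
    got += fromCache;

    /* 2. Read whole packets until the request is satisfied. */
    while (got < want) {
        unsigned char pkt[MAX_PACKET];
        size_t n = readPacket(pkt);              /* always a full packet */
        size_t take = want - got < n ? want - got : n;
        memcpy(dst + got, pkt, take);
        got += take;

        /* 3. Cache the overflow for the next read. */
        memcpy(c->leftover, pkt + take, n - take);
        c->leftoverLen = n - take;
    }
    return got;
}
```

With this scheme, a 4-byte ReadFile costs one 64-byte pipe transfer, and subsequent small reads are served from the cache until it runs dry.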
Any suggestions?