Internal buffering within the standard USB webcam driver

I am having trouble reading from an ~85 FPS USB webcam on Windows. On average I measure 85 Hz, but the frames are read in bursts, so it looks like there is buffering inside the standard Windows UVC driver. I capture images with a Python script like this:

```python
import cv2 as cv
import sys
from datetime import datetime

previous_time = datetime.now()
n = int(sys.argv[1])
cam = cv.VideoCapture(n)
while True:
    ret, frame = cam.read()
    now = datetime.now()
    td = (now - previous_time).total_seconds()
    previous_time = now
    print("Frame delta, {:.3f} ms".format(td * 1000))
```

Looking at the printed output, the time delta between frames is either ~30 ms or near 0 ms:

```
Frame delta, 30.877 ms
Frame delta, 0.0 ms
Frame delta, 0.999 ms
Frame delta, 30.525 ms
Frame delta, 0.0 ms
Frame delta, 0.997 ms
Frame delta, 29.956 ms
```

I have tested the same webcam on Linux with the same script, and there the frame delta is much more consistent (close to 12 ms with ±5 ms deviation), which is what I expect. I have tried the Microsoft Media Foundation backend with OpenCV and also the DirectShow backend; both show the same behavior. I have even written a C++ version of the script using the Media Foundation API directly, and it has the same issue. Writing a new Windows driver would be a last resort, but I'm running out of options. Any advice would be greatly appreciated.

Why do you care? You can't display anything that fast anyway.

What’s the frame size and format?

800×800 with Y8. We aren't displaying this stream; we are using it for eye tracking and object tracking, and both algorithms need smooth, low-latency frames to work well.

Shouldn't happen. I've run cameras at 120 Hz without that kind of problem. Do you see the same results if you use time.time() instead of datetime.now()? Do you have access to GraphEdit? The renderer there tracks the frame rate and jitter amount.
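Swapping the wall-clock source for a monotonic one is easy to test in isolation. A minimal sketch along those lines, using `time.perf_counter()` (monotonic and high resolution on both Windows and Linux); the `read_frame` stand-in is a hypothetical replacement for `cam.read()` so it runs without a camera:

```python
import time

def measure_deltas(read_frame, n_frames):
    """Collect inter-frame deltas in milliseconds using a monotonic clock."""
    deltas = []
    prev = time.perf_counter()
    for _ in range(n_frames):
        read_frame()                        # stand-in for cam.read()
        now = time.perf_counter()
        deltas.append((now - prev) * 1000.0)
        prev = now
    return deltas

# Example with a 10 ms fake "camera" so it runs without hardware:
deltas = measure_deltas(lambda: time.sleep(0.01), 5)
```

If the bursty pattern persists with this clock, the jitter is coming from the capture path rather than the time source.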

I can get access to that. We can also reproduce this in C++ using the Windows Media Foundation API, and it can also be seen with GStreamer. Were you using bulk or isochronous UVC transfers? We are using isochronous. Even on a normal low-FPS (10 FPS) laptop webcam we see ±20 ms of jitter, which seems excessive.
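To put a number on that jitter instead of eyeballing the console output, the deltas can be summarized with the standard library. A small illustrative helper (not part of the original scripts), applied here to the bursty values from the question:

```python
import statistics

def jitter_stats(deltas_ms):
    """Summarize inter-frame deltas (all values in milliseconds)."""
    return {
        "mean": statistics.mean(deltas_ms),
        "stdev": statistics.pstdev(deltas_ms),
        "min": min(deltas_ms),
        "max": max(deltas_ms),
    }

# The bursty deltas printed by the script in the question:
stats = jitter_stats([30.877, 0.0, 0.999, 30.525, 0.0, 0.997, 29.956])
```

A steady 85 Hz stream should show a mean near 11.8 ms (1/85 s) with a small stdev; the bursty pattern keeps a similar mean but a huge spread, which is what the tracking algorithms would notice.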

in your C++ code, what was your time source?

std::chrono

in your C++ code, what part of std::chrono is your time source?

```cpp
uint64_t micro = duration_cast<microseconds>(system_clock::now().time_since_epoch()).count();
```

I think std::chrono::steady_clock would be better suited for measuring time differences reliably; system_clock tracks the wall clock and can jump when the system time is adjusted.
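The same distinction exists on the Python side of this question: `datetime.now()` and `time.time()` follow the wall clock (like `system_clock`), while `time.monotonic()` and `time.perf_counter()` behave like `steady_clock` and never jump backwards. The standard library can report which is which:

```python
import time

# Wall-clock sources can jump when the system time is adjusted (NTP sync,
# manual changes); monotonic sources cannot, so prefer them for intervals.
for name in ("time", "monotonic", "perf_counter"):
    info = time.get_clock_info(name)
    print(f"{name}: monotonic={info.monotonic}, resolution={info.resolution}")
```

This only rules out the time source as the culprit, though; it won't fix jitter introduced by the driver itself.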

You might try QueryPerformanceCounter once to see if the results are the same. Remember to divide by QueryPerformanceFrequency.

in your version of std, what is the source of system_clock?

this article discusses the basics. Note that the names of these things are different on *nix, but the concepts are all the same.

https://learn.microsoft.com/en-us/windows/win32/sysinfo/acquiring-high-resolution-time-stamps

Solution: we implemented our own host-side webcam driver with libusb, bypassing the standard Windows UVC driver.