Is WPILib (or the underlying NIVision libraries) expected to handle multiple, simultaneous USB cameras?
I'm trying to set up two USBCamera instances -- one for streaming to the DS and one for on-robot vision analysis. I can plug in two cameras, they show up as "cam0" and "cam1", and I can capture (or stream) images from either of them. Everything works as I expect.
But only one at a time.
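Here's roughly what the setup looks like (a simplified sketch assuming the 2015-era WPILib Java API; the USBCamera/CameraServer calls are stock WPILib, but the surrounding class is just for illustration):

    import edu.wpi.first.wpilibj.CameraServer;
    import edu.wpi.first.wpilibj.vision.USBCamera;

    public class TwoCameraSketch {
        USBCamera visionCam;

        public void robotInit() {
            // cam0 streams to the DS via WPILib's built-in M-JPEG server;
            // startAutomaticCapture() calls startCapture() internally.
            CameraServer.getInstance().startAutomaticCapture("cam0");

            // cam1 is reserved for on-robot vision analysis.
            visionCam = new USBCamera("cam1");
            visionCam.openCamera();
            visionCam.startCapture();
        }
    }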
When I try to start both at once, whichever camera is started second always fails with "No image acquisition in progress", even though startCapture() has already been called on it (for example, by the WPILib-provided M-JPEG server, which calls startCapture() internally).
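Continuing the sketch above, the vision side runs in its own thread, roughly like this (again an assumed sketch of the 2015-era API; it also needs import com.ni.vision.NIVision; and import com.ni.vision.NIVision.Image;):

    // In the same class as the sketch above: the capture loop on its own thread.
    public void startVisionThread() {
        new Thread(new Runnable() {
            public void run() {
                // Reusable NIVision image buffer for captured frames.
                Image frame = NIVision.imaqCreateImage(NIVision.ImageType.IMAGE_RGB, 0);
                while (!Thread.interrupted()) {
                    // With the M-JPEG server already streaming cam0, this is
                    // the call that fails with "No image acquisition in progress".
                    visionCam.getImage(frame);
                    // ... vision analysis on frame ...
                }
            }
        }).start();
    }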
Interestingly, this problem goes away if I call USBCamera.getImage() directly instead of from a separate thread. For example, if I modify CameraServer to expose capture(), change capture() so it doesn't loop, and call CameraServer.capture() directly from a command, everything works for as many captures as I care to take. That just isn't practical while the robot is otherwise in use -- capture needs to run in its own thread so other commands don't lag. I don't know whether this thread-related difference is significant or just a side effect of tying up the main thread for long periods, but I thought it was worth documenting.
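For comparison, the direct version that works looks roughly like this (captureOnce() is a made-up name for the non-looping capture described above, invoked from a Command rather than a dedicated thread):

    // One-shot capture called directly from a Command's execute(), with no
    // extra thread; this succeeds every time, even with both cameras started.
    public void captureOnce() {
        Image frame = NIVision.imaqCreateImage(NIVision.ImageType.IMAGE_RGB, 0);
        visionCam.getImage(frame);   // no "No image acquisition in progress" here
        // ... vision analysis on frame ...
    }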
FYI: I'm using Microsoft LifeCam HD-3000 cameras, though I don't have any reason to believe this is hardware-specific.