Help connecting a USB webcam / c270

Hi There,

I am struggling to get my camera recognized by the OF server; I can't get an image…
I am using a USB cam (a Logitech c270). I also tested with the well-known C920. Neither is recognized.

I checked that the cams are working on the Pi using VLC.

Error description:
In the info tab it is always stated “no camera connected”.
Also had the following message in the log: “WARNING: Requested an LST from a non-PiCamera streamer”

In my JSON settings file, I noticed the camera type is set to “MissingCamera”, so I guess I should change that, but I can’t find what to put there to point to my USB cam.

Thanks a lot in advance for looking at it !

Hi @Rup, we should perhaps be clearer about this, but while the c270 (and a couple of other options) is supported by the hardware design, currently the OFM server software only works with the Raspberry Pi camera module. When I’ve built USB versions, I have usually just used a webcam application (like the Windows Camera app, or QuickTime on a Mac) to view the stream. This is something that might be supported in the future, though I suspect you’d need at least a Pi 4, because a USB webcam would put a much greater load on the CPU than the Pi camera.

Hi @r.w.bowman,

Thanks for your reply! This makes things clearer.
At least I am not facing a weird bug :slight_smile:
Indeed, it wasn’t really clear to me that a build with the c270 would not work with the software. (Maybe leaving a note on the STL configurator or on the assembly instructions page would guide others.)

The good news is that maybe it could! (It would help keep OpenFlexure low cost, as 720p USB cams can be sourced much more cheaply than the Raspberry Pi cam v2.)
I am not sure whether there are other constraints on the camera and software beyond simply adding a new stream source to the code, but if that is all there is to it, I might take a look with a friend and see whether we could contribute on the Git.
I get the potential issue with the Pi’s computational limitations, but apart from that, would a USB cam face other incompatibilities with some OF features due to its intrinsic characteristics (resolution, a different field of view than the Pi cam, … ?)
Thanks!

There are several functions of the software that I think rely on the Pi Camera being integrated quite deeply into the Pi hardware.

Getting a USB camera to show up in the view window and getting it to capture images are simple functions that I can imagine would be possible to implement, but might have slow response as Richard points out.

The fast autofocus routine requires being able to grab information from the live preview feed, which is probably very hard to replicate with an external USB camera. Similarly, I expect the information used for auto exposure and white balance are quite specific to the Pi Camera.

However, for manual use: open the OpenFlexure web app to do motion and a USB camera app to do image capture, and you have most functions except autofocus and auto-scan. You could automate specific tasks using the Python or Matlab client; Python libraries that can talk to the USB webcam at the same time must exist.
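As a rough sketch of what that automation might look like, here is stage control over the server's HTTP API alongside USB capture. Note that the endpoint path, port, and payload schema below are guesses for illustration only (check your server's API documentation), and the OpenCV frame grab is shown only as a comment:

```python
# Sketch: drive the microscope stage over its HTTP API while grabbing
# frames from a USB webcam in the same script. The endpoint path and
# payload shape here are assumptions, not the documented OFM API.
import json
import urllib.request

BASE_URL = "http://microscope.local:5000"  # hypothetical address


def move_request(x=0, y=0, z=0, absolute=False):
    """Build the JSON body for a stage-move action (assumed schema)."""
    return {"x": x, "y": y, "z": z, "absolute": absolute}


def post_move(base_url, payload):
    """POST a move to the (assumed) stage-move endpoint."""
    req = urllib.request.Request(
        base_url + "/api/v2/actions/stage/move",  # hypothetical endpoint
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    return urllib.request.urlopen(req)  # network call - not run in this sketch


if __name__ == "__main__":
    # A USB frame could be captured in the same loop with OpenCV, e.g.:
    #   import cv2; cap = cv2.VideoCapture(0); ok, frame = cap.read()
    print(move_request(x=100))
```

The point is just that nothing stops one script from doing both jobs; the stage moves go over HTTP while the frames come straight from the webcam.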

This is about right, I think; the major missing feature would be autofocus. I think this might actually work, just less well, because you’d have to generate the MJPEG stream in software rather than hardware (a stream is required for the live view anyway). I know @B.Diederich implemented a USB camera, so he has clearly figured out a way to compress each frame as JPEG already.

If you’re using a USB webcam I think it makes sense not to use a Pi, because you could do with more processing power and you aren’t tied to the camera. If you wrote a replacement camera class, in theory it should all work, but that is completely untested so far… The next release will have a way to swap in a different camera class, so it should be possible without needing to modify the core server code.

Woohoo! Looking forward to seeing the next release @r.w.bowman! :slight_smile:
Regarding the USB camera bit: I somehow managed to convert the numpy frame into a Pillow image object, which is then compressed into the JPEG stream of images. If you’re curious, the code is here (disclaimer: it’s coded by me, so please don’t expect much from it :wink:)
I think if you replace the way the frames are read with the OpenCV method, you’re pretty much done.
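A minimal sketch of that numpy → Pillow → JPEG route, with the camera read stubbed out by a synthetic frame (with a real webcam you'd swap in `cv2.VideoCapture(0).read()` as suggested above; the multipart boundary name is just a common convention):

```python
# Turn camera frames into an MJPEG stream: numpy array -> Pillow image
# -> JPEG bytes, wrapped in multipart boundaries for the HTTP body.
import io

import numpy as np
from PIL import Image


def frame_to_jpeg(frame: np.ndarray, quality: int = 85) -> bytes:
    """Compress an RGB uint8 frame of shape (H, W, 3) to JPEG bytes via Pillow."""
    buf = io.BytesIO()
    Image.fromarray(frame).save(buf, format="JPEG", quality=quality)
    return buf.getvalue()


def mjpeg_chunks(frames):
    """Wrap each JPEG frame in multipart boundaries, as MJPEG-over-HTTP expects."""
    for frame in frames:
        jpeg = frame_to_jpeg(frame)
        yield (b"--frame\r\nContent-Type: image/jpeg\r\n"
               b"Content-Length: " + str(len(jpeg)).encode() + b"\r\n\r\n"
               + jpeg + b"\r\n")


# Synthetic stand-in for a captured frame; a real one would come from OpenCV:
frame = np.zeros((480, 640, 3), dtype=np.uint8)
jpeg = frame_to_jpeg(frame)
```

Served with `multipart/x-mixed-replace; boundary=frame`, a generator like `mjpeg_chunks` is enough for a browser to show a live view.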

I realise I didn’t answer your question about compatibility fully - I think you’d want a sensor that is vaguely similar in size; otherwise you’d need to adjust the tube lens (there is now some documentation about why the tube lens is 50mm and what would happen if it changed, but I think it’s still on the v7 branch). You’d also want pixels small enough to slightly oversample the image: the 1.12um pixels on the Pi Camera v2 are definitely smaller than we need, but being bigger than that by a factor of 10 would definitely degrade image quality quite a bit. I think a factor of 2 or 4 is probably OK, though.

Pixel size and sensor size are the main things to check - but obviously board size matters too, just to make sure it physically fits inside the microscope!
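The rule of thumb above (pixels within roughly 2–4× of the Pi Camera v2's 1.12um are probably fine; 10× would noticeably degrade quality) can be sketched as a quick check. The 4× cut-off here is just the upper end of that guess, not a measured limit:

```python
PICAM_V2_PIXEL_UM = 1.12  # Pi Camera v2 pixel pitch, the baseline in this thread


def pixel_factor(pixel_um: float) -> float:
    """How much larger a candidate sensor's pixels are than the Pi Camera v2's."""
    return pixel_um / PICAM_V2_PIXEL_UM


def probably_ok(pixel_um: float, max_factor: float = 4.0) -> bool:
    """Rule of thumb from the thread: within ~4x of 1.12um is probably acceptable."""
    return pixel_factor(pixel_um) <= max_factor


print(pixel_factor(2.8))   # Logitech c270: ~2.5x the Pi Camera pixel, borderline
print(probably_ok(2.8))    # within the ~2-4x band
print(probably_ok(11.2))   # 10x bigger: past the cut-off
```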

Thanks for the clarifications; this makes it clearer which USB cams are viable.
The c270 is already borderline (with 2.8um pixels), and the image quality is still rather good, so it is probably the best compromise when going for a USB cam.

Sorry for the late reply, but thanks for the directions @B.Diederich , I’ll have a look at your code.

Great - the c270 is also nice because it’s easy enough to get hold of as a consistent part. That’s always significant if you want to put effort into mounting it nicely! I suspect we’ll see a range of higher-end machine-vision USB cameras used in the coming years; for some imaging techniques where efficiency matters a bit more, those may have bigger pixels, but they will probably also have custom optics modules to go with them.