How would I go about accessing this video stream to send to a streaming service such as YouTube (or maybe transfer over MQTT)? From what I understand only one app can access the Raspberry Pi Camera at a time, so I think this would be the only way I could go about getting a stream. I could also try to repeatedly capture images, but that won't work when there's a scan function running. Any ideas?
The Openflexure server sends the video stream as a web stream that you can access on another computer on the local network. This is what you see in the live view in the Openflexure webapp GUI. There is also an HTTP request that will give you the stream only, without the rest of the app.
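For reference, a tiny helper that builds that stream URL - the port (5000) and the `/api/v2/streams/mjpeg` path are assumptions based on recent server versions, so check your own server's API docs:

```python
# Sketch: build the URL of the microscope's raw MJPEG stream.
# Port 5000 and the path below are assumptions for recent
# OpenFlexure server versions -- verify against your server.
def mjpeg_stream_url(host: str, port: int = 5000) -> str:
    """Return the assumed URL of the raw MJPEG stream endpoint."""
    return f"http://{host}:{port}/api/v2/streams/mjpeg"

# You can open this URL directly in VLC, a browser, ffmpeg, or OpenCV:
url = mjpeg_stream_url("microscope.local")
```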
There are security issues with opening up the Openflexure server to http requests from outside the local network, but it feels as though it should be possible for a second local computer to request the web stream from the microscope and forward it where you want? I have no idea whether that is actually possible or sensible.
There are also, I think, systems to stream whatever is being sent to a display. These would need access to the display output rather than the camera input, and could again run on another computer on the local network.
The HTTP stream is an MJPEG stream - it’s not very efficient, but if you use something to transcode it, you ought to be able to stream it quite nicely. I’d have thought the easiest way would be to use something on a different computer (e.g. OBS Studio) to access the MJPEG stream (or a window showing it) as a “video source” and then stream it to wherever you want.
I think MQTT is generally intended for quite short messages; most examples of video/image transfer I've seen use MQTT to notify that an image is ready, then HTTP to actually download it. In principle it wouldn't be too hard to write Python code that does something with each incoming frame of the MJPEG stream - there is some code in the python client module (openflexure-microscope-client) that does that.
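To illustrate the frame-by-frame idea: each JPEG in an MJPEG feed starts with the 0xFFD8 marker and ends with 0xFFD9, so you can split the byte stream on those. This is a sketch, not the client module's actual code - the marker scan is just the standard quick-and-dirty way to carve frames out of an MJPEG feed:

```python
# Sketch: pull individual JPEG frames out of an MJPEG byte stream.
# This scans for the JPEG start/end-of-image markers rather than
# parsing the multipart boundaries -- crude but usually sufficient.
from typing import Iterator

SOI = b"\xff\xd8"  # JPEG start-of-image marker
EOI = b"\xff\xd9"  # JPEG end-of-image marker

def iter_jpeg_frames(chunks: Iterator[bytes]) -> Iterator[bytes]:
    """Yield complete JPEG frames from an iterable of byte chunks."""
    buf = b""
    for chunk in chunks:
        buf += chunk
        while True:
            start = buf.find(SOI)
            if start < 0:
                # keep a trailing 0xFF in case a marker is split across chunks
                buf = buf[-1:] if buf.endswith(b"\xff") else b""
                break
            end = buf.find(EOI, start + 2)
            if end < 0:
                buf = buf[start:]  # keep the partial frame for the next chunk
                break
            yield buf[start:end + 2]
            buf = buf[end + 2:]
```

You could feed it from the live stream with something like `resp = urllib.request.urlopen(url)` and `iter_jpeg_frames(iter(lambda: resp.read(4096), b""))`.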
We're trying to avoid using a second computer and instead stream directly to YouTube from the Pi; however, if the Pi is resource-limited in this regard, having a second Pi (even just a RPi Zero 2W) might work OK, and this would only apply to a few devices.
Perhaps a standalone terminal running a Python script to access the HTTP stream and pass it to YouTube would work (avoiding the need for a second computer). @kenzoat had something working that streamed a screenshare of the portion of the screen showing the live preview, so something similar that accesses the HTTP stream directly would likely work. The screen-share method was a quick workaround, but seems not to be feasible for long-term streams per the companion post (again, sorry for the duplicate! That was my bad).
We have MQTT working for "point and shoot" style commands with a moderate-quality compressed image. We keep this separate from the streaming, per your point about MQTT being meant for shorter messages. The primary reason for using MQTT for the "point and shoot" commands rather than HTTP is simply that we're comfortable with it and trust that our implementation is secure/encrypted/private, relatively robust, and already part of our ecosystem, whereas we're a lot less comfortable with HTTP. Maybe a good time for us to learn!
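For anyone curious what a single-image-over-MQTT payload can look like, here is a sketch of packaging one capture into one message. The topic name and JSON envelope are made up for illustration (not the schema described above), and the actual publish would go through something like paho-mqtt:

```python
# Sketch: package one "point and shoot" capture as a single MQTT
# payload. The topic and JSON envelope are illustrative only;
# publishing the (topic, payload) pair would use e.g. paho-mqtt.
import base64
import json
import time

def make_capture_message(jpeg_bytes: bytes, device_id: str) -> tuple:
    """Return a (topic, payload) pair for a single captured image."""
    topic = f"microscopes/{device_id}/capture"  # hypothetical topic layout
    envelope = {
        "device": device_id,
        "timestamp": time.time(),
        "encoding": "jpeg/base64",
        "image": base64.b64encode(jpeg_bytes).decode("ascii"),
    }
    return topic, json.dumps(envelope).encode("utf-8")
```

Base64 adds roughly a third to the payload size; sending the raw JPEG bytes as the payload and the metadata on a sibling topic would be leaner, at the cost of two messages.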
Is your requirement to operate the microscope outside the local network? And have you got a single microscope in each local network? I see then that a second local computer is not a great solution. I was imagining that you just wanted to stream your local microscope to other people.
The Pi Zero 2 should have enough to run the 'lite' version of the Openflexure server. I got one, but I have not got around to trying it out. Then a local Pi 4 (a Pi 5 would do better, and you can use the latest Pi OS as it is not running the microscope server) could do the re-encoding and other things? If the Zero 2 works, this might be a better distribution of your big-Pi, little-Pi suggestion.
I guess streaming to YouTube or similar needs an outgoing connection - that's not a security problem as it wouldn't let anyone control the Pi, but it's also not a function that's built into the Pi. I've never streamed to YouTube from Python, but I guess it would be a relatively simple operation to read frames of the microscope's MJPEG stream and then upload them. If you've got enough bandwidth to stream them without re-encoding, that's definitely lightest on the Pi's CPU.
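One way to avoid writing the upload logic yourself is to hand the MJPEG URL to ffmpeg and let it re-encode and push to YouTube's RTMP ingest. The sketch below builds such a command; the flags are a plausible starting point rather than a tested recipe, the URLs/keys are placeholders, and the silent `anullsrc` input is there because YouTube normally expects an audio track:

```python
# Sketch: an ffmpeg invocation that reads the microscope's MJPEG
# HTTP stream and pushes it to YouTube's RTMP ingest endpoint.
# Untested on a Pi -- treat the flags as a starting point.
def youtube_ffmpeg_cmd(mjpeg_url: str, stream_key: str) -> list:
    return [
        "ffmpeg",
        "-f", "mjpeg", "-i", mjpeg_url,            # video: the HTTP MJPEG feed
        "-f", "lavfi", "-i", "anullsrc",           # audio: silence (YouTube wants audio)
        "-c:v", "libx264", "-preset", "veryfast",  # re-encode to H.264
        "-pix_fmt", "yuv420p",
        "-c:a", "aac", "-shortest",
        "-f", "flv",
        f"rtmp://a.rtmp.youtube.com/live2/{stream_key}",
    ]

# Run with: subprocess.run(youtube_ffmpeg_cmd(url, key))
```

The H.264 re-encode is the CPU-heavy part; on a Pi it may be worth trying the hardware encoder (`h264_v4l2m2m`) in place of `libx264`.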
It is also possible that something like gstreamer could read the MJPEG stream and upload a properly compressed version. I’ve never done it on the Pi, but it’s just about possible it’s capable.
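For the GStreamer route, a pipeline along these lines (for `gst-launch-1.0`) ought to do the same job - the element names are standard GStreamer, but this exact chain is untested on a Pi and the URL/key are placeholders:

```python
# Sketch: a GStreamer pipeline string that pulls the MJPEG feed
# over HTTP, decodes it, re-encodes to H.264, and pushes to RTMP.
# Untested on a Pi; YouTube may also require an audio branch.
def youtube_gst_pipeline(mjpeg_url: str, stream_key: str) -> str:
    return (
        f"souphttpsrc location={mjpeg_url} ! "
        "multipartdemux ! jpegdec ! videoconvert ! "
        "x264enc tune=zerolatency ! flvmux streamable=true ! "
        f"rtmpsink location=rtmp://a.rtmp.youtube.com/live2/{stream_key}"
    )

# Run as: gst-launch-1.0 <pipeline>
```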
Do shout if you get it working - it would be lots of fun to see live streams!