Graphical interface for motor control

Hi everyone,

I just finished adapting the microscope to my optics (an Olympus 40x objective and a FLIR Blackfly USB 3.1 camera) and wrote a short Arduino sketch to move the stage, since I can’t connect to the microscope software because the camera is not supported. The stage moves and everything works fine, but of course I don’t have a lot of features like position memorization, or even precise x-y movement, since the two directions are somewhat coupled.

Is there an option to use the OpenFlexure software’s graphical interface to control the motors while acquiring images with another program? Of course, connecting the camera and having everything integrated would be the best option, but I saw other posts in the forum saying that isn’t possible yet.

Thanks in advance 🙂

The step position is remembered by the Sangaboard Arduino code, and you can talk to it directly over serial/USB. This is the same whether you are using the OpenFlexure web app or your own software to talk to the Arduino. From your post I think you are using your own Arduino code rather than the Sangaboard code.

On the microscope each motor gives a single axis of motion, which with the standard OpenFlexure optics modules is along the horizontal and vertical directions of the camera. However, it is not along the long and short axes of the slide in your picture: the two motorised motion axes run diagonally, at 45 degrees. If you want to make movements along the axes of the slide, you need to move x and y together for the short axis, and in opposition (+x and -y, or -x and +y) for the long axis. You can put that conversion into whatever program you are using to talk to your Arduino code (or the Sangaboard Arduino code if you use that); a sketch of it is below.
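For example, a minimal Python helper for that conversion could look like the following. This is pure geometry with an assumed 1:1 combination of steps (note that moving both motors by n steps gives a diagonal move a factor of √2 longer than a single-motor move of n steps, so you may want to rescale):

def slide_to_motor(short_steps, long_steps):
    # Motor x/y are at 45 degrees to the slide: the slide's short axis
    # is (+x, +y) and its long axis is (+x, -y)
    x = short_steps + long_steps
    y = short_steps - long_steps
    return x, y

# e.g. 100 steps along the short axis of the slide:
x, y = slide_to_motor(100, 0)  # gives x=100, y=100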

You have a very cool system there.

Thank you very much for your reply. Instead of improving my own code (the purpose of which was just to test the system’s ability to move and focus), I would like to be able to connect to OpenFlexure Connect, but unfortunately this is not possible with the camera I am using.
The purpose of my post was therefore to find out whether there is a graphical interface/programme that I can use to control the motors while capturing images via another application, or whether there is a possibility of connecting other types of cameras.

It is possible to use the OpenFlexure GUI without a camera (it just displays a black feed as a dummy camera, normally seen when the camera is accidentally disconnected). Obviously you won’t be able to use features such as click-to-move, but for basic positioning it will work fine. Unfortunately you’ll still have the issue of the camera being at 45° to the motion axes. We’re working on adding the ability to use different cameras with the GUI.

There is a Python module that will talk to the Sangaboard (or an Arduino running the Sangaboard firmware):

This can be installed with pip install sangaboard and should work either on the Raspberry Pi, or on your laptop, so it should be OK on whatever computer the board is connected to. I think it requires Python 3.

That module doesn’t provide a graphical interface, but it gives you the low-level move commands (i.e. you can read the position, and instruct it to make relative moves on one or all three axes). There’s no provision there for converting between Cartesian coordinates and motor coordinates for the Delta stage, though I see you’re using the regular microscope, so that shouldn’t be an issue for you.

When it comes to camera/stage integration, the code that handles camera-stage calibration in the microscope software is also installable as a stand-alone module (pip install camera-stage-mapping). There is a little documentation, and it’s not an out-of-the-box solution, but it should make it possible to map coordinates in your image to stage coordinates, even if the axes are not aligned.
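To illustrate the idea (this is just the underlying maths, not the camera-stage-mapping API): calibration boils down to finding a 2×2 matrix that maps a displacement in image pixels to a displacement in motor steps, which then works whatever the rotation between camera and stage.

import numpy as np

# Illustrative numbers only: for axes at 45 degrees the matrix looks
# something like s * [[1, 1], [1, -1]], where s is steps per pixel.
A = np.array([[7.1,  7.1],
              [7.1, -7.1]])

def image_to_stage(dx_pixels, dy_pixels):
    # Stage displacement (in steps) for a desired image displacement
    return A @ np.array([dx_pixels, dy_pixels])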

The branch of the software that modularises the camera is here:

At the moment, this makes it possible to provide an alternative subclass of BaseCamera that replaces the Raspberry Pi camera. The camera was the only thing that tied the system to the Raspberry Pi, so you should be able to run the server on a computer of your choice once you’ve implemented the camera class.
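As a very rough sketch of what that could look like for a FLIR camera, using PySpin (the Python bindings for FLIR’s Spinnaker SDK): note that BaseCamera below is a stand-in, and the real class in that branch will certainly require different (and more) methods, so treat this as illustrative only.

import PySpin

class BaseCamera:  # stand-in for the real class in the branch linked above
    pass

class BlackflyCamera(BaseCamera):
    def __init__(self):
        # Grab the first Spinnaker camera and start streaming
        self.system = PySpin.System.GetInstance()
        self.cam = self.system.GetCameras().GetByIndex(0)
        self.cam.Init()
        self.cam.BeginAcquisition()

    def capture_frame(self):  # hypothetical method name
        image = self.cam.GetNextImage()
        frame = image.GetNDArray()  # copy the frame out as a numpy array
        image.Release()
        return frame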

The big downside is that I have not yet documented (or indeed figured out) how many of the functions exposed by the camera class are actually required. What I can say for sure is that the fast autofocus method will only work if you have a reasonably quick MJPEG stream, which is also necessary for the image preview.

Do you know if there are Python bindings for your camera already? That would be a big step in the right direction!

Just to try to give a more concrete example, if you want to write a Python script to control the stage, the most annoying part is often setting up the Python environment in the first place. This varies depending on which operating system you use, and how your computer is configured.

The first step is ensuring you have a suitable version of Python. Anything from Python 3.7 to Python 3.10 should work OK for sangaboard I think. If you open a command prompt (PowerShell is what I use on Windows, or a bash shell on Linux) and type python --version hopefully it will give you a version number in that range. If it starts with a 2, try python3 --version instead - some systems still use python3 instead of python for version 3. If you don’t have Python 3 installed, there are many ways to get it. Most often, I just install Python and then worry about packages later. Others doing science are keen on using Anaconda to manage their Python environments, though I always find it a bit confusing.
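So the check is just:

python --version

and, if that reports a 2.x version:

python3 --version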

Once you have a working version of Python installed, you should make yourself a new folder to work in. Inside that folder, create a “virtual environment” to let you install all the Python modules you need without touching your system’s Python installation. This stops the work you’re doing with OpenFlexure from interfering with other applications or projects on your computer. You can do this by typing:

python -m venv .venv --prompt OFM

This will create a folder called .venv in the current folder, and put in a basic installation of Python.

You should now “activate” your new environment. On Windows, you type

.venv\Scripts\activate

On Linux you need to run

source .venv/bin/activate

After that, you should see (OFM) appear at the start of your command prompt. That means your virtual environment is active. You can now start installing packages, e.g.

pip install sangaboard

Now, you should be able to start Python and issue interactive commands:

python

That should put you into a Python command prompt:

import sangaboard
sb = sangaboard.SangaBoard("COM1")
sb.move_rel([1000,0,0])
sb.position
exit()

Note that "COM1" should be replaced with the name of the serial port you are using (on Linux it will be something like /dev/ttyUSB0). If you leave the argument out, i.e. SangaBoard(), it will try to autodetect the port, which may or may not work depending on how many ports you have and what’s plugged in.

The commands above will do a relative move of 1000 steps in X, print the position (at the interactive prompt, sb.position echoes its value without an explicit print), and then exit the Python interpreter.
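If you’d rather keep that in a file than type it interactively, the same thing as a stand-alone script (with the same placeholder port name) would be:

import sangaboard

sb = sangaboard.SangaBoard("COM1")  # replace COM1 with your serial port
sb.move_rel([1000, 0, 0])           # relative move of 1000 steps in X
print(sb.position)                  # a script needs an explicit print()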

I hope that explains a bit better how to get something basic working!