Building a platform for 'automated' cell localization

Hi
For my bachelor's final project, I’ve tried to create an “automated” platform for cell localization. The aim of this project is to reduce the time needed to localize Circulating Tumor Cells (CTCs) with the ONI nanoimager. Automating this process saves researchers a lot of time.

After rinsing a clinical blood sample to isolate CTCs, the cell density is very low. The ONI nanoimager is used to do research on the CTCs, but it has a very small field of view (50–80 micrometer), making the localization of CTCs very time-intensive.

I have built the OpenFlexure microscope as a platform to scan a sample slide in search of CTCs. This is more effective because the OpenFlexure has a larger FOV and is easily customizable.

The goal of this project is to localize CTCs obtained from clinical blood samples. Using the OpenFlexure microscope, a scan of a sample can be made, so that CTCs can be localized and their coordinates used as input for the ONI nanoimager.

All files mentioned can be found on GitHub; I have also uploaded pictures of interest there.

The custom CAD files are listed below. Some were made with help from a friend, others are self-made using Tinkercad:

ArducamOptics.stl: Custom camera objective made for the Arducam 8MP 1080P USB camera module (the outer green part has been removed from the Arducam).
Slide holder ibid.stl: This holder was made to fit a 75.5 × 25.5 mm Ibidi sample slide. It can be mounted on both the OF stage and the ONI stage, making sure the slide always has the same relative position.
slideholder3rdwellcenter.stl: This file is similar to the normal slide holder, but the slide has been shifted lengthwise so that the 3rd well of an Ibidi µ-Slide VI 0.4 sits exactly in the center of the OF microscope. This is useful because the OF microscope has a small range of motion; a full well cannot be scanned using the previous slide holder. (Note: this slide holder was not used in the research but was developed afterwards.)
Calibration slide houder.stl: This is a slide holder for a calibration slide, used to calibrate the OF microscope. The Ø = 0.15 mm dot sits at the exact center of the microscope.
Microscope-3.5mm minder.stl: This is an exact copy of the main body, except the stage is 3.5 mm lower, to compensate for the sample being raised when using the slide holder.

SOFTWARE SOLUTIONS

A Python script, “NMG_Microscope_Control”, was written to control the microscope, autofocus on the sample, perform a scan, calibrate the microscope to set the zero point, and take pictures. The main purpose of this code is to scan a slide so that a montage can be made. This montage is later analyzed with the second script, “NMG_Coordinate”, to determine the location of individual cells.
Montages were created using the Fiji Grid/Stitching plugin [11]. The tile overlap was set to 33%.
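Not part of the scripts above, but to sketch what such a scan entails: the stage visits tiles in a serpentine (snake) pattern, with the step size chosen so adjacent tiles overlap by the stitching overlap (33% here). The FOV value and grid size below are my own illustrative assumptions, and the generator stands in for the hardware-control calls in “NMG_Microscope_Control”.

```python
# Illustrative sketch of a serpentine scan grid (not the actual
# NMG_Microscope_Control code; FOV and grid size are example values).

def serpentine_grid(nx, ny, step):
    """Yield (x, y) stage targets row by row, reversing direction on
    every other row to minimise stage travel between tiles."""
    for row in range(ny):
        cols = range(nx) if row % 2 == 0 else range(nx - 1, -1, -1)
        for col in cols:
            yield (col * step, row * step)

fov_um = 600             # assumed field of view in micrometres
overlap = 0.33           # 33% tile overlap, as used for Fiji stitching
step_um = round(fov_um * (1 - overlap))  # distance between tile centres

targets = list(serpentine_grid(3, 2, step_um))
```

Reversing every other row halves the stage travel compared with always returning to the left edge, which also reduces backlash-related positioning error.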

FLASHING THE FIRMWARE & SOFTWARE
“NMG_Microscope_Control” was written in Python 3.11 using the UC2-REST package by Beniroquai (/github.com/openUC2/UC2-REST). The firmware for this package was flashed onto the ESP32 DEV-based UC2 board. Instructions on how to flash the firmware, and the pin configuration for the stepper motors, can be found on the OpenUC2 GitHub (/youseetoo.github.io).
Instead of using the ESP32 DEV-based UC2 board, a regular ESP32 WEMOS D1 R32-based UC2 board can also be used.

Let me know If you are interested in more information.

That is a nice project. Which Arducam are you using? It looks as though it might be the B0196 with the same Sony IMX219 as in the Pi Camera v2. If so the optics modules for that will also be available in the main repository soon.
What method are you using for autofocus? I am not familiar with what is available in the UC2 software and what you have had to write.

Thanks,
Yes, I’m using the B0196 camera with the IMX219 sensor.
This is the Arducam i’ve bought.

The CAD file was not perfect, but it worked for my project. I have now updated the CAD file so that the Arducam fits perfectly: “B0196 arducam USB camera Objective.stl”. You can post it in the main repository!
Note that the outer green board should be removed!

I’ve only used the UC2 software to communicate with the stepper motors.

I have written an autofocus script myself using the Laplacian. I’ve tried many different things, but a perfect autofocus remains hard. The best option seemed to be to first move the focus in quickly, pass the optimal focus point, and then move back out very slowly until the Laplacian reaches its peak.

In case you are interested, here is some more explanation from my report…

“”… This is performed by moving in a set number of steps on the Z axis until the image is out of focus, then moving back out until the maximum Laplacian is passed. The Laplacian is continuously calculated; it is an algorithm that … The function stops moving out once the Laplacian has fallen more than ‘Diff Max Laplacian’ below the maximum computed Laplacian. This value is dependent on the sample. The formula for the Laplacian can be found in Appendix A.2.

Cells that are not stained appear translucent under a bright-field microscope, and the color of a cell may vary from gray to white depending on the focus depth. The autofocus is not perfect and does not focus at exactly the same depth every time. As a result, the same cell can appear white in one picture but gray in the next. A mix of these colors makes the stitching impossible.
In areas where there are no cells, the autofocus function has no reference point to focus on. This causes the Z-motor to keep moving out, crashing the application. To overcome this limitation in the autofocus function, the Z position is adjusted manually for each picture. This makes the scan semi-automatic; the user cannot leave the microscope unattended during a scan.“”
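For readers who want to see the metric concretely, here is a minimal NumPy sketch of a variance-of-Laplacian focus measure together with my reading of the ‘Diff Max Laplacian’ stop criterion. This is not the NMG code; the 3×3 four-neighbour kernel and the synthetic test images are my own illustrative assumptions.

```python
import numpy as np

def laplacian_variance(img):
    """Variance of the discrete Laplacian: high for in-focus images,
    low for defocused ones.  Applies the 3x3 four-neighbour kernel
    [[0,1,0],[1,-4,1],[0,1,0]] via array slicing."""
    img = img.astype(np.float64)
    lap = (img[:-2, 1:-1] + img[2:, 1:-1]
           + img[1:-1, :-2] + img[1:-1, 2:]
           - 4.0 * img[1:-1, 1:-1])
    return lap.var()

def stop_index(metrics, diff_max):
    """One reading of the stop criterion: walk through successive
    focus metrics and stop once the value has fallen more than
    `diff_max` below the running maximum (the peak lies just before)."""
    best = float("-inf")
    for i, m in enumerate(metrics):
        best = max(best, m)
        if best - m > diff_max:
            return i
    return len(metrics) - 1

# A sharp checkerboard scores far higher than a flat (defocused) frame.
sharp = (np.indices((64, 64)).sum(axis=0) % 2) * 255.0
flat = np.full((64, 64), 128.0)
```

A featureless frame scores exactly zero, which is precisely the failure mode described above for empty regions: with no texture, the metric never peaks and the search has nothing to converge to.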

You have a number of difficult problems to solve there!
In the OpenFlexure software we find that the most robust focus metric is the size of the JPEG images in the camera stream. This also has the advantage of being calculated ‘for free’ by the GPU while making the camera preview stream on the Raspberry Pi. There is a paper, “Fast, high-precision autofocus on a motorised microscope: Automating blood sample imaging on the OpenFlexure Microscope” (Knapper et al., 2022, Journal of Microscopy), which sets out the approach to autofocus in the OpenFlexure software. (I am an author, but a minor one. The clever stuff on focussing came from the other authors :slight_smile:) I don’t know whether this is also possible on the ESP32 architecture. How big is the Laplacian kernel that you are using, and are you using the full 8MP images for your Laplacian calculations?

Mis-focusing on blank areas is also a big issue for tiling. @JohemianKnapsody has been working on an automated solution for recognising blank regions. He has a thread on this forum: Automated Slide Scanning and Tiling.
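One common heuristic for recognising blank tiles (not necessarily the method in that thread) is an intensity-variance threshold: a featureless background tile has almost no variation, so autofocus and cell detection can be skipped there. The threshold and tile values below are illustrative assumptions.

```python
import numpy as np

def is_blank_tile(tile, var_threshold=25.0):
    """Treat a tile as empty when its intensity variance is below a
    threshold (illustrative heuristic; in practice the threshold
    would be tuned per sample and illumination)."""
    return float(np.var(tile)) < var_threshold

background = np.full((32, 32), 200.0)           # featureless area
with_cell = background.copy()
with_cell[10:20, 10:20] = 60.0                  # darker object present
```

Skipping autofocus on tiles flagged as blank would also avoid the Z-motor runaway described earlier, since the search is simply never started where there is nothing to focus on.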

I had a look at your repository, particularly the images of the holders in use. The cross-registration between the two systems is very nicely done.