First build, basic low cost microscope v7.0.0-beta1

I am a pathologist and am interested in creating a DIY slide/section scanner that would be adequate for histology/histopathology education. Ideally it would be something a resident could build as a research project for less than ~1500 USD that would put out nice-quality, large-area scans. The OFM ecosystem seems like a great place to start. I wanted to try out the basic optics and experiment with the tiling software before starting to modify things or building some fancier optics.

Comments on building version 7.0 beta 1:

  • I was concerned about the printing process because this is my first attempt at FDM printing. I was gratified that within 6 hours of opening the box I was able to assemble a stock original Ender 3 and get pretty decent prints out of it. Printing all the parts took about a week without any major issues or notable failures. This basic printer is perfectly adequate for printing the microscope.

  • It's a bit hard to know what orientation the camera board should have relative to the pi_camera_platform part; because the screws are in a rectangular pattern, it can be mounted the wrong way round. Perhaps the side of the part could be marked with some text or an image to show which direction the camera connector should point.

  • I used the motor controller workaround, and found that even with an extra-long camera cable it was VERY hard to connect everything together in the recommended order. I found that connecting the cable to the Pi first, and then fishing the loose end of the cable up to the camera later after the electronics drawer was closed, was much easier than trying to wire things up with the drawer tethered half-shut by the delicate ribbon cable. I also cut a slot for the camera ribbon into the mounting board that sits on top of the Pi, similar to how the Sangaboard is laid out.

  • There wasn't an obvious place to feed in the external power cable destined for the motor controller boards. I ended up just making a hole and mounting a female micro-USB socket to accept external power.

Here's the first image (this is a bone marrow core, primary myelofibrosis), after centering and focusing the condenser and calibrating the camera:


Not bad! The shading correction is working well. The optics don't seem to have a particularly flat field, but I also tried doing a z-stack and extended-DOF stacking in Fiji and got this result with more of the edges in focus:

Overall I think the center 1/3 of the image is quite good; there seems to be enough resolution in the optics to match the magnification. The color response toward the edges is not great, which I understand is due to the camera sensor's lenslet array design and not something that would improve with the high-res optics and a discrete objective.

Compensating for this, we can get a nice-looking image, though admittedly I just eyeballed the correction:

I will try some larger stitches next.
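In case it's useful to anyone, the extended-DOF stacking above doesn't strictly need Fiji; a rough Python sketch of the same idea (pick, for every pixel, the z-slice with the strongest local Laplacian response) is below. The filenames are hypothetical and the stack is assumed to be already aligned.

```python
import cv2
import numpy as np

# Load an already-aligned z-stack (hypothetical filenames).
stack = np.stack([cv2.imread(f"z_{i:02d}.jpg") for i in range(5)])

# Sharpness of each slice at each pixel: magnitude of the Laplacian of the grey image.
sharpness = np.stack([
    np.abs(cv2.Laplacian(cv2.cvtColor(img, cv2.COLOR_BGR2GRAY), cv2.CV_64F))
    for img in stack
])

# For every pixel, take the colour value from the slice that is sharpest there.
best = np.argmax(sharpness, axis=0)          # (H, W) index of the sharpest slice
rows, cols = np.indices(best.shape)
fused = stack[best, rows, cols]              # (H, W, 3)
cv2.imwrite("edof.jpg", fused)
```

A real implementation would smooth the sharpness maps before picking slices, to avoid pixel-level noise in the result.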


@Yashka I am glad you seem to have found the build relatively straightforward. Your images look very good. Did you take note of the bugs and bug fixes in the Openflexure Microscope V7.0.0-beta1 released thread? The lens spacer for the low cost optics of the beta-1 release has a mistake in the lens holder which can mean that the lens sits at an angle. From the quality of your images, the lens must be straight even if you did use the lens spacer that has the bug!

I shall make a note of your comments in our issues list. The unclear orientation of the Pi camera on the platform is not something that we had spotted before; there are currently not even words in the instructions to describe the orientation. The latest iteration of the camera platform has a small dimple on the top which marks the optical axis and sits under the camera module (merge request !309, with the changes visible via the view app button on that page, at least until it is merged into the main version). That should help with adding some description of the orientation to the text.

The power and camera cable issues are known, but have not actually been listed yet!

It would be really interesting to hear how you get on with applying this for histopath education.

Oh! That could definitely be an issue. I might try swapping that out regardless, as I do see a slight tilting of the field that I don't think is the stage. I wonder how hard it is to remove the Pi lens from the spacer without damaging it.

The Pi lens should come out reasonably easily using something pointy around the side. If you are printing a new one, could you please try the spacer and platform from merge request !309 linked above? The view app button gives you the instructions containing the new parts. If the instructions for mounting the camera were:

  • Take the camera platform and note the position of the dimple which marks the axis of the microscope.
  • Take the Pi Camera and place it on top of the pi camera platform, with the camera sensor above the dimple.

Would that be simple and clear?
Edit: The beta-1 camera platform does not have the dimple; I added it to help check the changes I was making.

Yes, those instructions seem very clear - I don't remember if there was a dimple on my beta1 platform.

I attempted to poke my lens into a more correct position and only succeeded in making things much worse, so I will definitely be printing a new spacer.

Edit: well, I clearly spoke too soon regarding how easy it was to get my printer running. The mainboard seems to have fried itself beyond rescue. There will be some delay in making any new parts.


I'm back up and running, and the new lens spacer is printed. The flatness of field and overall sharpness are improved. Here's a quick area stitch and a 1:1 subset I took today and stitched with @JohemianKnapsody's openflexure-stitching module.

1:1

I am having some trouble with my Z axis; small movements sort of just jiggle things and sometimes move in the wrong direction. I guess this is backlash-related but I'm not sure how the compensation settings work.


That is very impressive, and particularly nice that you have been able to get the new stitching working.

For your z-axis problem it sounds as though the mechanism might be broken. Backlash would mean that the stage would not move initially when you change direction (for about 80 steps), but then it should move smoothly until you change direction again. It should never move in the wrong direction and it should not jiggle. This thread Microscope v7 stage is jumping / skipping / rewinding was an extreme case with similar symptoms. I suppose if you have a high resolution system and you are doing really small movements then the motor might shake the system enough to jiggle sideways when it moves, and within the backlash range you might see small motion in the wrong direction rather than being stationary. How small are the 'small' steps, and is it smooth for motion in one direction or for large moves?
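If it does turn out to be backlash, the usual software trick is to always make the final approach from the same direction. A generic sketch of that idea is below; the move_z() call and the 80-step figure are placeholders for whatever your controller exposes, not the OpenFlexure server API.

```python
BACKLASH_STEPS = 80  # rough figure from the discussion above; tune for your build

def move_z_compensated(stage, steps):
    """Relative z move that always finishes travelling upwards.

    `stage.move_z()` is a placeholder for a relative-move call on your motor
    controller; it is not the OpenFlexure server API.
    """
    if steps < 0:
        # Overshoot downwards past the target, then come back up through the
        # backlash so the final approach direction is always the same.
        stage.move_z(steps - BACKLASH_STEPS)
        stage.move_z(BACKLASH_STEPS)
    else:
        stage.move_z(steps)
```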


Wow, I would say that looks great, but you're better qualified to say that than I am! Great to hear that you used our stitching program as well; I'm curious how you found using it?

Nothing to add to William's suggestions about z-motion and backlash; I think he's covered it. If this is an automated scan with a small step size in z, I am looking to add something that will make z movements a bit more reliable.

Really great images indeed. It will be nice to see images of your OFM build.

I haven't done any more imaging, but here's a photo of the setup:

I also made a prototype color calibration matrix by imaging my smartphone's R, G, and B pixels:

I identified what each pixel was supposed to be, got a list of pixel values for each channel in those regions and their coordinates, and then extrapolated a smooth image of each color response.
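Roughly, the interpolation step looks like the sketch below. The coordinates and values here are random stand-ins for the detected sub-pixel positions and measured camera responses; the grid size is arbitrary.

```python
import numpy as np
from scipy.interpolate import griddata

h, w = 616, 820                 # a reduced-resolution grid keeps the interpolation fast
rng = np.random.default_rng(0)

# Stand-ins for the measured data: (y, x) positions of identified red sub-pixels
# and the camera RGB value recorded at each of them.
coords = rng.uniform([0, 0], [h, w], size=(500, 2))
values = rng.uniform(0.0, 1.0, size=(500, 3))

# Interpolate each camera channel onto the full grid to get a smooth map of how
# "pure red" on the screen appears across the field of view.
grid_y, grid_x = np.mgrid[0:h, 0:w]
red_response = np.stack([
    griddata(coords, values[:, c], (grid_y, grid_x), method="linear")
    for c in range(3)
], axis=-1)                      # (h, w, 3); NaN outside the convex hull of the samples
```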

I then applied the correction to a test image:
Before:

After:
(Removed image because the results were deceptive; this did not work.)

full notebook
This code is very inefficient and takes minutes to process one image, but I'm sure there are ways to speed it up.
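One likely speed-up, if the slow part is a Python loop over pixels: when the correction at each pixel is a 3x3 matrix (for example built from the smoothed response maps), the whole image can be corrected in a single einsum call. A sketch with stand-in data:

```python
import numpy as np

h, w = 616, 820
image = np.random.rand(h, w, 3)               # stand-in for a camera frame, float RGB
matrices = np.tile(np.eye(3), (h, w, 1, 1))   # stand-in per-pixel 3x3 correction matrices

# Apply every per-pixel matrix at once; no Python loop over pixels.
corrected = np.einsum("hwij,hwj->hwi", matrices, image)
corrected = np.clip(corrected, 0.0, 1.0)
```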

Edit:
I actually made a mistake with my math, and the nice-looking result above was a fluke. While I can get fairly nice looking images of the color responses and can use that crosstalk data to unmix the original image of the smartphone pixels, I can't yet use that to correct other images. I'll keep working on it.


The stitching works quite well! I have some experience with ASHLAR, a multi-cycle image registration program for cyclic immunostaining, and compared to that this stitching seems much less likely to fail (give garbage results), but it is slower.

The output quality is very good. I understand that not using tile blending was a design choice, but I do think it's something most users would want.


Yes, I don't think we'll ever be as powerful or versatile as ASHLAR, and stitches can definitely go wrong, but it should become fairly seamless soon - realtime stitching in our OpenFlexure Connect software is the aim, with users only changing settings if they need to.

Tile blending is definitely something we'll look into; at the moment we're focused on making the existing features reliable before testing it out in a pathology teaching hospital.
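For anyone wondering what blending would involve, here is a toy sketch of linear feathering across a seam. This is just the general idea, not the openflexure-stitching implementation.

```python
import numpy as np

def feather_blend(left, right, overlap):
    """Blend two horizontally adjacent tiles with a linear ramp over `overlap` pixels.

    `left` and `right` are float images of shape (H, W, 3) with equal heights,
    already registered so their last/first `overlap` columns show the same scene.
    """
    ramp = np.linspace(1.0, 0.0, overlap)[None, :, None]      # weight for the left tile, 1 -> 0
    seam = left[:, -overlap:] * ramp + right[:, :overlap] * (1.0 - ramp)
    return np.concatenate([left[:, :-overlap], seam, right[:, overlap:]], axis=1)
```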

WOW. Thanks so much for taking the time to show these results. I have always wanted to practice writing Python scripts but never seem to find the time for it. I am very impressed by the quality you get using the low-cost objective.
I recently purchased a very cheap 60x on Amazon and the details are outstanding.
lung


Looks great! I'm going to set up my high-res version with a 40x soon.

I have updated the color correction code; I think it works properly now. The smartphone screen image needs to be taken using the exact settings and optical configuration used to acquire the images you want to correct.

notebook
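Stripped of the notebook details, the core idea is to measure how each screen primary leaks into each camera channel and then invert that mixing. A sketch with made-up numbers (the real matrix comes from the phone-screen measurements):

```python
import numpy as np

# Made-up crosstalk matrix: column j is the mean camera RGB recorded while the
# phone showed pure primary j (R, G, B). The real values come from the measurements.
M = np.array([
    [0.80, 0.15, 0.05],
    [0.10, 0.75, 0.20],
    [0.05, 0.10, 0.85],
])

unmix = np.linalg.inv(M)          # observed = M @ true  =>  true = inv(M) @ observed

image = np.random.rand(480, 640, 3)                       # stand-in frame, float RGB
corrected = image.reshape(-1, 3) @ unmix.T                # apply inv(M) to every pixel
corrected = np.clip(corrected, 0.0, 1.0).reshape(image.shape)
```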


@Yashka, how can I acquire an image with RGB pixels? Currently, I'm only getting a blacked-out image even when following the instructions:

Answering here, noting the similar question you posted on your other thread, Focusing and Chromatic Aberration Issues.

I assume that the microscope still works in brightfield mode with a slide?

Is it as simple as brightness? The phone screen will not be as bright as the focused LED from the condenser. In the settings tab there is an "Auto gain and shutter speed" button which would adjust the camera to the dimmer image, if that is the problem.
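If you are experimenting outside the OpenFlexure server (the server must not be using the camera at the same time), the same adjustment can be made directly with the standard picamera library. A rough sketch, with arbitrary settle time and filename:

```python
import time
from picamera import PiCamera

# Let the camera's auto-exposure settle on the dim phone screen, then lock it,
# following the standard picamera "consistent capture" recipe.
with PiCamera(resolution=(3280, 2464)) as camera:
    camera.start_preview()
    time.sleep(5)                                  # give auto gain/exposure time to adapt
    camera.shutter_speed = camera.exposure_speed   # freeze the metered shutter speed
    camera.exposure_mode = "off"                   # stop further exposure adjustments
    gains = camera.awb_gains
    camera.awb_mode = "off"                        # also lock the white balance
    camera.awb_gains = gains
    camera.capture("phone_screen.jpg")
    camera.stop_preview()
```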