I am a pathologist interested in creating a DIY slide/section scanner that would be adequate for histology/histopath education purposes. Ideally it would be something a resident could build as a research project for less than ~1500 USD that would produce good-quality, large-area scans. The OFM ecosystem seems like a great place to start. I wanted to try out the basic optics and experiment with the tiling software before starting to modify things or building some fancier optics.
Comments on building version 7.0 beta 1:
I was concerned about the printing process because this is my first attempt at FDM printing. I was gratified that within 6 hours of opening the box I was able to assemble and get pretty decent prints out of a stock original Ender3. Printing all the parts took about a week without any major issues or notable failures. This basic printer is perfectly adequate to print the microscope.
It’s a bit hard to know what orientation the camera board should have relative to the pi_camera_platform part: since the screws are in a rectangular pattern, the board can be mounted in the wrong orientation. Perhaps the side of the part could be marked with some text or an image to show which way the camera connector should point.
I used the motor controller workaround, and found that even with an extra-long camera cable it was VERY hard to connect everything together in the recommended order. I found that connecting the cable to the Pi first, and then fishing the loose end of the cable up to the camera later after the electronics drawer was closed, was much easier than trying to wire things up with the drawer tethered half-shut by the delicate ribbon cable. I also cut a slot for the camera ribbon into the mounting board that sits on top of the Pi, similar to how the Sangaboard is laid out.
There wasn’t an obvious place to feed in the external power cable destined for the motor controller boards. I ended up just making a hole and mounting a female micro-USB socket to accept external power.
Here’s the first image (this is a bone marrow core, primary myelofibrosis), after centering and focusing the condenser and calibrating the camera:
Not bad! The shading correction is working well. The optics don’t seem to have a particularly flat field, but I also tried doing a z-stack and extended-DOF stacking in Fiji and got this result with more of the edges in focus:
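For anyone curious what the extended-DOF step is doing, here is a minimal dependency-free sketch of the general idea, not Fiji’s actual algorithm: for each pixel, keep the value from whichever z-slice has the highest local 3×3 variance, a crude sharpness measure. Images are plain nested lists; the function names are my own.

```python
def local_contrast(img, x, y):
    """Variance of the 3x3 neighbourhood around (y, x), clipped at the edges."""
    h, w = len(img), len(img[0])
    vals = [img[j][i]
            for j in range(max(0, y - 1), min(h, y + 2))
            for i in range(max(0, x - 1), min(w, x + 2))]
    mean = sum(vals) / len(vals)
    return sum((v - mean) ** 2 for v in vals) / len(vals)

def focus_stack(slices):
    """For each pixel, take the value from the z-slice with the highest
    local contrast, i.e. the slice that is (locally) in best focus."""
    h, w = len(slices[0]), len(slices[0][0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            best = max(slices, key=lambda s: local_contrast(s, x, y))
            out[y][x] = best[y][x]
    return out
```

Real extended-DOF implementations use better sharpness measures (wavelets, Laplacian pyramids) and blend across slices rather than hard-selecting, but the per-pixel “pick the sharpest slice” idea is the same.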
Overall I think the center 1/3 of the image is quite good; there seems to be enough resolution in the optics to match the magnification. The color response toward the edges is not great, which I understand is due to the camera sensor’s lenslet array design and is not something that would improve with the high-res optics and a discrete objective.
Compensating for this, we can get a nice-looking image, though admittedly I just eyeballed the correction:
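A more systematic alternative to eyeballing the correction is classic flat-field correction: image a blank region of the slide with the same settings, then divide each channel of the sample by it. A minimal sketch (the function name and plain-list single-channel image format are my own, not part of the OFM software):

```python
def flat_field_correct(sample, flat):
    """Divide one colour channel of a sample image by a flat (blank-slide)
    reference, rescaling by the flat's mean so overall brightness is kept.
    Both images are 2D lists of numbers with identical dimensions."""
    vals = [v for row in flat for v in row]
    mean_flat = sum(vals) / len(vals)
    return [[s * mean_flat / f for s, f in zip(srow, frow)]
            for srow, frow in zip(sample, flat)]
```

Applied per channel, this removes vignetting and smooth colour shading in one step, at the cost of needing a clean blank-field reference image.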
I will try some larger stitches next.
@Yashka I am glad you seem to have found the build relatively straightforward. Your images look very good. Did you take note of the bugs and bug fixes in the "OpenFlexure Microscope V7.0.0-beta1 released" thread? The lens spacer for the low-cost optics of the beta-1 release has a mistake in the lens holder, which can mean that the lens sits at an angle. From the quality of your images, the lens must be straight even if you did use the lens spacer that has the bug!
I shall make a note of your comments in our issues list. The unclear orientation of the Pi camera on the platform is not something that we had spotted before; there are currently not even words in the instructions describing the orientation. The latest iteration of the camera platform has a small dimple on the top of the platform which marks the optical axis and sits under the camera module (merge request !309, with the changes visible via the "view app" button on that page, at least until it is merged into the main version). That should help when adding a description of the orientation to the text.
The power and camera cable issues are known, but have not actually been listed yet!
It would be really interesting to hear how you get on with applying this for histopath education.
Oh! That could definitely be an issue. I might try swapping that out regardless, as I do see a slight tilting of the field that I don’t think is the stage. I wonder how hard it is to remove the pi lens from the spacer without damaging it.
The Pi lens should come out reasonably easily using something pointy around the side. If you are printing a new one, could you please try the spacer and platform from the merge request !309 linked above. The "view app" button gives you the instructions containing the new parts. If the instructions for mounting the camera were:
- Take the camera platform and note the position of the dimple which marks the axis of the microscope.
- Take the Pi Camera and place it on top of the pi camera platform, with the camera sensor above the dimple.
Would that be simple and clear?
Edit: The beta-1 camera platform does not have the dimple, I added it to help to check the changes I was making.
Yes, those instructions seem very clear - I don’t remember if there was a dimple on my beta1 platform.
I attempted to poke my lens into a more correct position and only succeeded in making things much worse, so I will definitely be printing a new spacer.
Edit: well, I clearly spoke too soon about how easy it was to get my printer running. The mainboard seems to have fried itself beyond rescue, so there will be some delay in making any new parts.
I’m back up and running, and the new lens spacer is printed. The flatness of field and overall sharpness are improved. Here’s a quick area stitch and a 1:1 subset I took today, stitched with @JohemianKnapsody’s openflexure-stitching module.
I am having some trouble with my Z axis; small movements sort of just jiggle things, and sometimes the stage moves in the wrong direction. I guess this is backlash-related, but I’m not sure how the compensation settings work.
That is very impressive, and particularly nice that you have been able to get the new stitching working.
For your z-axis problem it sounds as though the mechanism might be broken. Backlash would mean that the stage does not move initially when you change direction (for about 80 steps), but then it should move smoothly until you change direction again. It should never move in the wrong direction and it should not jiggle. This thread, "Microscope v7 stage is jumping / skipping / rewinding", was an extreme case with similar symptoms. I suppose if you have a high-resolution system and you are doing really small movements, the motor might shake the system enough to jiggle sideways when it moves, and within the backlash range you might see small motion in the wrong direction rather than the stage being stationary. How small are the ‘small’ steps, and is the motion smooth in one direction or for large moves?
Wow, I would say that looks great, but you’re better qualified to say that than I am! Great to hear that you used our stitching program as well, I’m curious how you found using it?
Nothing to add to William’s suggestions about z-motion and backlash; I think he’s covered it. If this is an automated scan with a small step size in z, I am looking to add something that will make z movements a bit more reliable.
Really great images indeed. It will be nice to see images of your OFM build.
I haven’t done any more imaging, but here’s a photo of the setup:
I also made a prototype color calibration matrix by imaging my smartphone’s R, G, and B pixels:
I identified what each pixel was supposed to be, got a list of pixel values for each channel in those regions and their coordinates, and then extrapolated a smooth image of each color response.
I then applied the correction to a test image:
(Removed the image because the results were deceptive; this did not work.)
This code is very inefficient and takes minutes to process one image, but I’m sure there are ways to speed it up.
I actually made a mistake with my math, and the nice-looking result above was a fluke. While I can get fairly nice looking images of the color responses and can use that crosstalk data to unmix the original image of the smartphone pixels, I can’t yet use that to correct other images. I’ll keep working on it.
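For reference, one common way to formalize this kind of unmixing (a sketch of the general technique, not Yashka’s actual code, which also handles spatial variation) is to treat the measured responses to pure R, G, and B stimuli as the columns of a 3×3 crosstalk matrix and apply its inverse to each measured pixel:

```python
def invert3(m):
    """Inverse of a 3x3 matrix via the adjugate formula."""
    a, b, c = m[0]; d, e, f = m[1]; g, h, i = m[2]
    det = a * (e*i - f*h) - b * (d*i - f*g) + c * (d*h - e*g)
    adj = [
        [e*i - f*h, c*h - b*i, b*f - c*e],
        [f*g - d*i, a*i - c*g, c*d - a*f],
        [d*h - e*g, b*g - a*h, a*e - b*d],
    ]
    return [[v / det for v in row] for row in adj]

def unmix(pixel, crosstalk):
    """Recover the true colour from a measured RGB pixel. The crosstalk
    matrix's columns are the sensor's measured responses to pure red,
    green and blue stimuli (e.g. from imaging a smartphone screen)."""
    inv = invert3(crosstalk)
    return [sum(inv[r][k] * pixel[k] for k in range(3)) for r in range(3)]
```

The catch, as noted above, is that the correct crosstalk matrix varies across the field of view with this sensor, so a single global matrix only fixes the centre.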
The stitching works quite well! I have some experience with ASHLAR, a multi-cycle image registration program for cyclic immunostaining, and compared to that experience this stitching seems much less likely to fail (give garbage results), but it is slower.
The output quality is very good. I understand that not doing tile blending was a design choice, but I do think it’s something most users would want.
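For anyone wondering what tile blending involves, the simplest form is linear feathering across the overlap region. A dependency-free 1D sketch (this is illustrative only; the function and data layout are my own, not part of openflexure-stitching):

```python
def feather_blend(left, right, overlap):
    """Blend two 1D tiles whose last/first `overlap` samples cover the
    same region, with the weight of the right tile ramping linearly
    from 0 to 1 across the overlap to hide the seam."""
    body_l = left[:-overlap]
    body_r = right[overlap:]
    seam = []
    for i in range(overlap):
        w = (i + 1) / (overlap + 1)  # weight of the right tile
        seam.append((1 - w) * left[len(left) - overlap + i] + w * right[i])
    return body_l + seam + body_r
```

The 2D version is the same idea with a weight ramp in each direction; fancier schemes (multi-band blending) blend low frequencies over wide regions and high frequencies over narrow ones to avoid ghosting.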
Yes, I don’t think we’ll ever be as powerful or versatile as ASHLAR, and stitches can definitely go wrong, but it should become fairly seamless soon. Real-time stitching in our OpenFlexure Connect software is the aim, with users only changing settings if they need to.
Tile blending is definitely something we’ll look into; at the moment we’re focused on making the existing features reliable before testing it out in a pathology teaching hospital.
WOW. Thanks so much for taking the time to show these results. I have always wanted to practice writing Python scripts but never seem to find the time. I am very impressed by the quality you get using the low-cost objective.
I recently purchased a very cheap 60x objective on Amazon and the details are outstanding. (Image: lung.)
Looks great! I’m going to set up my high res version with a 40x soon
I have updated the color correction code; I think it works properly now. The smartphone-screen image needs to be taken with the exact settings and optical configuration used to acquire the images being corrected.