Focusing and Chromatic Aberration Issues

Hello everyone, I have built an OFM version 7 with an Olympus PLN 100X oil immersion objective. The attached image is of a cellular pathology sample. However, I am struggling to get both the central area and the edges in focus at the same time. I have also noticed that the colours are noticeably less vivid at the edges of the image than in the centre.

I wonder if some specific components might be the cause of the issue. Your assistance and guidance in this matter would be immensely appreciated.


The loss of image colour saturation away from the centre is more than I would expect. The background is a nice flat white, so the colour correction is working, but is it a strong correction on quite non-uniform illumination? The microscope software allows you to download the lens shading table; if you post it here it might tell us something.

The sharpness drop-off is not going to be the objective with that high quality Olympus! You seem to be experienced, but obvious things can still be missed. So at the risk of stating the obvious:
Did you print the infinity version of the optics module? From what I see online the PLN 100 is infinity corrected.
I assume you have a good quality achromatic tube lens.
Is the tube lens the right way up? The flatter side should face the camera.
Are the camera, tube lens and objective all sitting properly perpendicular to the optical axis? A tilt can give all sorts of aberrations. Anything that might have got under the tube lens would be hard to see.

Aside from those basic things, is the defocus a ring, or a sample tilt? In the one image you posted it looks focused at the centre and right but out of focus at the top left. As you move the microscope in z, does the focus move from one side to the other? Or does it focus at two sides at the same position and the centre at a different position? A small tilt in the microscope stage is not uncommon, and 100x is very unforgiving. If it looks like a tilt, a piece of paper under one side of the slide can confirm it.


Hopefully @WilliamW’s comments will help solve the focus issue. The drop off in colour saturation does look stronger than normal. The colour saturation issues come from the lenslet array on the pi camera sensor, as we are changing the angle the light comes in by moving the lens. More detail can be found in our paper on this.

1 Like

Thank you @WilliamW @j.stirling for all the advice you’ve provided.
I’ve finally identified my mistake: I used the finite version of the optics module. As a result, I’ll need to reprint both the optics module and the tall stand. Are any other changes needed to the printed parts?
Additionally, I’d like to know how to ensure that the lens and objective are aligned correctly perpendicular to the optical axis. Any tips or suggestions you can offer on this matter would be highly valued.

The lens screws into an RMS thread in the optics module so it is important not to cross thread it (as the plastic is soft) but then it should sit perpendicular. The lens gripper has an internal seat for the lens so it also should bottom out in a perpendicular position. The seat diameter is about 10mm, and with a print resolution of 0.2mm I would expect it to sit perpendicular to within 1 degree as long as the lens is bottomed out on the seat.
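As a rough sanity check on that 1 degree figure (a back-of-envelope estimate, not an official tolerance from the build docs), the worst-case tilt from a single 0.2mm layer step across a 10mm seat works out like this:

```python
import math

# Worst-case tilt if one edge of the 10 mm lens seat ends up one
# 0.2 mm print layer higher than the other (illustrative estimate only).
seat_diameter_mm = 10.0
layer_height_mm = 0.2

tilt_deg = math.degrees(math.atan(layer_height_mm / seat_diameter_mm))
print(f"worst-case seat tilt: {tilt_deg:.2f} degrees")  # ~1.15 degrees
```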

The thing that causes tilt most often is the optics attachment point of the main body becoming detached from the print bed during printing. This can result in a bent channel for the optics module to sit in. This is something that requires extra custom supports.

1 Like

As @j.stirling says, the optics being perpendicular in the optics module is about making sure that they are properly against their seat. The tube lens can be pushed in too little so that it is not on its seat, or there can be stray filament that gets in the way on the seat. A cross-threaded objective will not sit with the objective shoulder fully against the seat at the top of the optics module.

If you have a lens marked infinity and used the optics module for 160mm back focal length that is likely to be the main cause.

I am still unable to resolve the color saturation issue, even after changing to the infinity version of the optics module (optics_picamera_2_rms_infinity_f50d13.stl). I am using an Olympus PLN 100X oil immersion objective with a parfocal length of 45mm and a design tube lens focal length of 180mm. Could the issue be related to the optics module length, which changes the angle of the light? Would a customized module be necessary? Thank you for your assistance.


The top left corner appears to be particularly out of focus. @WilliamW mentioned above that you could try the method discussed here: Comparison of a pi-camera and a 20x optical lens - #9 by r.w.bowman

In regards to color saturation, I am curious whether substantially better results can be achieved with a standard setup, especially at 100x. I have never done any better than you with color saturation. Your images are overall the highest-resolution images I’ve seen yet on the OFM.

These are the most evenly illuminated images I’ve seen (taken at lower magnification):

Not trying to be overly negative; if you achieve much more even color saturation I would love to see how you do it!

1 Like

There is the issue of the camera lenslet array that @j.stirling highlighted.

The paper gives a thorough description of the cause and the effect. The camera calibration in the Openflexure software corresponds to fig 9 (b,f), corrected for white balance and vignetting but not colour un-mixing. To get the unmixing shown in fig 9 (c,d) or (g,h) requires extra information from test illumination or test targets of known colour.
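For anyone curious what "colour un-mixing" means computationally, here is a sketch (my own illustration, not the OpenFlexure pipeline) of applying a spatially varying 3x3 unmixing matrix to an RGB image. In a real correction the matrix field would come from the calibration described in the paper; here it is a placeholder identity field:

```python
import numpy as np

# Apply a per-pixel 3x3 colour unmixing matrix to an RGB image.
# M would normally come from calibration (fig 9 (c,d)/(g,h) in the
# paper); here it is a placeholder identity field, so the "corrected"
# image equals the input.
H, W = 480, 640
rng = np.random.default_rng(0)
image = rng.random((H, W, 3))             # raw RGB image, values 0..1
M = np.tile(np.eye(3), (H, W, 1, 1))      # one 3x3 matrix per pixel

# corrected(x, y) = M(x, y) @ raw(x, y), for every pixel at once
corrected = np.einsum("hwij,hwj->hwi", M, image)
```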

As the root cause is in the construction of the Sony IMX219 sensor on the Raspberry Pi camera 2, eliminating the issue would be hard. It would help if the OpenFlexure software could work with other cameras, which is a current work in progress. Then it might be possible to find a suitable camera with no lenslet array, or with less lenslet offset. This is less common on small sensors, and using all of a bigger sensor means a much longer optics module.


Most major microscope manufacturers have transitioned to infinity-corrected optical systems. The documentation for the finite version of the OpenFlexure Microscope imaging optics can be accessed here. Is there also an infinity version available? I’m having trouble understanding how to calculate the parameters in the finite document, such as the lens-sensor distance and magnification, in the context of an infinity setup. Your guidance in clarifying these concepts would be greatly valued.

The optics parts for infinity corrected objectives are in the Customisations and alternatives section of the instructions. Either download all the STLs and find the equivalent part with infinity in the name, or use the configurator further down the page. For bright field you would need optics_picamera_2_rms_infinity_f50d13.stl, I think.

The infinity corrected optics modules are longer than those for a 160mm tube length, so you will need the taller stand, which is also on the Customisations page.

Glad to see someone reading the optics calculation page. The calculation for the infinite conjugate is much simpler. The distance between the tube lens and the objective is not critical, as the light between them is focused at infinity. The distance between the back face of the tube lens and the face of the camera should be the back focal length of the lens. You can check how well this is focused by taking the optics module out of the microscope, removing the objective, and taking an image of something far away (perhaps aim it out of a window at a tree). Is it in focus? If not, you may need to shim the camera. If it is out of focus on one side only, the sensor itself is tilted.
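To put numbers on the infinite-conjugate calculation (using figures from this thread: the 50mm comes from the f50d13 part name, the 180mm is the Olympus design tube length mentioned earlier):

```python
# Effective magnification of an infinity-corrected objective depends on
# the tube lens actually used, not the nominal marking on the barrel.
# Figures from this thread: Olympus design tube length 180 mm,
# OpenFlexure f50d13 tube lens = 50 mm, PLN objective marked 100x.
nominal_mag = 100
design_tube_focal_mm = 180.0   # Olympus infinity standard
used_tube_focal_mm = 50.0      # f50d13 optics module tube lens

objective_focal_mm = design_tube_focal_mm / nominal_mag   # 1.8 mm
effective_mag = used_tube_focal_mm / objective_focal_mm
print(f"magnification onto the sensor: {effective_mag:.1f}x")  # ~27.8x
```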

1 Like

Thank you very much, @WilliamW and @j.stirling, for all your assistance. @j.stirling, when I removed the objective, the image was out of focus. To correct this, I need to move the camera about 1.5mm further from the optics module. Could you guide me on how to modify the printed optics module to increase its height by 1.5mm?

The easiest thing to do is to shim the camera rather than reprint the optics. We really need to document this better; I have made an issue on GitLab.

I realise we don’t have any shims in the repository, so I will start a merge request to add them.

For now you can try printing these (they should be very quick to print). By stacking them behind the camera you can make up the correct height.

picamera_2_shim-0.2.stl (10.2 KB)
picamera_2_shim-0.3.stl (10.2 KB)
picamera_2_shim-0.5.stl (10.2 KB)
picamera_2_shim-1.0.stl (10.2 KB)
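If it helps anyone choose which of the four thicknesses above to stack, a tiny exhaustive search finds the closest combination to a target gap (e.g. the ~1.5mm mentioned above):

```python
from itertools import combinations_with_replacement

# Shim thicknesses from the STLs above, in mm.
shims = [0.2, 0.3, 0.5, 1.0]
target_mm = 1.5

best_err, best_stack = None, None
for n in range(1, 5):  # stacks of up to 4 shims
    for stack in combinations_with_replacement(shims, n):
        err = abs(sum(stack) - target_mm)
        if best_err is None or err < best_err:
            best_err, best_stack = err, stack

print(f"closest stack: {best_stack} -> {sum(best_stack):.1f} mm")
# e.g. (0.5, 1.0) -> 1.5 mm
```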


Before a full solution, the quick way to test whether this will help your images is to use a shim plate between the camera and the optics module and retest the microscope. This might be what you have already done; if not, there is a thin shim in a different project at DesignFiles/STL/picamera_shim.stl · re-parameterise-the-outer-shape · Bath Open Instrumentation Group / Autocollimator · GitLab. It is 0.2mm thick, but your slicer should be able to stretch it to 1.5mm, or you can print several.

1 Like

The OFM is truly an excellent product, but I find this forum even more valuable in driving its advancement and exploration. Thank you @WilliamW @j.stirling for the outstanding work you’ve done!
I’m curious how this image achieved such uniform colouring. Was the flat field method used to achieve this effect?

1 Like

I don’t know whether it was or not. There is certainly a yellow-ish centre and pink-ish edges which are typical of uncorrected images, but maybe not as strong as sometimes. It was quite a while ago, I think before big improvements were made to the colour correction.

1 Like

As usual, @WilliamW has beaten me to the answer: the loss of saturation at the edges is from the lenslet array on the camera. This isn’t something that will go away with infinity corrected optics (though that might help with the field curvature, which makes the edges out of focus). The paper @WilliamW references does indeed describe a way that will recover saturation at the edges, BUT it is quite dependent on the spectrum of your stain and illumination. While the calibration jig in the paper (documented as open hardware, by the way) would be a good start, I think it might in principle be possible to calibrate using only your sample. That would not only be more convenient, but also better, as it matches the exact spectral characteristics.

If you acquired a tiled scan with very high overlap (say 90% or 95%, maybe 10x10 images) it would be possible to solve for a spatially dependent unmixing matrix. As I’m currently trying to dream up masters projects, I am going to write a brief below: if you are lucky, an amazing Glasgow University student will appear and do this for us in the Autumn. If not, maybe you or someone else on the forum will decide it’s an interesting project :wink: Forgive the general introduction and Physics focus: that’s needed for our students who won’t be familiar with this project, and who will need to have some Physics content for their report.

Colour calibration for diagnostic microscopy

The OpenFlexure Microscope is a fully automated, portable digital microscope. It has found uses in many different contexts because it can be built using readily available parts and standard tools, and the build instructions are freely available under an open, royalty-free license. However, the low-cost image sensor it uses does not have a uniform colour response. Previous work has used a calibration jig to generate an “unmixing matrix” that can restore the colour response, but this isn’t practical for everyone to build and use.

This project will develop a physics-based linear model of how the camera forms a colour image, and use that to correct the mixing-together of colour channels that occurs at the edges of an image. More specifically, we will model the colour mixing effect using 3x3 matrices at each point in the image. By acquiring images of the same sample at slightly different positions, we will be able to explain each image as some part of the “true” sample image, distorted by our colour mixing model. Taking a large enough set of images, we will then be able to optimise the colour mixing matrices such that we can explain the images we observe. This will involve constrained, regularised optimisation, implemented in a Python script. We will then invert the matrices we have recovered, to restore the colour response across the image.
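A toy version of the forward model described above (names and shapes are my own illustration, not project code): each observed image is a shifted view of the true scene, passed through a per-pixel mixing matrix. Two overlapping images then see the same scene content through different parts of the mixing field, which is exactly the constraint an optimiser can exploit.

```python
import numpy as np

# Toy forward model: observed image = per-pixel 3x3 colour mixing
# applied to a shifted crop of the true scene. Illustrative only.
rng = np.random.default_rng(0)
H, W = 64, 64
true_scene = rng.random((H + 10, W + 10, 3))  # larger than one field of view

M = np.tile(np.eye(3), (H, W, 1, 1))          # mixing matrices, one per pixel
M[:, :, 0, 1] = np.linspace(0.0, 0.2, W)      # green leaks into red towards the edge

def observe(dx, dy):
    """Simulate an image taken at stage offset (dx, dy)."""
    patch = true_scene[dy:dy + H, dx:dx + W]
    return np.einsum("hwij,hwj->hwi", M, patch)

# The overlap between two shifted images shows the same scene pixels
# through different mixing matrices -- the data the optimiser would fit.
a, b = observe(0, 0), observe(5, 0)
```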

As there will be a significant amount of programming involved in this project, it would suit a student with reasonably advanced Python skills, and an understanding of linear algebra and matrices. There is time to improve your skills in either or both of these areas, but this is not a project suitable for non-programmers and will require you to do some matrix maths by hand, as well as in code.

If successful, this code may be incorporated into future versions of the OpenFlexure Microscope, potentially getting used to provide medical diagnostics in places where these resources are currently scarce. There is potential for the code and documentation you write to be incorporated into a valuable open source project, so that your work will be used by many people, long after your project has finished.


The paper @WilliamW references does indeed describe a way that will recover saturation at the edges, BUT it is quite dependent on the spectrum of your stain and illumination

How much of an issue is the color response with the low-cost (reversed Pi lens) optics? If it is substantial, would it be possible to integrate a shared “reference” correction into the software? Or is there too much variation between each build/component lot for this to work?

It will be noticeable for sure. My hunch is that the bulk of the effect will be similar from sensor to sensor, as it comes from the sensor rather than the optics. Lens shading correction (i.e. flat field) will need to be done for each microscope, but that is relatively easy. Adding a generic compensation for the loss in saturation might be possible, and is certainly one to consider. It is unlikely to work in real time, but on saved/captured images it ought to be OK.
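For reference, the per-microscope flat-field step mentioned here can be sketched like this (a minimal illustration on a demosaiced RGB image; a real camera pipeline applies lens shading per Bayer channel before demosaicing):

```python
import numpy as np

def flat_field_correct(image, flat, eps=1e-6):
    """Divide out shading measured from a blank, evenly lit slide."""
    gain = flat.mean(axis=(0, 1), keepdims=True) / (flat + eps)
    return np.clip(image * gain, 0.0, 1.0)

# Hypothetical shading field: bright centre, dimmer edges.
H, W = 100, 100
y, x = np.mgrid[0:H, 0:W]
falloff = 1.0 - 0.5 * (((x - W / 2) ** 2 + (y - H / 2) ** 2) / (W / 2) ** 2)
flat = np.clip(falloff, 0.1, 1.0)[..., None] * np.ones((1, 1, 3))

scene = 0.5 * np.ones((H, W, 3))                # a uniform grey sample
observed = scene * flat                          # what the camera records
corrected = flat_field_correct(observed, flat)   # uniform again after correction
```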