As usual, @WilliamW has beaten me to the answer: the loss of saturation at the edges comes from the lenslet array on the camera sensor. This isn’t something that will go away with infinity-corrected optics (though those might help with the field curvature, which makes the edges go out of focus). The paper @WilliamW references does indeed describe a way to recover saturation at the edges, BUT it is quite dependent on the spectrum of your stain and illumination. While the calibration jig in the paper (documented as open hardware, by the way) would be a good start, I think it might in principle be possible to calibrate using only your sample. That would not only be more convenient, but also better, as it would match the exact spectral characteristics of your stain and illumination.
If you acquired a tiled scan with very high overlap (say 90% or 95%, maybe 10x10 images), it would be possible to solve for a spatially dependent unmixing matrix. As I’m currently trying to dream up masters projects, I’ve written a brief below: if we are lucky, an amazing Glasgow University student will appear and do this for us in the autumn. If not, maybe you or someone else on the forum will decide it’s an interesting project. Forgive the general introduction and physics focus: that’s needed for our students, who won’t be familiar with this project and who will need some physics content for their report.
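To make the overlap numbers concrete, here is a minimal sketch of how the stage positions for such a scan could be computed. The field-of-view value and the helper name `tile_positions` are invented for illustration; the point is just that 90% overlap means each step is 10% of the field of view, so a 10x10 grid covers roughly double the field of view in each axis.

```python
def tile_positions(fov_um, overlap, n_tiles):
    """Hypothetical helper: stage positions (in micrometres) along one axis
    for a tiled scan where each tile overlaps its neighbour by `overlap`."""
    step = fov_um * (1 - overlap)  # 90% overlap -> step is 10% of the FOV
    return [round(i * step, 6) for i in range(n_tiles)]

# Assumed 350 um field of view, 90% overlap, 10 tiles per axis.
xs = tile_positions(fov_um=350.0, overlap=0.9, n_tiles=10)
print(xs)  # steps of 35 um, spanning 0 to 315 um
```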
Colour calibration for diagnostic microscopy
The OpenFlexure Microscope (https://openflexure.org/) is a fully automated, portable digital microscope. It has found uses in many different contexts because it can be built using readily-available parts and standard tools, and the build instructions are freely available under an open, royalty-free license. However, the low-cost image sensor it uses doesn’t have uniform colour response. Previous work has used a calibration jig to generate an “unmixing matrix” that can restore the colour response (https://openhardware.metajnl.com/articles/10.5334/joh.20), but this isn’t practical for everyone to build and use.
This project will develop a physics-based linear model of how the camera forms a colour image, and use that to correct the mixing-together of colour channels that occurs at the edges of an image. More specifically, we will model the colour mixing effect using 3x3 matrices at each point in the image. By acquiring images of the same sample at slightly different positions, we will be able to explain each image as some part of the “true” sample image, distorted by our colour mixing model. Taking a large enough set of images, we will then be able to optimise the colour mixing matrices such that we can explain the images we observe. This will involve constrained, regularised optimisation, implemented in a Python script. We will then invert the matrices we have recovered, to restore the colour response across the image.
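The fitting step described above can be sketched in a few lines of NumPy. This is a deliberately simplified synthetic example, not the full blind problem: it assumes the “true” colour of each pixel is already known (in practice it would come from an overlapping tile that imaged the same pixel near the well-behaved centre of the field), then fits a 3x3 mixing matrix per column by least squares and inverts it. The mixing model in `mixing_matrix`, and all numbers, are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
H, W = 8, 64  # a tiny synthetic image strip
true_img = rng.uniform(0.1, 1.0, size=(H, W, 3))

def mixing_matrix(col, width):
    """Hypothetical spatially varying mixing: identity at the centre column,
    growing channel crosstalk towards the edges."""
    t = abs(col - width / 2) / (width / 2)  # 0 at centre, 1 at the edge
    a = 0.3 * t
    return (1 - a) * np.eye(3) + (a / 2) * (np.ones((3, 3)) - np.eye(3))

def observe(img):
    """Forward model: each column's pixels are mixed by that column's matrix."""
    out = np.empty_like(img)
    for c in range(img.shape[1]):
        out[:, c, :] = img[:, c, :] @ mixing_matrix(c, img.shape[1]).T
    return out

obs = observe(true_img)

# Recovery: for each column, fit M from (true, observed) pixel pairs by
# least squares, then apply the inverse to undo the mixing.
recovered = np.empty_like(obs)
for c in range(W):
    X = true_img[:, c, :]  # reference colours (from an overlapping tile's centre)
    Y = obs[:, c, :]       # mixed colours seen at column c
    M_T, *_ = np.linalg.lstsq(X, Y, rcond=None)  # solves Y ≈ X @ M.T
    recovered[:, c, :] = Y @ np.linalg.inv(M_T)
```

In the real project the reference colours are not known, which is why a large set of overlapping images and a regularised optimisation are needed; this sketch only shows the linear-algebra core.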
As there will be a significant amount of programming involved in this project, it would suit a student with reasonably advanced Python skills, and an understanding of linear algebra and matrices. There is time to improve your skills in either or both of these areas, but this is not a project suitable for non-programmers and will require you to do some matrix maths by hand, as well as in code.
If successful, this code may be incorporated into future versions of the OpenFlexure Microscope, potentially being used to provide medical diagnostics in places where these resources are currently scarce. The code and documentation you write could become part of a valuable open source project, so your work may be used by many people long after your project has finished.