In my opinion, there could be two variants of the algorithm: a constant Z(0), and an adaptive Z(0) obtained by inverse calculation from the manually adjusted Z at the current position.
I’m not totally clear what you’re suggesting here - do you mean that it would be nice to store a look-up table of Z values as a function of X and Y, then make Z relative to that? This is something that’s been looked into a few times in the past, but never been fully finished off. The issue that keeps reappearing is that there aren’t endstops in use on the microscope, so it’s hard to know exactly where you are after a power cycle (though the microcontroller tries to remember, so it should be at least approximately right).
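To make the look-up-table idea concrete, here is a minimal sketch (my own illustration, not code from the microscope software): focus Z is measured at a grid of X/Y positions and bilinearly interpolated in between. The grid spacing and Z values are made up.

```python
# Hypothetical look-up table of focus Z as a function of stage X/Y.
import numpy as np

xs = np.array([0, 500, 1000])   # grid positions in stage steps (assumed)
ys = np.array([0, 500, 1000])
# z_table[i, j] = measured focus at (xs[i], ys[j]); values are invented
z_table = np.array([[100., 99., 96.],
                    [ 99., 98., 95.],
                    [ 96., 95., 92.]])

def z_from_table(x, y):
    """Bilinear interpolation of Z between the four surrounding grid points."""
    i = int(np.clip(np.searchsorted(xs, x) - 1, 0, len(xs) - 2))
    j = int(np.clip(np.searchsorted(ys, y) - 1, 0, len(ys) - 2))
    tx = (x - xs[i]) / (xs[i + 1] - xs[i])
    ty = (y - ys[j]) / (ys[j + 1] - ys[j])
    top = (1 - tx) * z_table[i, j]     + tx * z_table[i + 1, j]
    bot = (1 - tx) * z_table[i, j + 1] + tx * z_table[i + 1, j + 1]
    return (1 - ty) * top + ty * bot

print(z_from_table(250, 250))  # midpoint of the first grid cell
```

Note that this is exactly the scheme that suffers from the missing-endstop problem: if the stage position drifts after a power cycle, the whole table is offset.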
I think my favourite approach is to remember that the stage should follow the surface of a sphere as it moves in X/Y, so you don’t need a look-up table but you can fit a curve to predict how Z should vary from point to point. That said, after I got the fast autofocus working, I stopped worrying as fast autofocus was much better than my attempts to predict where the focus would be.
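The sphere-fitting approach can be sketched as follows (my own illustration, not the actual firmware code): for small X/Y travel a sphere of radius R centred above the stage gives approximately z(x, y) = z0 - (x² + y²)/(2R), so a two-parameter least-squares fit to a few autofocus samples is enough to predict Z anywhere. All numbers below are invented.

```python
# Hypothetical sketch: fit focus height to a spherical cap and predict Z.
import numpy as np

def fit_sphere_focus(points):
    """Least-squares fit of (z0, c) in z = z0 - c * r^2, with c = 1/(2R),
    from a list of (x, y, z) autofocus samples."""
    pts = np.asarray(points, dtype=float)
    r2 = pts[:, 0] ** 2 + pts[:, 1] ** 2
    A = np.column_stack([np.ones_like(r2), -r2])   # design matrix
    (z0, c), *_ = np.linalg.lstsq(A, pts[:, 2], rcond=None)
    return z0, c

def predict_z(x, y, z0, c):
    return z0 - c * (x * x + y * y)

# Synthetic samples on a sphere with z0 = 100, c = 1e-4 (invented units)
samples = [(x, y, 100 - 1e-4 * (x * x + y * y))
           for x in (0, 500, 1000) for y in (0, 500, 1000)]
z0, c = fit_sphere_focus(samples)
print(z0, c)
```

Unlike the look-up table, the fitted parameters can be re-estimated from a handful of fresh autofocus runs after a power cycle, which sidesteps the endstop problem.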
Apologies if I’ve totally missed what you’re suggesting!
Richard, thanks for the answer. Unfortunately, I can only communicate via Google Translate.
I made my proposal because observing moving objects is not very convenient. A little help from automation in refining the focus during movement would be very useful.
With immersion oil and a liquid sample, autofocus causes currents in the sample due to movement of the viscous medium, which distracts from observation. In some cases, manual focusing with a small automatic correction would be more convenient.
The notion of movement on a sphere can be simplified to movement on a circle.
The initial focus z(0) is obtained by autofocus at the parking position, with manual fine-tuning if necessary. My suggestion is to use an adaptive algorithm that recovers the initial value z(0) by inverse calculation from the current value z(sqrt(x*x + y*y)) after the focus has been refined.
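As I understand the adaptive idea, it can be sketched like this (my own illustration, with an invented curvature constant): assume z(r) = z0 - c·r² with r = sqrt(x² + y²) (the circle simplification). Whenever the user manually refines focus at some position, invert the model to recover an updated z0, then use it to predict focus everywhere else.

```python
# Hypothetical sketch of the adaptive Z(0) proposal.
import math

C = 1e-4  # assumed curvature term 1/(2R); in practice from calibration

def recover_z0(x, y, z_refined, c=C):
    """Inverse calculation: z0 = z(r) + c * r^2, with r^2 = x^2 + y^2."""
    return z_refined + c * (x * x + y * y)

def predict_z(x, y, z0, c=C):
    return z0 - c * (x * x + y * y)

# The user refines focus to 99.0 at (100, 0): recover z0, predict elsewhere.
z0 = recover_z0(100, 0, 99.0)   # -> 100.0
print(predict_z(200, 0, z0))    # -> 96.0
```

The appeal is that every manual correction silently re-anchors the model, so no full autofocus run (and no oil disturbance) is needed while tracking.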
I apologize if my ideas are somewhat naive))
In order not to start a new topic, I’ll ask you here - is backlash compensation calculated automatically? I didn’t get any data after auto-calibration.
Usually there would be a glass slide or coverslip between the immersion oil and the actual sample. In that case the movement of the immersion oil (caused by focusing the lens) will not transfer to movement in the sample.
William, I use a standard cover glass. Unfortunately I can’t demonstrate it to you now - I returned the 40x lens, which is not parfocal and requires tuning.
I apologize again - is it possible to save stage presets for at least 2 lenses?
When you say “presets” do you mean the camera-to-stage mapping? If so, there’s no option to do that in eV yet, but you can do it by making a copy of /var/openflexure/settings/microscope_settings.json if you connect to your Pi via SSH.