Hi guys!
I have written a Python script for a time-lapse experiment. During some tests I noticed a problem with pictures collected at specific positions; it seems the microscope has some accuracy issues there. Let me explain with an example: the attached image sequence belongs to position X=24000, Y=12000. Using Fiji (ImageJ) I expected to see no movement in the background, but as you can see there is some movement, which means something is not going well. The movement is small, but enough to complicate analysis during real experiments (e.g. visualising protein synthesis over a certain period of time).
Could you help me to solve this problem (if it is feasible)? Could it be a hardware or software issue?
I am using the OpenFlexure Microscope v6 (no Delta Stage)
Thank you so much!
How long is the timelapse, and what is the magnification? Is the microscope stationary at this position, or is it taking pictures at more than one position at each time point?
The microscope is remarkably stable in position if it is left stationary, but it does creep a little. A one-piece 3D printed flexure translation stage for open-source microscopy: Review of Scientific Instruments: Vol 87, No 2 (scitation.org) describes an earlier version of the microscope, but it includes a measurement of the drift over a few days. Sometimes there is a more rapid drift in a ‘settling’ period just after the microscope has been sent to its fixed position. If you move the microscope and return to the same position there are other possible positioning errors, such as missed steps and backlash.
No system will be totally stable, but the movement here is relatively small and there are a couple of ways that you could correct for it.

In timelapses that I have done I have needed to autofocus at each time step. This is the same type of drift but in z not x-y, and the microscope has a built-in focus algorithm to get back to the right place. You could have an algorithm to correct in x-y by taking the correlation of the new image with the old one, which tells you where to move to, and then you can move the motors (the basic processes for this exist in the stage calibration routine).

Alternatively, in the image processing that you are doing you could cut out a smaller region of interest from wherever it is in each image, again by an image matching/location process. That would be easier, as long as the movement is not too far across the whole image. It would not be quite as good if you want to see small changes, as there may be focus change, distortion or lighting change as your region of interest moves across the field of view. Moving the stage to keep your region of interest in the same place keeps all those things as similar as possible over time, but would take a bit more to implement.
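To make the correlation idea concrete, here is a rough Python sketch (this is not the microscope's built-in routine). It assumes scikit-image is installed and the images are grayscale arrays; stage_move and STEPS_PER_PIXEL are placeholders standing in for your own stage interface and calibration.

```python
# A minimal sketch of x-y drift correction by image correlation.
# Assumes scikit-image; stage_move() and STEPS_PER_PIXEL are placeholders
# for your own stage interface and motor-step calibration.
from skimage.registration import phase_cross_correlation

STEPS_PER_PIXEL = 2.0  # assumed calibration: motor steps per image pixel


def measure_drift(reference, current):
    """Return the (row, col) pixel shift of `current` relative to `reference`."""
    shift, error, _ = phase_cross_correlation(reference, current,
                                              upsample_factor=10)
    return shift


def correct_drift(reference, current, stage_move):
    """Move the stage to cancel the measured drift."""
    dy, dx = measure_drift(reference, current)
    # Convert pixels to motor steps; the sign convention may need
    # flipping depending on your camera and stage orientation.
    stage_move(x=int(-dx * STEPS_PER_PIXEL), y=int(-dy * STEPS_PER_PIXEL))
```

The calibration (how many motor steps correspond to one pixel) is exactly the kind of thing the stage calibration routine measures, so in practice you would take that number from there rather than hard-coding it.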
Hi!
About your questions:
- The time lapse was two hours.
- Magnification is the maximum provided by the Raspberry Pi camera (I am not sure of the specific number).
- The microscope was taking pictures at more than one position at each time point. To be specific, I made a loop to take pictures at three different positions over 2 hours; the image I attached is from one of those three positions (the other images have the same ‘drift’).
Thank you so much for the feedback, I really appreciate this valuable information.
About your suggestions: I have already processed the images using a Fiji macro, but I didn’t get a good result. Could you suggest another tool that I can use? Also, could you share the code that you use to correct x-y by correlation, or the calibration routine, so I can check it?
Thank you so much!
The Raspberry Pi camera with the lens spacer gives a field of view similar to a 20x objective (see Comparison of a pi-camera and a 20x optical lens - Contributions - OpenFlexure Forum).
This means that the shift that you are seeing is quite large compared to the expected drift of the static stage from the paper that I linked earlier. However, as you are also moving to different parts of the slide, I would expect the repositioning to have larger changes than the static drift.
Image processing is not my core expertise, but what you need is basically to search for the position of a small image within a larger one. Fiji can probably do this. I am imagining that you centre the microscope field of view on the interesting features for the first set of pictures at the first time step. You can then cut out the central portion of that first image as the part you are actually going to use. On the second time step you look in the new picture for the area that looks the most similar to the first picture. You can then cut out that area to be the second picture that you use, and so on (see the sketch below). The details of the method depend on how much it moves, and so on how tightly you need to crop the image to make sure the region is present in all images, and on whether there is enough in the image that stays the same throughout your timelapse for the matching to work. You may need to have deliberate markers that you can search for if your specimen is changing a lot.
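As a rough illustration of that matching process in Python (I am assuming OpenCV here, which may or may not fit your toolchain; this is a sketch of the idea, not your Fiji workflow):

```python
# A sketch of tracking a region of interest through a timelapse
# by template matching; assumes OpenCV (cv2) and grayscale images.
import cv2


def crop_matching_region(template, new_image):
    """Find where `template` best matches in `new_image`, return that crop."""
    result = cv2.matchTemplate(new_image, template, cv2.TM_CCOEFF_NORMED)
    _, _, _, (x, y) = cv2.minMaxLoc(result)  # top-left corner of best match
    h, w = template.shape[:2]
    return new_image[y:y + h, x:x + w]


# Usage: crop the centre of the first frame once, reuse it as the template.
# frames = [cv2.imread(p, cv2.IMREAD_GRAYSCALE) for p in image_paths]
# h, w = frames[0].shape
# template = frames[0][h // 4: 3 * h // 4, w // 4: 3 * w // 4]
# aligned = [crop_matching_region(template, f) for f in frames]
```

Cropping to the central half of the first frame, as in the usage comment, is just one choice: the crop needs to be tight enough that the region stays inside every later frame despite the drift.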
Possibly one of these Fiji / ImageJ plug-ins might work on the final timelapse frames: Image Stabilizer (imagej.net) or Register Virtual Stack Slices (imagej.net).
I see, now I understand the drift during experiments better.
I will try image processing with the tools that you shared with me.
Thank you!
As usual, @WilliamW has done most of the answer for me
That does look like more drift than I’d expect over two hours, though it’s worth asking: was it at room temperature or in an incubator?
If you are moving the stage, it is worth making sure you always approach the position from the same direction - i.e. each time you move, deliberately go to the desired position minus 300 steps in X, Y, Z, then make a move of [300, 300, 300]. That ensures that any mechanical backlash is compensated for. This used to be built into the software, but in some situations it got annoying and confusing, so it’s now off by default.
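As a sketch of that idea, assuming a move_absolute(x, y, z) function standing in for whatever interface your script uses to command the stage (the real API may differ):

```python
# A sketch of backlash compensation: always approach the target position
# from the same direction. `move_absolute` is a placeholder for your
# own stage interface; 300 steps is the overshoot suggested above.
BACKLASH = 300  # motor steps


def move_with_backlash_correction(move_absolute, x, y, z):
    """Undershoot the target by BACKLASH steps on each axis, then approach."""
    move_absolute(x - BACKLASH, y - BACKLASH, z - BACKLASH)
    move_absolute(x, y, z)
```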
@irvgrimm Hey, can you share the script please?
@irvgrimm Hi, I am now also trying to automatically capture images at intervals. Could you please share your Python script? Thank you very much!
@Yanyun, if you only want to take photographs at intervals then the Blockly extension is the simplest way to do it and should be completely intuitive. @Soyalexf has a thread on this issue; if you want to do anything else then Blockly is not the way, and you would need to use Python.
Thank you for your answer. Yes, I just want to take an image every hour and then measure the focus drift. But what is the Blockly extension and how should I work with it?
Blockly is in the latest release of the microscope Web App, if you update your OpenFlexure server (run ofm update and ofm upgrade on your Pi, see Raspbian-OpenFlexure). Alternatively you can use it directly from the web, if your computer is on the same network as the microscope. If http://microscope.local:5000 gives you the web app on your computer then http://openflexure.gitlab.io/openflexure-blockly/ should be able to connect to your microscope with a Blockly interface.
A repeat block from the loop tab gives you a loop. From the OpenFlexure tab you might want the autofocus block before a capture block. Finally, a wait block from within the loop tab will give you the delay between images.
The image capture (and autofocus if you use that) will take a few seconds, so the images may not be exactly equally spaced in time and the wait time stated will need to be a bit less than the time spacing that you want.
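For anyone scripting this in Python instead, one way to keep the frames evenly spaced despite the capture overhead is to sleep until a scheduled time rather than for a fixed duration. A minimal sketch, where capture_image is a placeholder for your own capture (and autofocus) call:

```python
# A sketch of an interval timelapse that stays on schedule even though
# each capture takes a few seconds. `capture_image` is a placeholder.
import time


def timelapse(capture_image, interval_s=3600, n_frames=24):
    """Capture n_frames images, aiming for one every interval_s seconds."""
    start = time.monotonic()
    for i in range(n_frames):
        # Sleep until this frame's scheduled time, not for a fixed delay,
        # so the capture overhead does not accumulate frame after frame.
        delay = start + i * interval_s - time.monotonic()
        if delay > 0:
            time.sleep(delay)
        capture_image()
```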
Edit:
I have just tried taking my advice:
It seems that the Blockly extension is not included in OFM server 2.10.1
I was also not actually able to run the web-Blockly and connect to the microscope over a direct ethernet cable connection from my computer to the microscope, even though http://microscope:5000 does work. This may be because my microscope and computer cannot be on the same network router here for various reasons, which is why I was using a direct cable connection.
If I give the microscope an internet connection and then open a browser directly on the microscope’s Pi, it does connect using http://openflexure.gitlab.io/openflexure-blockly with the address given as microscope.local
Hi, thank you for your help. Blockly works well, but there’s a problem: the “wait seconds” block in the Loop tab seems to allow 600 s at most. I want 1-2 hours, so I tried entering 3600 seconds, but when I press enter to confirm it jumps back to 600 s. How should I solve that?
@yanyun I am not sure why there is a 600s limit in Blockly, but there is. For 3600 seconds you would need a loop that runs 6 times with a wait of 600 seconds as the only thing in the loop. That then takes one hour.
As soon as the SD card build is fixed again, I should add Blockly! If you make an issue on the Blockly repository, we can also have a look at removing the 600s maximum wait. That may be a default left over from somewhere else, I guess.