This semester, we will offer droplet microfluidics workshops, and this OpenFlexure version is convenient for that application. We still don't have access to the FabLab, so I printed the plates for our prototype myself, which made it more expensive. As soon as we have a new version, we will share some pictures in this thread.
Neat! I think that's the first replication of that design outside my lab - it's somewhat neglected, but I think it has a lot of potential. I would love to see someone pair it with a stepper motor stage (like a 3D printer one) to get wide-area scanning.
This is probably a dumb question… would it be feasible to add a short-throw XY motion capability to the optics head? As in, make the lens and camera translate in X and Y in addition to Z. That way the microscope could be used to image over a small area (of a larger flat object placed on the surface). Perhaps using an epi-illumination setup, with the stage attached to (hanging down from) the bottom of the flat platform. Ideally, the motion controller and Raspberry Pi would not also be suspended from the platform, to keep the moving mass low.
That’s an interesting idea - there are a bunch of things you could do by moving the optics rather than the sample, but I’ve pretty much always thought of it in a long-travel scenario (e.g. you move the microscope to scan a whole multi-well plate). With short throw XY, I guess you could use it to fine-tune what you’re imaging, or scan a small (~10mm) area of a much bigger object.
It would take a bit of thought to get it to work nicely, but it’s definitely something that I could see finding some use.
[UPDATE 01] This week I’ve been working on designing the pieces for strobe illumination. As I mentioned, our motivation is to use this setup for droplet microfluidics. The small size and high speed of droplets moving through microfluidic devices create particular challenges for an imaging system based on low-cost and open-source components.
Stroboscopic imaging is an interesting technique that 'freezes' the motion of a fast-moving object. The light source needs to be synchronized with the motion of the objects being inspected, and the camera is triggered at the exact moment of the light pulse. The GIF above is an example of a stroboscopic image of droplets (source: Zaber Technologies).
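As a rough back-of-envelope (the numbers here are illustrative placeholders, not measurements from our setup): the motion blur in a strobed frame is roughly the droplet speed times the pulse duration, so to keep the blur under one effective pixel you want t_pulse < (pixel size / magnification) / v. For example, droplets moving at 10 mm/s imaged with an effective pixel size of 1 µm would need pulses shorter than about 100 µs. This is why the LED pulse, rather than the camera shutter, ends up setting the effective exposure time.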
In our research group, we are able to do that with a high-power LED and a Raspberry Pi camera. I am sharing some pictures of the first version of the strobe illumination. As you can see, it is possible to adapt it to the flat-top microscope. For my next post, I will generate some droplets and record them while they are moving in the chip.
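For anyone wanting to try something similar, here is a minimal, untested sketch of the idea in Python on the Pi (the pin number, strobe frequency, and pulse length are placeholders, and this is not our exact setup): the LED is pulsed at roughly the droplet generation frequency while the camera records, so each frame is exposed mainly by the short pulse. Note that pure-Python timing jitters at the sub-millisecond level, so a microcontroller or hardware trigger is the usual way to get true microsecond pulses.

```python
# Sketch of strobed illumination on a Raspberry Pi: pulse an LED at (roughly)
# the droplet generation frequency while the Pi camera records video.
# Pin, frequency, and pulse length below are assumptions for illustration only.
import threading
import time

import RPi.GPIO as GPIO
from picamera import PiCamera

LED_PIN = 18        # hypothetical GPIO pin driving the high-power LED via a driver/MOSFET
STROBE_HZ = 200.0   # placeholder: should match the droplet generation frequency
PULSE_S = 0.0002    # 200 µs pulse; Python timing will jitter at this scale

def strobe(stop_event):
    """Pulse the LED until stop_event is set."""
    period = 1.0 / STROBE_HZ
    while not stop_event.is_set():
        GPIO.output(LED_PIN, GPIO.HIGH)
        time.sleep(PULSE_S)
        GPIO.output(LED_PIN, GPIO.LOW)
        time.sleep(max(period - PULSE_S, 0.0))

GPIO.setmode(GPIO.BCM)
GPIO.setup(LED_PIN, GPIO.OUT, initial=GPIO.LOW)

stop = threading.Event()
worker = threading.Thread(target=strobe, args=(stop,), daemon=True)
worker.start()

# Record a short strobed video; the enclosure should be dark so the pulses dominate the exposure.
with PiCamera(resolution=(1640, 1232), framerate=30) as camera:
    camera.start_recording('droplets.h264')
    camera.wait_recording(10)
    camera.stop_recording()

stop.set()
worker.join()
GPIO.cleanup()
```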
[UPDATE 02] Hi everyone! I haven't tried to record droplets yet. Instead, I decided to explore XY translation mechanisms. The top and bottom plates were redesigned to hold a commercial XY stage and make use of its full travel range. In the picture below, you can see the new version with a microfluidic chip holder. This configuration allows us to work with up to three microscope slides on the stage, using this adapter.
I haven't shared the design yet because the sample sits too far from the sensor. The current focusing mechanism has limitations in this new configuration, so it needs to be redesigned as well. I was able to capture some pictures without the 3D-printed holders, but the XY motion can't be used properly that way.