This semester, we will offer droplet microfluidics workshops, and this OpenFlexure variant is convenient for that application. We still don't have access to the FabLab, so I printed the plates for our prototype, which made it more expensive. As soon as we have a new version, we will share some pictures in this thread.
Neat! I think that's the first replication outside my lab of that one - it's a somewhat neglected design, but I think it has a lot of potential. I would love to see someone pair it with a stepper motor stage (like a 3D printer one) to get wide-area scanning.
This is probably a dumb question… would it be feasible to add a short-throw XY motion capability to the optic head? As in, make the lens and camera translate in X and Y in addition to Z. That way the microscope could be used to image over a small area (of the larger flat object placed on the surface), perhaps using an epi-illumination setup, with the stage attached to (hanging down from) the bottom of the flat platform. Nominally, the motion controller and RPi would not be suspended from the platform, to keep the moving mass low.
That's an interesting idea - there are a bunch of things you could do by moving the optics rather than the sample, but I've pretty much always thought of it in a long-travel scenario (e.g. you move the microscope to scan a whole multi-well plate). With short-throw XY, I guess you could use it to fine-tune what you're imaging, or scan a small (~10 mm) area of a much bigger object.
It would take a bit of thought to get it to work nicely, but it's definitely something that I could see finding some use.
[UPDATE 01] This week I've been working on designing the pieces for strobe illumination. As I mentioned, our motivation is to use this setup for droplet microfluidics. The small size and high speed of droplets moving through microfluidic devices create particular challenges for an imaging system based on low-cost and open-source components.
Stroboscopic imaging is an interesting technique that "freezes" the motion of a fast-moving object. The light pulses need to be synchronized with the motion of the objects being inspected, and the camera is triggered at the exact moment of the light pulse. The GIF above is an example of a stroboscopic image of droplets (source: Zaber Technologies).
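For anyone sizing a similar setup, a quick way to choose the strobe pulse width is to cap the motion blur below your resolution budget. Here is a minimal sketch with illustrative numbers (the droplet speed and blur tolerance are assumptions for the example, not measurements from our system):

```python
def max_pulse_width_us(droplet_speed_mm_s: float, max_blur_um: float) -> float:
    """Longest LED pulse (in microseconds) that keeps motion blur below
    max_blur_um for an object moving at droplet_speed_mm_s.
    Blur = speed * pulse duration, so pulse <= blur / speed."""
    speed_um_per_us = droplet_speed_mm_s / 1000.0  # 1 mm/s = 1e-3 um/us
    return max_blur_um / speed_um_per_us

# Example: a droplet moving at 100 mm/s with a 1 um blur budget in the
# sample plane -> the pulse must be no longer than 10 us.
print(max_pulse_width_us(100.0, 1.0))  # -> 10.0
```

This is why a high-power LED is needed: the shorter the pulse, the less light reaches the sensor, so peak brightness has to compensate.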
In our research group, we are able to do this with a high-power LED and a Raspberry Pi camera. I am sharing some pictures of the first version of the strobe illumination. As you can see, it is possible to adapt it to the flat-top microscope. For my next post, I will generate some droplets and record them while they are moving in the chip.
[UPDATE 02] Hi everyone! I haven't tried to record droplets yet. Instead, I decided to explore XY translation mechanisms. The top and bottom plates were redesigned to hold a commercial XY stage and use its full workspace. In the picture below, you can see the new version with a microfluidic chip holder. This configuration allows us to work with up to three microscope slides on the stage, using this adapter.
I haven't shared this design yet because the sample sits too far from the sensor. The current focusing mechanism has limitations in this new version, so it must be redesigned too. I was able to capture some pictures without the 3D-printed holders, but it is not possible to use the XY motion properly yet.
[UPDATE 03] Hi everyone! I know it has been a while since the last update. I hope you find this post useful.
In the last four months, we have made good progress in developing variants of the flat-top microscope for performing microfluidic experiments. Matias and Tobias joined me in this project and contributed to the design and verification process.
As I pointed out in previous posts, our interest is in using stroboscopy to visualize droplet generation on a PDMS chip. For this purpose, we designed a mechanism that holds the high-power LED with its heatsink and positions it at different distances. Here is a reference image of the structure without electronic components:
Other requirements I didn't mention before were increasing the field of view (FOV) for microfluidics applications and rotating the sensor to improve usability. We tested different lenses, but ended up keeping the same lens from the RPi camera. Here is a reference image of the optics configuration:
Tobias and I tested this new version. Tobias used this workstation as part of an EMBO Practical Course called "Microbial Metagenomics: a 360° approach". This workshop was held in Germany, so our workstation had to fly from Chile to Germany and perform as intended. You can visit the following links to see a video and pictures from the course and the workstation: Picture 1, Picture 2, and Video. Then, Tobias and I used it again for a practical session of a teaching course called Prototyping for Bioengineers (IBM2026), offered at the university where we work. Here is a video of the workstation while I tested different flow rates for droplet generation, and a picture of the latest design:
Finally, we also finished the version with the XY translation mechanism. Currently, this version is used by our team in microfluidics experiments. Here are some pictures and a video of this version in our lab:
Our next steps are:
- To finish and release the documentation of this workstation.
- To run a workshop teaching how to build this platform for microfluidic research.
A final message related to the second bullet point: If you are in Latin America and your research area is microfluidics, or you are using this technology as part of your experiments, I invite you to apply to our next LIBRE hub workshop. Travel grants are available for participants.
To clarify, the 3D-printed and laser-cut parts of this stage are inspired by the flat-top microscope. However, we use custom electronics and different software to operate and control this instrument. More information can be found in our GitHub repo. In the near future, I would like to develop a plugin for OpenFlexure Connect to control the light source for this kind of application. I have seen on some forums that it is possible to reach up to 100 fps with the Raspberry Pi camera, and higher frame rates with extra hardware.
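On the frame-rate point, one nice property of stroboscopy is that, for a periodic process like droplet generation, the camera does not need to keep up with the true rate. Pulsing the LED at a frequency slightly offset from the generation frequency makes the pattern appear to evolve at the much slower beat frequency. A minimal sketch (the frequencies below are illustrative, not measured on our chip):

```python
def apparent_frequency_hz(droplet_hz: float, strobe_hz: float) -> float:
    """Stroboscopic 'slow motion': sampling a periodic process at a rate
    slightly offset from its true frequency makes it appear to drift at
    the beat frequency |f_droplet - f_strobe|."""
    return abs(droplet_hz - strobe_hz)

# Example: droplets generated at 500 Hz, LED strobed at 499 Hz -> the
# pattern appears to advance at only 1 Hz, easily captured even at
# modest camera frame rates.
print(apparent_frequency_hz(500.0, 499.0))  # -> 1.0
```

In practice this means the strobe timing, not the camera's fps ceiling, sets the effective temporal resolution, which is exactly what makes a low-cost Raspberry Pi camera viable here.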