Greetings from Brazil,
Introduction:
I extend my sincerest congratulations to everyone involved in the OpenFlexure initiative. The concept is truly remarkable, and as soon as I discovered it I felt compelled to build one myself.
Currently, I am in the process of building a version based on the beta V7 with high-resolution capabilities and epi-illumination. However, I encountered a need for an upright version during the assembly process. Consequently, I am currently experimenting with various designs to accommodate a beamsplitter cube in an inverted orientation, as well as exploring other methods to facilitate seamless transitions between epi-illumination and trans-illumination. These developments will be detailed in a subsequent post, once I have attained more tangible and presentable results to share.
Beyond my personal requirements, I envision a simplified and cost-effective iteration suitable for implementation in underprivileged public schools within my country. My aim is to demonstrate the efficacy of utilizing older Raspberry Pi boards paired with inexpensive cameras. Consequently, I am presently testing a configuration employing an antiquated RPi 1 B+ (later substituted with an RPi 2B due to issues with the SD card connection) in conjunction with the affordably priced PiCamera V1.3, alongside an Arduino Nano. Despite encountering various warnings and challenges reported in the forum, I became committed to this endeavor.
Just to make my point about affordability in developing countries like Brazil:
| Affordable version: component | Local (USD) | Aliexpress (USD) | Recommended by OpenFlexure: component | Local (USD) | Aliexpress (USD) |
|---|---|---|---|---|---|
| Raspberry Pi 1B+ / 2B (second-hand) | 20.00 | * | Raspberry Pi 4 Model B (2 GB) | 125.00 | ** |
| PiCamera V1.3 (new, local market) | 11.00 | 3.30 | PiCamera V2 | 41.60 | 28.00 |
| **Total** | **33.00** | **23.30** | **Total** | **166.60** | **153.00** |
\* Aliexpress: not available.
\*\* Aliexpress: $75, plus about $70 once import taxes kick in, comes to roughly $145, therefore not worthwhile.
In summary, by acquiring second-hand RPi 1B+ / 2B boards (readily available from individuals selling them, typically after using them for RetroPie gaming) and PiCamera V1.3 modules, we can assemble roughly six microscopes for the price of one built with the currently recommended components. This approach significantly enhances affordability and accessibility, particularly for initiatives aimed at providing essential scientific tools to underserved communities.
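As a quick back-of-the-envelope check of that claim, using the best-price totals from the table above:

```python
affordable_total = 23.30    # second-hand RPi 1B+/2B + PiCamera V1.3 (best prices above)
recommended_total = 153.00  # Raspberry Pi 4 (2 GB) + PiCamera V2 (best prices above)
print(recommended_total / affordable_total)  # ~6.6 affordable builds for the price of one
```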
Indeed, it's crucial to acknowledge that while I am currently utilizing older and outdated components to construct the microscope, I fully support the community's broader efforts to adapt and advance the project with newer technologies such as the Raspberry Pi 4/5, PiCameras V2/V3, Sangaboard V4, and so forth. These advancements are essential for enhancing performance and functionality. However, it's important to recognize that newer products often come with higher price tags, posing challenges for widespread adoption, particularly in resource-constrained communities.
By leveraging older components, we can mitigate cost barriers and ensure that the benefits of this technology reach those who need it most. Nonetheless, as the project evolves, finding ways to balance innovation with affordability will remain a key consideration to ensure inclusivity and accessibility for all.
Solving the issues with Raspberry Pi 1B+ / 2B and PiCamera V1.3
Moving beyond these observations, let's delve into the practicalities of setting up and operating an OpenFlexure Microscope with a Raspberry Pi 1B+ / 2B and PiCamera V1.3.
For the Raspberry Pi, the setup is relatively straightforward. One must utilize the headless version of Raspbian OpenFlexure Lite, which lacks desktop support. However, this version allows seamless access to the microscope through its web interface.
However, the real challenge arises with the PiCamera V1.3. Initially, users may encounter difficulties getting OpenFlexure's Auto Calibration feature to function correctly, as extensively discussed in various online forums and posts.
To overcome this hurdle, I embarked on an in-depth exploration, delving into concepts such as Bayer filters, lens shading correction, and binning. I dedicated approximately one week, investing many hours in studying resources such as:
- Bayer filters, lens shading correction, binning,
- Openflexure-microscope-server,
- Picamerax documentation,
- Picamerax code,
My quest also led me to experiment with algorithms for creating lens shading correction tables, exploring resources from contributors such as 6by9 and cpixip.
Additionally, I ventured into recompiling RaspiStill.c to incorporate a lens shading correction table:
- from 6by9's userland (GitHub - 6by9/userland at lens_shading)
Through this rigorous exploration, I believe I have discovered a solution to enable Auto Calibration and Lens Shading Correction with a PiCamera V1.3 within the openflexure-microscope-server. I eagerly await feedback and reviews from esteemed members of this community.
For those interested in replicating my efforts, I offer detailed instructions below. However, feel free to skip ahead to the end of this post for the concise code adjustments necessary to enable PiCamera V1.3 compatibility within the openflexure-microscope-server.
A. Auto White Balance (AWB)
The Auto White Balance (AWB) functionality with a PiCamera V1.3 presents challenges, often resulting in greenish images that are notably worse than the originals. There are several posts about this behavior, and the usual "conclusion" was simply that a PiCamera V1.3 had been used instead of the supported V2.
To illustrate this discrepancy, I conducted experiments both with and without samples, yielding consistent results:
- Figures A and B depict the microscope's view without any sample (a clear field) before and after AWB adjustment.
- Figures D and E showcase the microscope's view with onion cells before and after AWB adjustment.
Upon investigation, I discovered that the PiCamera V1.3 (OV5647) and V2 (IMX219) employ different Bayer filter patterns. Specifically:
The PiCamera V1.3 (OV5647) utilizes the following Bayer pattern:

```text
GBGBGBGBGBGBGB…
RGRGRGRGRGRGRG…
GBGBGBGBGBGBGB…
RGRGRGRGRGRGRG…
…
```

Conversely, the PiCamera V2 (IMX219) employs this Bayer pattern:

```text
BGBGBGBGBGBGBG…
GRGRGRGRGRGRGR…
BGBGBGBGBGBGBG…
GRGRGRGRGRGRGR…
…
```
Analysing the OpenFlexure Microscope Server code, I discovered that the function `channels_from_bayer_array` in `recalibrate_utils.py` (at /var/openflexure/application/openflexure-microscope-server/openflexure_microscope/api/default_extensions/picamera_autocalibrate/) was designed to map the V2 Bayer pattern to BGGR channels, but not to map the V1.3 one.
Therefore, the trick is to change the mapping in line 256 (a short check after the list below illustrates why these offsets still yield the channels in BGGR order):

- From: `bayer_pattern: List[Tuple[int, int]] = [(0, 0), (0, 1), (1, 0), (1, 1)]`
- To: `bayer_pattern: List[Tuple[int, int]] = [(0, 1), (1, 1), (0, 0), (1, 0)]`
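To see why those offsets are the right ones for the OV5647, here is a tiny self-contained check. It is purely illustrative (not the server's channel-extraction code): it builds a toy GBRG mosaic like the pattern sketched above and confirms that the new offset list picks out the channels in the same B, G, G, R order that the original mapping delivers for the V2 sensor.

```python
import numpy as np

# Toy GBRG mosaic, as produced by the OV5647 (0 = red, 1 = green, 2 = blue)
R, G, B = 0, 1, 2
mosaic = np.array([[G, B, G, B],
                   [R, G, R, G],
                   [G, B, G, B],
                   [R, G, R, G]])

# Offsets proposed above for the PiCamera V1.3
v13_pattern = [(0, 1), (1, 1), (0, 0), (1, 0)]

# Each (row, col) offset selects every second pixel starting at that position
channels = [mosaic[row::2, col::2] for row, col in v13_pattern]
print([chan[0, 0] for chan in channels])  # -> [2, 1, 1, 0], i.e. B, G, G, R
```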
Indeed, the optimal approach to addressing the issue of different Bayer filter patterns between PiCamera V1.3 and V2 would involve implementing a more flexible solution within the "channels_from_bayer_array" function. Rather than hardcoding a new mapping specific to the V1.3 camera, a more robust method would be to introduce an additional parameter that specifies the camera version. This parameter could then be used to dynamically select the appropriate mapping.
Here's a conceptual outline of how this adjustment could be made:
```python
def channels_from_bayer_array(bayer_array: np.ndarray, cam_version: int) -> np.ndarray:
    if cam_version == 2:
        # Newer PiCamera: V2 - IMX219 (BGGR)
        bayer_pattern: List[Tuple[int, int]] = [(0, 0), (0, 1), (1, 0), (1, 1)]
    else:
        # PiCamera V1.3 - OV5647 (GBRG)
        bayer_pattern: List[Tuple[int, int]] = [(0, 1), (1, 1), (0, 0), (1, 0)]
    # ... rest of the existing function unchanged ...
```
We can determine the camera version parameter `cam_version` using the same method described in the Picamerax recipe for capturing raw Bayer data, in preparation for calling `channels_from_bayer_array`, as follows (a fuller sketch that captures a raw frame and feeds it to the modified function follows after this snippet):
```python
cam_version = {
    "RP_ov5647": 1,
    "RP_imx219": 2,
}[camera.exif_tags["IFD0.Model"]]
```
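Putting the two pieces together, the overall flow could look roughly like the sketch below. To be clear, this is not the server's actual capture path: it follows the picamera(x) raw Bayer recipe (PiBayerArray with bayer=True), collapses the colour planes back into a 2D mosaic, and assumes that the modified `channels_from_bayer_array` above accepts that 2D mosaic together with the camera version.

```python
import numpy as np
import picamerax
import picamerax.array

with picamerax.PiCamera() as camera:
    # Identify the sensor, exactly as in the picamera(x) raw Bayer recipe
    cam_version = {
        "RP_ov5647": 1,  # PiCamera V1.3
        "RP_imx219": 2,  # PiCamera V2
    }[camera.exif_tags["IFD0.Model"]]

    # Capture a frame with the raw Bayer data appended to the JPEG
    stream = picamerax.array.PiBayerArray(camera)
    camera.capture(stream, "jpeg", bayer=True)

    # stream.array holds one non-zero colour per pixel; summing over the colour
    # axis collapses it back into the 2D Bayer mosaic
    raw_mosaic = stream.array.sum(axis=2).astype(np.uint16)

# Feed the mosaic and sensor version to the modified function outlined above
channels = channels_from_bayer_array(raw_mosaic, cam_version)
```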
With the implementation of this straightforward adjustment, the Auto White Balance (AWB) functionality of the OpenFlexure-Microscope-Server can be rectified, ensuring compatibility with both PiCamera V1.3 and V2 versions. Figures C and F serve as prime examples of the successful AWB procedure, displaying significantly improved image quality compared to the original images (A and D) and versions affected by incorrect AWB mapping (B and E).
B. Auto Flat Field Correction (Lens Shading Correction)
Let's move on to the issue that took most of my time: the Lens Shading Correction.
After reading the great paper by R.W. Bowman et al., Flat-Field and Colour Correction for the Raspberry Pi Camera Module, I began to understand the problems related to LSC, vignetting, etc., was able to rerun most of their code, and became confident that the distorted colors of my microscope images could be corrected (if not directly through the openflexure-microscope-server, then offline).
I also understood that R.W. Bowman took the main ideas about LSC from 6by9 and transferred them to picamerax and later to the openflexure-microscope-server code. Nevertheless, for some unknown reason my images after the "Auto Flat Field Correction" were no better than the original ones, even after the "corrected" AWB. My white (clean-field) images like Figure G were transformed into color-distorted images like Figure H after the Auto Flat Field Correction procedure, no matter how homogeneous my illumination was (which, to be honest, was neither perfectly homogeneous nor totally symmetric).
After further study and hands-on experimentation, I successfully ran the programs developed by 6by9 (in C) and cpixip (in Python) to generate Lens Shading Correction (LSC) tables. Additionally, I rebuilt RaspiStill.c to integrate these new tables. As a result, I achieved reasonably homogeneous grayish images when using a clean field input, marking a significant milestone in my efforts to improve color accuracy in microscope images (yehhhh! )
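For readers who have not met these tables before, the essence of what those generators compute is fairly simple. The sketch below is my own simplification, not 6by9's or cpixip's code: it assumes the input is one Bayer colour channel of a white (clean-field) capture, uses a coarse grid, and follows the ls_table.h convention that a stored value of 32 means a gain of 1.0.

```python
import numpy as np

def lens_shading_gains(channel: np.ndarray, cell: int = 64, unity: int = 32) -> np.ndarray:
    """Stripped-down sketch of building a lens shading gain table.

    Each grid cell stores the gain needed to lift its average brightness up to
    the brightest cell of the white-field capture, encoded so that `unity` (32)
    corresponds to a gain of 1.0, as in 6by9's ls_table.h.
    """
    rows, cols = channel.shape[0] // cell, channel.shape[1] // cell
    # Block-average the channel onto the coarse grid
    coarse = (channel[:rows * cell, :cols * cell]
              .astype(np.float64)
              .reshape(rows, cell, cols, cell)
              .mean(axis=(1, 3)))
    # Gain >= 1.0 everywhere; the brightest cell gets exactly unity gain
    gains = unity * coarse.max() / np.maximum(coarse, 1e-6)
    return np.clip(np.rint(gains), unity, 255).astype(np.uint8)
```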
But why didn't it happen with the algorithm implemented in the OpenFlexure Microscope Server? Why was the LSC working in RaspiStill but not in the OpenFlexure?
To gain insight into the issue, I decided to employ a non-white image, specifically a tilted grid pattern (Figure J), as the basis for the LSC procedure. This allowed me to visualize the image pattern in the LSC tables.
Upon evaluating the tables generated by the programs from 6by9 (in C) and cpixip (in Python) for the tilted grid pattern, I observed differences in their outputs. Interestingly, both sets of tables yielded perfect and similar results when integrated into the modified RaspiStill.c. However, there were distinct characteristics: the ls_table.h generated by 6by9's program exhibited weights/gains oriented like the input image (Fig. J), while the ls_table generated by cpixip's program was flipped horizontally and vertically (Figures N and O were created from the red channel of the ls_table.h produced by 6by9's and cpixip's programs, respectively).
A crucial parameter, ref_transform, was present at the end of the ls_table.h files generated by both programs. Notably, in 6by9's program ref_transform was set to 0, consistent with the ls_table being oriented like the input image. Conversely, in cpixip's program ref_transform was set to 3, reflecting the flipped input image. This parameter is passed directly to the MMAL data structure in RaspiStill.c (mmal.MMAL_PARAMETER_LENS_SHADING_T). In short:

- 6by9's program: ls_table.h oriented like the input image, ref_transform = 0
- cpixip's program: ls_table.h flipped relative to the input image, ref_transform = 3
Additionally, examining the implementation in picamerax (Figure M), I found that the value 3 was hardcoded in the `_upload_lens_shading_table` function of `camera.py`. This observation was accompanied by a comment indicating the need for further clarification on the appropriate value for ref_transform:

Line 2548: `ref_transform = 3,  # TODO: figure out what this should be properly!!!`
In summary, the issue stemmed from picamerax, used by the OpenFlexure Microscope Server, expecting a lens shading correction table that is flipped horizontally and vertically relative to the white clean-field input image. To confirm this, I used the tilted grid image (Figure J) to run the "Auto Flat Field Correction" process. The resulting "lens shading corrected" image (Figure K) clearly showed that the correction was not being applied as expected.
Rectifying this issue was relatively straightforward. By reversing the order of the second and third axes (the vertical and horizontal directions) within the `lst_from_channels` function in `recalibrate_utils.py`, the correction procedure now works as intended (a short numpy check after the two lines below illustrates what this flip does):
- Original code: `return lens_shading_table[::-1, ::, ::].copy()`
- New code: `return lens_shading_table[::-1, ::-1, ::-1].copy()`
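To make the effect of that slicing concrete, here is a tiny standalone numpy check (illustrative only; the shape is just a stand-in for the table's channels x rows x columns layout). The leading `::-1` on the first axis was already in the original line; the change adds the flips on the last two axes, which is the same as flipping each channel of the table both vertically and horizontally, i.e. a 180° rotation.

```python
import numpy as np

# Stand-in for a lens shading table with shape (channels, rows, columns)
table = np.arange(2 * 3 * 4).reshape(2, 3, 4)

flipped = table[:, ::-1, ::-1]

# Reversing the last two axes flips each channel up-down and left-right,
# i.e. rotates it by 180 degrees
assert np.array_equal(flipped, np.flip(table, axis=(1, 2)))
assert np.array_equal(flipped[0], np.rot90(table[0], 2))
```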
With this simple adjustment, the Auto Flat Field Correction procedure now functions properly with my PiCamera V1.3. Figure L depicts the tilted grid image as input, while Figure I shows the result with the white clean field image as input.
While this adjustment resolves the issue for PiCamera V1.3, I remain somewhat confused about how the current Auto Flat Field Correction code operates with a PiCamera V2. Given that picamerax expects a flipped lens_shading_table, I wonder how this might affect the correction process for PiCamera V2. Unfortunately, without access to a PiCamera V2 for testing, I am unable to ascertain the exact implications.
I am eager to hear insights from experts like @r.w.bowman, whose expertise could shed light on this matter and provide valuable clarification.
Dirty temporary corrections to make AWB and LSC work for PiCamera V1.3 (hardcoded)
- Open the file `recalibrate_utils.py`, located in the directory /var/openflexure/application/openflexure-microscope-server/openflexure_microscope/api/default_extensions/picamera_autocalibrate of your Raspberry Pi, with your preferred text editor.
- Go to line 256 and replace the mapping:
  - From: `bayer_pattern: List[Tuple[int, int]] = [(0, 0), (0, 1), (1, 0), (1, 1)]`
  - To: `bayer_pattern: List[Tuple[int, int]] = [(0, 1), (1, 1), (0, 0), (1, 0)]`
- Go to line 354 and replace the return statement:
  - From: `return lens_shading_table[::-1, :, :].copy()`
  - To: `return lens_shading_table[::-1, ::-1, ::-1].copy()`
- Restart the OFM server with `ofm restart`, and you can then test the Automatic Calibration of your cheap PiCamera V1.3 within your OpenFlexure Microscope. (Alternatively, the small script sketched below applies the same two edits for you.)
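For convenience, here is a small helper script that performs the same two substitutions. This is only a sketch based on the exact strings quoted above: the line contents may differ between server versions, so check the file first, keep the backup it creates, and run it with sufficient permissions (e.g. with sudo).

```python
from pathlib import Path
import shutil

PATH = Path(
    "/var/openflexure/application/openflexure-microscope-server/"
    "openflexure_microscope/api/default_extensions/picamera_autocalibrate/"
    "recalibrate_utils.py"
)

# Keep an untouched copy next to the original before editing anything
shutil.copy(PATH, PATH.with_name(PATH.name + ".bak"))

text = PATH.read_text()
# AWB fix: remap the Bayer offsets for the OV5647 (PiCamera V1.3)
text = text.replace(
    "bayer_pattern: List[Tuple[int, int]] = [(0, 0), (0, 1), (1, 0), (1, 1)]",
    "bayer_pattern: List[Tuple[int, int]] = [(0, 1), (1, 1), (0, 0), (1, 0)]",
)
# LSC fix: flip the lens shading table before it is handed to picamerax
text = text.replace(
    "return lens_shading_table[::-1, :, :].copy()",
    "return lens_shading_table[::-1, ::-1, ::-1].copy()",
)
PATH.write_text(text)
print("Patched", PATH)
```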
I trust that this (too extensive) post has proven to be worthwhile, and I hope that this information may help someone. While I have only just begun this journey, dedicating a week+ to delving deeply into the intricacies of Auto White Balance (AWB) and Lens Shading Correction (LSC) is merely a small fraction of the countless hours that many of you have dedicated to advancing our collective understanding.
Best regards,
PEB
Figures: