By the razor-blade (knife-edge) method, here are my results (a quick check of the theoretical figures follows the list):
8x 0.20NA (1.4-1.7um theoretical) 2.22um measured
10x 0.22NA (1.25 - 1.53um theoretical) 1.96um measured
10x 0.30NA (0.93 - 1.12um theoretical) 1.6um measured
16x 0.40NA (0.69 - 0.84um theoretical) 1.1um measured
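The theoretical figures in that list look consistent with the Rayleigh criterion, r = 0.61 * lambda / NA, evaluated over roughly 450-550 nm. The exact wavelengths are my assumption, but a quick check reproduces the ranges closely:

```python
# Rayleigh resolution limit r = 0.61 * wavelength / NA.
# The 450-550 nm wavelength span is an assumption; the original
# "theoretical" ranges may have used slightly different values.
objectives = {"8x": 0.20, "10x (a)": 0.22, "10x (b)": 0.30, "16x": 0.40}

for name, na in objectives.items():
    r_blue = 0.61 * 0.450 / na    # um, blue end
    r_green = 0.61 * 0.550 / na   # um, green end
    print(f"{name:8s} NA {na:.2f}: {r_blue:.2f}-{r_green:.2f} um theoretical")
```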
By the bead method with 1um white beads, here are my results (a bead-fitting sketch follows the list):
8x 0.20NA (no results)
10x 0.22NA 3.9um
10x 0.30NA 3.1um
16x 0.40NA 2.2um
16x 0.40NA 1.9um (blue LED)
16x 0.40NA 1.9um (green LED)
16x 0.40NA 2.47um (red LED)
16x 0.40NA 1.37um (manual focus -200 from FAST, green LED)
16x 0.40NA 1.9um (FAST focus, green LED)
16x 0.40NA 1.9um (MEDIUM focus, green LED)
16x 0.40NA 1.6um (FINE focus, green LED)
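If the raw bead images are still available, one way to cross-check these numbers is to fit a Gaussian to a line profile through an isolated bead and then correct for the finite 1um bead diameter. The quadrature correction below is a common rough approximation, not necessarily how the values above were obtained; treat it as a sketch:

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, amp, x0, sigma, offset):
    return amp * np.exp(-(x - x0) ** 2 / (2 * sigma ** 2)) + offset

def bead_fwhm(position_um, intensity, bead_diameter_um=1.0):
    """Fit a Gaussian to a bead line profile and estimate the PSF FWHM.

    The quadrature subtraction of the bead size is only an approximation;
    it assumes both the bead image and the PSF are roughly Gaussian.
    """
    p0 = [intensity.max() - intensity.min(),       # amplitude guess
          position_um[np.argmax(intensity)],       # centre guess
          0.5,                                     # sigma guess (um)
          intensity.min()]                         # background guess
    popt, _ = curve_fit(gaussian, position_um, intensity, p0=p0)
    fwhm_measured = 2.3548 * abs(popt[2])          # 2*sqrt(2*ln2) * sigma
    # Remove the bead's own width in quadrature (rough correction).
    fwhm_psf = np.sqrt(max(fwhm_measured ** 2 - bead_diameter_um ** 2, 0.0))
    return fwhm_measured, fwhm_psf
```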
Based on the 1951 USAF target, with the 10x 0.30NA objective I reached the limit of the target's scale, 2.19um.
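For reference, the USAF 1951 scale follows lp/mm = 2^(group + (element - 1)/6), and the line width is half of one line-pair period; 2.19um corresponds to group 7, element 6, typically the smallest element on a standard target:

```python
# USAF 1951 target: resolution in line pairs per mm for a given
# group/element, and the corresponding line width in micrometres.
def usaf_line_width_um(group, element):
    lp_per_mm = 2 ** (group + (element - 1) / 6)
    return 1000.0 / (2.0 * lp_per_mm)   # one line = half a line pair

print(usaf_line_width_um(7, 6))   # ~2.19 um, the limit reported above
```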
Without purchasing a more expensive target, I would like to find a way to verify the accuracy of these tests. In my opinion the results are very good and they do correlate, but which method is more accurate?
The bead method comes closest to the 1951 target, while the razor-blade method is a much simpler test. One way to bring the two methods into better agreement might be to use a wider threshold range, say 10% to 90%, for the razor-blade edge measurement.
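On that last point: if the current razor-blade analysis uses a narrower range such as 20-80% (an assumption on my part), switching to 10-90% would indeed pull those numbers up toward the bead results, since for a Gaussian edge response the 10-90% width is about 1.09x the FWHM while the 20-80% width is only about 0.71x. A sketch of the edge-width extraction and the Gaussian conversion:

```python
import numpy as np
from scipy.special import erfinv

def edge_width(position_um, esf, low=0.10, high=0.90):
    """Width of a normalised edge-spread function between two levels.

    Assumes a monotone dark-to-bright edge profile (increasing esf).
    """
    esf_n = (esf - esf.min()) / (esf.max() - esf.min())
    x_low = np.interp(low, esf_n, position_um)
    x_high = np.interp(high, esf_n, position_um)
    return x_high - x_low

def width_to_fwhm(width, low=0.10, high=0.90):
    """Convert an edge width to FWHM, assuming a Gaussian line-spread function."""
    k = np.sqrt(2) * (erfinv(2 * high - 1) - erfinv(2 * low - 1))  # width in sigmas
    return width * 2 * np.sqrt(2 * np.log(2)) / k

# Ratios of edge width to FWHM for a Gaussian response:
# 20-80% width ~ 0.71 * FWHM, 10-90% width ~ 1.09 * FWHM.
```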
My plan is to look for papers on these methods, with an eye to their accuracy.