It depends quite a lot on what you actually want to do.
The physics/optics answer is to measure the lateral resolution of the microscope in pixels. This is not so easy to do; there is a thread First Timer High Resolution Objective Recommendation about different methods. The value you get depends on many practical parameters of your microscope. Of course, if you are only using the number to decide whether to downsample your images, then you can easily calculate the best case from the NA of your lens. The images cannot be better than that. Then you just need the conversion from micrometres to pixels on your microscope.
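As a rough sketch of that best-case calculation: the Rayleigh criterion gives the diffraction-limited lateral resolution from the NA and wavelength, and Nyquist sampling says you need at least two pixels per resolution element, so any pixels beyond that are in principle spare. The wavelength, NA, and pixel-size numbers below are just example values, not figures from this thread.

```python
def rayleigh_resolution_um(wavelength_um, numerical_aperture):
    """Rayleigh criterion: smallest resolvable separation, in micrometres."""
    return 0.61 * wavelength_um / numerical_aperture

def max_downsample_factor(wavelength_um, numerical_aperture, um_per_pixel):
    """Nyquist needs >= 2 pixels per resolution element; pixels beyond
    that could in principle be downsampled away (best case only)."""
    resolution_px = rayleigh_resolution_um(wavelength_um, numerical_aperture) / um_per_pixel
    return resolution_px / 2.0

# Example values: green light, 0.65 NA objective, 0.07 um/pixel on the sample
res = rayleigh_resolution_um(0.55, 0.65)            # ~0.52 um
factor = max_downsample_factor(0.55, 0.65, 0.07)    # ~3.7x
print(f"resolution: {res:.2f} um, safe downsample factor: {factor:.1f}x")
```

In practice the measured resolution will be worse than this, so the real safe factor is smaller than the best-case number.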
I think this number probably gives you a sensible answer for downsampling for storage, but probably not for downsampling for processing. For example, we were recently having issues with colour correction that turned out to be partly due to downsampling, and stitching will almost certainly be more robust with the extra pixels. It then becomes a question of whether there are ways to make the processing algorithms efficient. For example, @JohemianKnapsody has been working on large area stitching using minimum memory resources, and the OpenFlexure autofocus piggybacks on the MJPEG compression that is already done in the GPU.