Significance: Fourier ptychography (FP) is a computational imaging approach that achieves high-resolution reconstruction. Inspired by neural networks, many deep-learning-based methods have been proposed to solve FP problems. However, FP reconstruction quality still suffers from optical aberrations, which must be accounted for.
Aim: We present a neural network model for FP reconstruction that properly estimates the aberration and achieves artifact-free reconstruction.
Approach: Inspired by the iterative reconstruction of FP, we design a neural network model that mimics the forward imaging process of FP in TensorFlow. The sample and aberration are treated as learnable weights and optimized through back-propagation. In particular, we parameterize the aberration with Zernike terms rather than optimizing the pupil pixel-wise, which reduces the degrees of freedom of pupil recovery and yields a high-accuracy estimate. Owing to the auto-differentiation capability of the neural network, we additionally apply total variation regularization to improve the visual quality.
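The forward imaging process described above can be sketched in a few lines. The following is a minimal NumPy illustration, not the authors' TensorFlow implementation: the function names, the single defocus Zernike term, and the square pupil grid are all simplifying assumptions made here for clarity.

```python
import numpy as np

def zernike_defocus(n):
    # Simplified defocus Zernike term, 2*r^2 - 1, sampled on an n-by-n unit disk.
    y, x = np.mgrid[-1:1:1j * n, -1:1:1j * n]
    r2 = x**2 + y**2
    return (2.0 * r2 - 1.0) * (r2 <= 1.0)

def total_variation(x):
    # Anisotropic total variation: sum of absolute finite differences.
    return np.abs(np.diff(x, axis=0)).sum() + np.abs(np.diff(x, axis=1)).sum()

def fp_forward(sample, pupil_mask, zernike_coeffs, zernike_basis, kx, ky):
    # FP forward model for one LED: shift the sample spectrum by (kx, ky),
    # apply the Zernike-parameterized aberrated pupil, and return the
    # low-resolution intensity image seen by the camera.
    F = np.fft.fftshift(np.fft.fft2(sample))
    n = pupil_mask.shape[0]
    c = F.shape[0] // 2
    sub = F[c - n // 2 + ky : c + n // 2 + ky,
            c - n // 2 + kx : c + n // 2 + kx]
    aberration = sum(a * z for a, z in zip(zernike_coeffs, zernike_basis))
    pupil = pupil_mask * np.exp(1j * aberration)
    lowres = np.fft.ifft2(np.fft.ifftshift(sub * pupil))
    return np.abs(lowres) ** 2
```

In the actual method, the sample spectrum and the Zernike coefficients would be trainable variables, and the gradient of the data-fidelity loss plus the total-variation penalty would be obtained by auto-differentiation.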
Results: We validate the performance of the reported method via both simulation and experiment. Our method exhibits greater robustness against complex optical aberrations and achieves better image quality by reducing artifacts.
Conclusions: The forward neural network model can jointly recover the high-resolution sample and the optical aberration in iterative FP reconstruction. We hope our method provides a neural-network perspective for solving iterative coherent or incoherent imaging problems.
We discuss two compact, cost-effective, and field-portable ptychographic lensless imaging platforms for quantitative microscopy. In the first implementation, we use a low-cost galvo scanner to rapidly scan an unknown laser speckle pattern on the object. To address the positioning repeatability and accuracy issues, we directly recover the positional shifts of the speckle pattern based on the phase correlation of the captured images. To bypass the resolution limit set by the imager pixel size, we employ a sub-sampled ptychographic phase retrieval process to recover the complex object. In the second implementation, we place a thin diffuser between the object and the image sensor for light wave modulation. By blindly scanning the unknown diffuser to different x-y positions, we acquire a sequence of modulated intensity images for quantitative object recovery. Unlike previous ptychographic implementations, we employ a unit magnification configuration with a Fresnel number of ~50,000, which is orders of magnitude higher than those of previous ptychographic setups. The unit magnification configuration allows us to use the entire sensor area, 6.4 mm by 4.6 mm, as the imaging field of view. The ultra-high Fresnel number enables us to directly recover the positional shift of the diffuser in the phase retrieval process. In this second implementation, we use a low-cost, DIY scanning stage to perform blind diffuser modulation. We further employ an up-sampling phase retrieval scheme to bypass the resolution limit set by the imager pixel size and demonstrate a half-pitch resolution of 0.78 µm. For both implementations, we validate the imaging performance via various biological samples. The reported platforms provide cost-effective and turnkey solutions for large field-of-view, high-resolution, and quantitative on-chip microscopy. They are adaptable for a wide range of point-of-care-, global-health-, and telemedicine-related applications.
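The phase-correlation step used to recover the positional shifts of the speckle pattern can be sketched as follows. This is an illustrative NumPy version for integer-pixel translations under an assumed cyclic-shift model, not the authors' code; sub-pixel refinement and non-cyclic boundaries are omitted.

```python
import numpy as np

def phase_correlation_shift(ref, moved):
    # Estimate the integer (dy, dx) translation between two images via the
    # normalized cross-power spectrum (phase correlation). The convention is
    # chosen so that moved == np.roll(ref, (dy, dx), axis=(0, 1)).
    F1 = np.fft.fft2(ref)
    F2 = np.fft.fft2(moved)
    cross = np.conj(F1) * F2
    cross /= np.abs(cross) + 1e-12          # keep only the phase
    corr = np.abs(np.fft.ifft2(cross))      # sharp peak at the shift
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Map peaks in the upper half of the range to negative shifts.
    h, w = ref.shape
    if dy > h // 2:
        dy -= h
    if dx > w // 2:
        dx -= w
    return int(dy), int(dx)
```

Because only the phase of the cross-power spectrum is kept, the correlation peak stays sharp even for speckle-like patterns, which is what makes blind recovery of the scan positions practical.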