Camera lenses suffer from aberrations that can cause noticeable blur in captured images, especially at large apertures; at very small apertures, sharpness is instead limited by diffraction. Blur due to camera optics can be removed by deconvolution if PSFs that accurately characterize this blur are available. In this paper we describe a new system for estimating optics-blur PSFs efficiently. It consists of a new test chart with black square tiles containing a random pattern of small white circles, together with software for fully automatic processing of captured images of this chart. The system can process a high-resolution image of the chart and produce PSF estimates for a dense set of field positions covering the entire image frame within a couple of minutes. It has been tested with several different lenses, and the estimated PSFs have been successfully used to remove optics blur from a variety of images.
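Given an accurate PSF, the deconvolution step referred to above can be sketched with a standard frequency-domain Wiener filter (a generic technique, not necessarily the one used with this system; the box PSF, image size, and noise-to-signal ratio below are illustrative assumptions):

```python
import numpy as np

def wiener_deconvolve(blurred, psf, nsr=1e-3):
    """Deblur an image with a known PSF via a frequency-domain Wiener filter.
    nsr is the assumed noise-to-signal power ratio (regularization)."""
    kh, kw = psf.shape
    # Embed the PSF in a full-size kernel and center it at the origin
    kernel = np.zeros(blurred.shape)
    kernel[:kh, :kw] = psf
    kernel = np.roll(kernel, (-(kh // 2), -(kw // 2)), axis=(0, 1))
    H = np.fft.fft2(kernel)
    G = np.fft.fft2(blurred)
    # Wiener filter: F = conj(H) * G / (|H|^2 + nsr)
    F = np.conj(H) * G / (np.abs(H) ** 2 + nsr)
    return np.real(np.fft.ifft2(F))

# Synthetic demo: blur a point source with a 3x3 box PSF, then restore it
psf = np.ones((3, 3)) / 9.0
img = np.zeros((32, 32))
img[16, 16] = 1.0

kernel = np.zeros(img.shape)
kernel[:3, :3] = psf
kernel = np.roll(kernel, (-1, -1), axis=(0, 1))
blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(kernel)))

restored = wiener_deconvolve(blurred, psf, nsr=1e-6)
```

In practice the PSF varies across the image frame, which is why a dense grid of per-field-position PSF estimates matters; a spatially varying deblur would apply a locally appropriate PSF rather than the single global kernel used here.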
It is well known that optical proximity effects are highly dependent upon the details of the illumination
source. Tremendous effort is taken to match illumination source profiles between tools, as well as to
appropriately represent the source intensity distribution in the models used for OPC and post-OPC
verification. OPC software typically models the intensity profile in such a manner that empirical fitting of
the CD data during model calibration can result in a representation of the "effective" source. In some
cases, an actual measured source profile is available and can be referenced directly in the OPC recipe.
However, it is common to average the four quadrants of a measured source profile so that the source
representation is symmetrical about the <i>x</i> and <i>y</i> axes. This is done so that optical proximity correction can
be applied hierarchically, with a single correction applied to a cell which may be instantiated in multiple
orientations within the chip. It has generally been accepted that the runtime benefit of this
symmetrization outweighs any potential accuracy loss for cells oriented in different
directions. In this paper, we investigate the impact of real source profile asymmetries on identical features
with different orientations.
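The quadrant averaging described above can be illustrated as follows (the grid size is hypothetical, and a real flow would operate on a measured pupil-plane intensity map):

```python
import numpy as np

def symmetrize_source(source):
    """Average a pupil-plane intensity map with its mirror images about
    the x and y axes -- equivalent to averaging the four quadrants."""
    return (source
            + np.flipud(source)
            + np.fliplr(source)
            + np.flipud(np.fliplr(source))) / 4.0

# Hypothetical measured source on an odd grid so the axes pass through pixels
src = np.random.default_rng(1).random((65, 65))
sym = symmetrize_source(src)
```

The symmetrized map is invariant under reflection about both axes, which is exactly the property that lets a single correction be reused for cells instantiated in multiple orientations; the asymmetries discarded by this averaging are what the investigation above examines.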
Superresolution techniques attempt to reconstruct a higher-resolution image (or video) from a low-resolution input video sequence. The motivation is that camera and scene motion cause the temporal frames of the source video sequence to contain similar but not identical information. In this paper, we present a novel warping-based synthesis combined with an error-correction scheme that reconstructs an SR video from an input video sequence as an alternative to the conventional `sliding window' approach. Simulation results demonstrate that the proposed scheme dramatically reduces the computational load with little or no loss of reconstruction quality.
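As a rough illustration of the warping idea (not the paper's actual synthesis or error-correction scheme), a dense motion field can be used to backward-warp a previously reconstructed frame instead of re-running a full sliding-window reconstruction; the nearest-neighbor sampling and constant flow below are simplifying assumptions:

```python
import numpy as np

def backward_warp(frame, flow):
    """Backward-warp `frame` by a dense flow field (nearest-neighbor).
    flow[y, x] = (dy, dx): output pixel (y, x) is sampled from
    frame[y + dy, x + dx], clipped at the borders."""
    h, w = frame.shape
    ys, xs = np.indices((h, w))
    sy = np.clip(np.rint(ys + flow[..., 0]).astype(int), 0, h - 1)
    sx = np.clip(np.rint(xs + flow[..., 1]).astype(int), 0, w - 1)
    return frame[sy, sx]

# Demo: a constant (2, 3) flow shifts content toward the top-left of the output
frame = np.arange(100, dtype=float).reshape(10, 10)
flow = np.full((10, 10, 2), (2.0, 3.0))
out = backward_warp(frame, flow)
```

A warped prediction like this is cheap relative to a full multi-frame reconstruction; an error-correction stage would then repair only the regions where the prediction fails (occlusions, flow errors).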
The performance of superresolution video enhancement relies heavily on the robustness and accuracy of motion estimation. In this paper, we propose a novel and efficient block-matching motion estimation algorithm suited to the general motion present in low-resolution video frames. We exploit the spatial correlations between motion vectors and apply a coarse-to-fine multi-stage scheme to obtain dense motion fields. We incorporate this motion estimation technique into the Projection Onto Convex Sets (POCS) superresolution framework. Experimental results show that the resulting high-resolution images are visually sharper and achieve significant PSNR improvement.
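A minimal full-search block matcher conveys the basic operation underlying such schemes (the block size, search range, and single-stage full search are simplifications, not the coarse-to-fine algorithm of the paper):

```python
import numpy as np

def block_match(ref, cur, block=8, search=4):
    """Full-search block matching: for each block of `cur`, find the
    displacement into `ref` that minimizes the sum of absolute
    differences (SAD)."""
    h, w = cur.shape
    motion = np.zeros((h // block, w // block, 2), dtype=int)
    for by in range(h // block):
        for bx in range(w // block):
            y, x = by * block, bx * block
            blk = cur[y:y + block, x:x + block]
            best = (np.inf, 0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    ry, rx = y + dy, x + dx
                    if ry < 0 or rx < 0 or ry + block > h or rx + block > w:
                        continue  # candidate block falls outside the frame
                    sad = np.abs(ref[ry:ry + block, rx:rx + block] - blk).sum()
                    if sad < best[0]:
                        best = (sad, dy, dx)
            motion[by, bx] = (best[1], best[2])
    return motion

# Demo: shift a random frame by (2, 3) and recover the motion vectors
rng = np.random.default_rng(0)
ref = rng.random((32, 32))
cur = np.roll(ref, (2, 3), axis=(0, 1))  # moved down 2 pixels, right 3
mv = block_match(ref, cur, block=8, search=4)
```

Interior blocks recover the displacement (-2, -3) back into the reference frame. A coarse-to-fine multi-stage version would run this on downsampled frames first and refine the predicted vectors at each finer level, exploiting the spatial correlation between neighboring motion vectors to shrink the search range.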
Conference Committee Involvement (2)
Digital Photography X
3 February 2014 | San Francisco, California, United States
Digital Photography IX
4 February 2013 | Burlingame, California, United States