We propose accelerated implementations of the bilateral filter (BF) and nonlocal means (NLM), called the color-compressive bilateral filter (CCBF) and color-compressive nonlocal means (CCNLM). CCBF and CCNLM are random filters whose Monte Carlo-averaged output images are identical to the output images of conventional BF and NLM, respectively. However, CCBF and CCNLM are considerably faster because the spatial processing of multiple color channels is combined into a single random filtering process. This implies that the complexity of CCBF and CCNLM is less sensitive to color dimension (e.g., hyperspectral images) relative to other BF and NLM methods. We experimentally verified that the execution times of CCBF and CCNLM are shorter than those of the existing “fast” implementations of BF and NLM, respectively.
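For reference, the conventional BF that CCBF accelerates can be sketched in a few lines. The following brute-force grayscale version, with illustrative kernel parameters, is a baseline only, not the paper's randomized color-compressive implementation:

```python
import numpy as np

def bilateral_filter(img, radius=2, sigma_s=2.0, sigma_r=0.1):
    """Brute-force grayscale bilateral filter (illustrative baseline)."""
    H, W = img.shape
    pad = np.pad(img, radius, mode='edge')
    out = np.empty_like(img)
    # Precompute the spatial Gaussian kernel over the window.
    y, x = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    gs = np.exp(-(x**2 + y**2) / (2 * sigma_s**2))
    for i in range(H):
        for j in range(W):
            patch = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            # Range kernel: down-weight neighbors with different intensities.
            gr = np.exp(-(patch - img[i, j])**2 / (2 * sigma_r**2))
            w = gs * gr
            out[i, j] = np.sum(w * patch) / np.sum(w)
    return out
```

The randomized methods in the paper avoid repeating this per-channel spatial pass for every color channel, which is where the claimed speedup comes from.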
Fourier multispectral filtering is an attractive means of acquiring an approximation of the full spectral signal with very few measurements. In our previous work, we demonstrated this technique using photodetectors fabricated in-house with a single dielectric thin film acting as a Fabry-Perot sinusoidal filter. The main disadvantage of such filters is the design trade-off between the oscillation amplitude and the purity of the sinusoidal transmission spectrum. In this work we demonstrate a more attractive technique using liquid crystals (LCs). The birefringent response of the LCs naturally lends itself to a nearly perfect sinusoidal transmission function with high contrast, ideally oscillating between 0% and 100% transmission. Furthermore, the period of the oscillations can be controlled by the applied voltage, making it possible to continuously tune the transmission spectrum and acquire a high-resolution spectrum of the target signal. Our design consists of 16 custom photodetectors integrated with individually addressable LC pixels. The whole system can be integrated onto a single chip, enabling high-resolution spectroscopy in portable electronics.
Image correspondence is established by “matching” the feature descriptors of the interest points in the target image to those of the reference image. By acceptance testing, we refer to a post-matching hypothesis test used to screen out potential false matches, conventionally using descriptor test statistics. We propose a new acceptance testing strategy that does not rely exclusively on the descriptor test statistics. Our contribution is to demonstrate that, unlike feature matching, acceptance testing may incorporate additional photometric values of the scene to improve the recall rate. We show experimentally that an acceptance testing strategy incorporating image feature detection statistics that are usually excluded from the feature descriptor vectors, which we refer to as detector response-ratio thresholding, has superior recall–precision performance compared to state-of-the-art feature extraction techniques.
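A minimal sketch of how such an acceptance test might combine the classical descriptor distance-ratio test with a detector response-ratio threshold. The function name, thresholds, and the exact form of the response test are hypothetical, not the paper's formulation:

```python
import numpy as np

def accept_matches(desc_dist_ratio, resp_ref, resp_tgt,
                   ratio_thresh=0.8, resp_ratio_thresh=2.0):
    """Accept a putative match only if it passes BOTH the classical
    descriptor distance-ratio test and a detector response-ratio test.
    Hypothetical formulation; thresholds are illustrative."""
    desc_ok = desc_dist_ratio < ratio_thresh            # Lowe-style ratio test
    r = resp_ref / resp_tgt
    resp_ok = np.maximum(r, 1.0 / r) < resp_ratio_thresh  # responses must be similar
    return desc_ok & resp_ok
```

The intuition is that a keypoint's detector response (e.g., corner strength) is a photometric quantity normally discarded after detection, yet a genuine correspondence should exhibit similar responses in both images.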
The goal of this work is to achieve an ultra-miniature spectrometer with a footprint of less than 1 mm × 1 mm. The design is based on a novel technique known as Fourier spectral filtering. The sensor consists of a group of photodetectors integrated with different Fourier filters. Each filter, combined with the intrinsic spectral response of the silicon photodetector, is designed to have a unique sinusoidal transmission function. By utilizing multiple sinusoids and collecting the data from all of the detectors, a Fourier transform can be performed and the spectral content of the signal can be extracted.
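The reconstruction principle can be illustrated numerically: with sinusoidal filters at harmonically related frequencies, the detector readings are, up to affine offsets, Fourier coefficients of the incident spectrum. The following self-contained sketch assumes an idealized band-limited spectrum and ideal 0-100% sinusoidal transmissions, both of which are simplifications for illustration:

```python
import numpy as np

# Wavelength axis normalized to [0, 1) over the sensor's band.
L = 256
lam = np.arange(L) / L

# A band-limited test spectrum (so a handful of filters suffice).
spectrum = 1.0 + 0.5 * np.cos(2 * np.pi * lam) + 0.3 * np.sin(4 * np.pi * lam)

def detector_reading(transmission):
    # Each photodetector integrates the spectrum through its filter.
    return np.mean(spectrum * transmission)

K = 3                                   # number of harmonics measured
m0 = detector_reading(np.full(L, 0.5))  # flat 50% "DC" filter
recon = np.full(L, 2 * m0)              # start from the recovered mean level
for k in range(1, K + 1):
    # Sinusoidal filters oscillating between 0% and 100% transmission.
    mc = detector_reading(0.5 * (1 + np.cos(2 * np.pi * k * lam)))
    ms = detector_reading(0.5 * (1 + np.sin(2 * np.pi * k * lam)))
    # Fourier coefficients of the spectrum follow directly from the readings.
    recon += 4 * (mc - m0) * np.cos(2 * np.pi * k * lam)
    recon += 4 * (ms - m0) * np.sin(2 * np.pi * k * lam)
```

Because the test spectrum contains only the measured harmonics, the reconstruction is exact here; a real signal would be recovered to the band limit set by the number of filters.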
Multispectral imaging has the capability to identify the state of objects based on their spectral characteristics. These are features not available with conventional color imaging based on metameric RGB (red, green and blue) colors alone. Current multispectral imaging systems use narrowband filters to capture the spectral content of a scene, which necessitates different filters to be designed and applied for each application. Previously, we demonstrated the concept of Fourier multispectral imaging using filters with sinusoidally varying transmittance [1, 2]. In this paper, we report the design of a five-channel, spatially multiplexed pixel filter array. This enables single-shot imaging and makes it possible to capture scenes containing moving objects.
Multispectral imaging beyond the three RGB colors remains a challenge, especially in portable, inexpensive systems. In this paper, we describe the design and fabrication of broadband multichroic filters that have sinusoidal transmission spectra, enabling a novel methodology based on Fourier spectral reconstruction in the frequency domain. Since the spectral filters are posed as an optimal sampling of hyperspectral images, they also allow for the reconstruction of the full spectrum by subsequent demosaicking algorithms. Unlike conventional color filter arrays (CFAs), which utilize absorbing dyes embedded in a polymeric material, the sinusoidal multichroic filters require an all-dielectric interference filter design. However, the goal of most dielectric filter designs is to achieve sharp transitions with high contrast; a smoothly varying sinusoidal transition is more difficult with conventional approaches. This can nevertheless be achieved by trading off the contrast. Following the principles of a simple Fabry-Perot cavity, we have designed and built interference filters with 0.5 to 3 sinusoidal periods over the 450 nm to 900 nm spectral range. In order to maintain a uniform period across the entire spectrum, the material must have very low dispersion; in this design, we used ZnS as the cavity material. The six filters have been used in a multispectral imaging test bed.
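The contrast-versus-purity trade-off mentioned above can be made explicit with the standard Airy transmission of an ideal lossless Fabry-Perot cavity (a textbook result, not specific to the fabricated filters):

```latex
T(\lambda) = \frac{1}{1 + F \sin^{2}(\delta/2)},
\qquad
\delta = \frac{4\pi n d}{\lambda},
\qquad
F = \frac{4R}{(1-R)^{2}},
```

where $n$ and $d$ are the cavity index and thickness and $R$ is the mirror reflectance. For low $R$ (small finesse coefficient $F$), $T \approx 1 - F/2 + (F/2)\cos\delta$, which is nearly a pure sinusoid in $\delta$ but has a small oscillation amplitude $F/2$; increasing $R$ raises the contrast at the cost of higher-order harmonics that degrade the sinusoidal purity. A low-dispersion cavity material such as ZnS keeps $n$ nearly constant, so the period of $\delta$ stays uniform across the band.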
Microgrid polarimetric imagers sacrifice spatial resolution for sensitivity to states of linear polarization. We have recently shown that a 2 × 4 microgrid analyzer pattern sacrifices less spatial resolution than the conventional 2 × 2 case without compromising polarization sensitivity. In this paper, we discuss the design strategy that uncovered the spatial resolution benefits of the 2 × 4 array.
Noise is present in all image sensor data. The Poisson distribution models the stochastic nature of the photon arrival process, while readout/thermal noise is commonly approximated by additive white Gaussian noise (AWGN). Other sources of signal-dependent noise, such as Fano and quantization noise, also contribute to the overall noise profile. Questions remain, however, about how best to model the combined sensor noise. Although additive Gaussian noise with signal-dependent variance (SD-AWGN) and Poisson corruption are two widely used models for approximating the actual sensor noise distribution, the justification given for these models is based on limited evidence. The goal of this paper is to provide a more comprehensive characterization of random noise. We conclude by presenting concrete evidence that the Poisson model is a better approximation to real camera noise than SD-AWGN, and we suggest a further modification to the Poisson model that may improve it.
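The practical difference between the two models is easy to demonstrate: a Poisson variate and an SD-AWGN variate can be matched in mean and variance, yet they differ in higher-order moments such as skewness. A small simulation with an illustrative rate value:

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 5.0                     # mean photon count at a pixel (illustrative)
n = 200_000

poisson = rng.poisson(lam, n).astype(float)
# SD-AWGN surrogate: Gaussian with the same signal-dependent variance.
sd_awgn = lam + rng.normal(0.0, np.sqrt(lam), n)

def skewness(x):
    x = x - x.mean()
    return np.mean(x**3) / np.mean(x**2)**1.5

# Both samples match in mean and variance, but the Poisson sample is
# right-skewed (theoretical skewness 1/sqrt(lam) ~ 0.45), a feature the
# Gaussian surrogate cannot capture at low photon counts.
```

At high counts the two models converge; the distinction matters most in the low-light regime where the Poisson skewness is non-negligible.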
Owing to the stochastic nature of discrete processes such as photon counts in imaging, real-world data measurements often exhibit heteroscedastic behavior. In particular, time series components and other measurements may frequently be assumed to be non-i.i.d. Poisson random variables whose rate parameter is proportional to the underlying signal of interest; witness the literature in digital communications, signal processing, astronomy, and magnetic resonance imaging applications. In this work, we show that certain wavelet and filterbank transform coefficients corresponding to vector-valued measurements of this type are distributed as sums and differences of independent Poisson counts, taking the so-called Skellam distribution. While exact estimates rarely admit analytical forms, we present Skellam mean estimators under both frequentist and Bayes models, as well as computationally efficient approximations and shrinkage rules, that may be interpreted as Poisson rate estimation methods performed in certain wavelet/filterbank transform domains. This suggests a promising approach to denoising Poisson counts in the above-mentioned applications.
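The key distributional fact is simple to verify numerically: the difference of two independent Poisson counts is Skellam-distributed, with mean μ₁ − μ₂ and variance μ₁ + μ₂. A quick check with illustrative rates:

```python
import numpy as np

rng = np.random.default_rng(1)
mu1, mu2 = 8.0, 3.0            # Poisson rates (illustrative)
n = 100_000

x = rng.poisson(mu1, n)
y = rng.poisson(mu2, n)
d = x - y                      # Haar-type difference: Skellam(mu1, mu2)

# Skellam moments: E[d] = mu1 - mu2, Var[d] = mu1 + mu2.
```

This is exactly the situation in a Haar-type filterbank, where detail coefficients are differences of neighboring Poisson counts.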
Recent developments in spatio-spectral sampling theory for color imaging devices show that the choice of color filter array largely determines the spatial resolution of color images achievable by subsequent processing schemes such as demosaicking and image denoising. This paper highlights the cost-effectiveness of a new breed of color filter array patterns based on this sampling theory by detailing an implementation of the demosaicking method consisting entirely of linear elements and comprising a total of only ten add operations per full-pixel reconstruction. With color fidelity that rivals state-of-the-art interpolation methods and complexity near that of bilinear interpolation, this joint sensor-demosaicking solution for digital camera architectures can simultaneously fulfill the image quality and complexity needs of future digital multimedia.
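The ten-add pipeline itself is not reproduced here, but the flavor of a purely linear demosaicking step can be conveyed with the classical bilinear baseline against which such methods are compared; the RGGB Bayer layout below is an illustrative assumption:

```python
import numpy as np

def demosaic_green_bilinear(mosaic):
    """Bilinear interpolation of the green channel of an RGGB Bayer
    mosaic: a classical linear baseline, not the paper's ten-add method."""
    H, W = mosaic.shape
    green = np.zeros_like(mosaic, dtype=float)
    # Green samples sit where (row + col) is odd in an RGGB layout.
    gy, gx = np.mgrid[0:H, 0:W]
    gmask = (gy + gx) % 2 == 1
    green[gmask] = mosaic[gmask]
    # At red/blue sites, average the four green neighbors (4 adds, 1 shift).
    pad = np.pad(green, 1, mode='edge')
    est = (pad[:-2, 1:-1] + pad[2:, 1:-1] +
           pad[1:-1, :-2] + pad[1:-1, 2:]) / 4.0
    green[~gmask] = est[~gmask]
    return green
```

Like the method in the paper, this uses only additions and shifts per reconstructed sample, which is what makes such pipelines attractive for low-cost hardware.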
Owing to the properties of joint time-frequency analysis that compress energy and approximately decorrelate temporal redundancies in sequential data, filterbanks and wavelets are popular and convenient platforms for statistical signal modeling. Motivated by prior knowledge and empirical studies, much of the emphasis in signal processing has been placed on the choice of the prior distribution for these transform coefficients. In this paradigm, however, the issues pertaining to the loss of information due to measurement noise are difficult to reconcile because the effects of point-wise signal-dependent noise permeate across scales and through multiple coefficients. In this work, we show how a general class of signal-dependent noise can be characterized to arbitrary precision in a Haar filterbank representation, and we develop the corresponding maximum a posteriori estimate of the underlying signal. Moreover, the structure of the noise in the transform domain admits a variant of Stein's unbiased estimate of risk conducive to processing the corrupted signal in the transform domain. We discuss estimators involving Poisson processes, a situation that arises often in real-world applications such as communications, signal processing, and imaging.
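The reason a Haar filterbank is convenient for Poisson-type noise can be seen in a few lines: unnormalized pairwise sums of independent Poisson counts remain Poisson at every scale (and the pairwise differences are Skellam), so the noise law stays exactly characterizable through the whole pyramid. A small simulation sketch with an illustrative rate:

```python
import numpy as np

rng = np.random.default_rng(2)
rate = 4.0
x = rng.poisson(rate, (50_000, 8)).astype(float)  # many 8-sample Poisson signals

# Unnormalized Haar analysis: pairwise sums (scaling coefficients) and
# differences (detail coefficients). Sums of independent Poisson counts
# stay Poisson, so the exact noise law is known at every level.
s, levels = x, []
while s.shape[1] > 1:
    d = s[:, 0::2] - s[:, 1::2]   # detail: Skellam-distributed
    s = s[:, 0::2] + s[:, 1::2]   # scaling: Poisson with doubled rate
    levels.append((s, d))
```

The variance-equals-mean property of the scaling coefficients at each level (checked below) is what permits the pointwise signal-dependent noise to be tracked across scales rather than merely approximated.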