Charge-coupled device (CCD) technology has long been the technology of choice for high-quality image sensing. CCDs use a special manufacturing process that allows charge to be transported across the chip without distortion, which yields sensors of very high fidelity and light sensitivity. Their drawbacks are high power consumption and the lack of on-chip processing capability. With shrinking CMOS feature sizes, it has become possible to add on-chip control and processing units and so obtain a fully integrated camera on a single chip; for these reasons, CMOS technology has gained potential for use in many applications. CMOS image sensors (CIS) are fabricated with standard manufacturing processes, the same processes used to make most microprocessors. Because of this difference, CMOS sensors have traditionally had lower quality, lower resolution and higher noise. To obtain high-quality images, it is therefore important to analyze the types and sources of noise in CMOS image sensors and the corresponding noise-reduction methods. This paper discusses noise-control techniques for the various noise sources; the noise-reduction methods for linear CMOS imagers and logarithmic CMOS imagers differ. An important factor limiting the performance of sensor arrays is the nonuniform response of the detectors: the fixed pattern noise caused by this nonuniform response gives uncorrected images a white-noise-degraded appearance. Nonuniformity-correction techniques are also developed and implemented in this paper to perform the necessary calibration for sensing applications. Noise reduction and nonuniformity correction are effective ways to obtain high-quality images from a CMOS image sensor.
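The nonuniformity correction described above is commonly implemented as a two-point (offset/gain) calibration. The following is a minimal sketch, assuming a dark frame (per-pixel offset) and a uniform flat-field frame (per-pixel gain) have been captured; the frame values and array sizes are illustrative, not from the paper.

```python
import numpy as np

def two_point_nuc(raw, dark, flat, eps=1e-9):
    """Two-point nonuniformity correction: subtract the per-pixel offset
    estimated from the dark frame, then normalize by the per-pixel gain
    estimated from the flat-field frame."""
    gain_map = np.maximum(flat - dark, eps)  # avoid division by zero
    return (raw - dark) * gain_map.mean() / gain_map

# Synthetic example: per-pixel gain and offset variations (fixed pattern
# noise) imposed on a uniform scene, then removed by the correction.
rng = np.random.default_rng(1)
g = rng.normal(1.0, 0.05, (32, 32))   # per-pixel gain nonuniformity
o = rng.normal(10.0, 2.0, (32, 32))   # per-pixel offset nonuniformity
scene = 100.0
dark = o                               # dark frame: offsets only
flat = g * 200.0 + o                   # flat field under uniform light
raw = g * scene + o                    # uncorrected image of the scene
corrected = two_point_nuc(raw, dark, flat)
```

After correction, the pixel-to-pixel spread collapses: the fixed pattern noise is removed and only the (uniform) scene remains.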
The multi-channel scanning radiometer on board the FY-2 geostationary meteorological satellite plays a key role in remote sensing because of its wide field of view and continuous acquisition of multi-spectral images. It is significant to evaluate image quality after the performance parameters of the imaging system are validated. Several methods of evaluating imaging quality are discussed; the most fundamental is the modulation transfer function (MTF). The MTF of a photoelectric scanning remote-sensing instrument in the scanning direction is the product of the optics transfer function (OTF), the detector transfer function (DTF) and the electronics transfer function (ETF). For image-motion compensation, the moving speed of the scanning mirror should be considered. The optical MTF measurement is performed in both the EAST/WEST and NORTH/SOUTH directions; the values are used for alignment purposes and to determine the general health of the instrument during integration and testing. Imaging systems cannot perfectly reproduce what they see and end up "blurring" the image. Many parts of the imaging system can cause blurring, among them the optical elements, the sampling of the detector itself, post-processing, and the earth's atmosphere for systems that image through it. Theoretical calculation and actual measurement show that the DTF and ETF are the main factors in the system MTF and that the imaging quality satisfies the requirements of the instrument design.
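The cascaded-MTF model above can be sketched numerically. This is an illustrative example, not the FY-2 instrument model: the OTF is taken as ideal, the DTF as the sinc response of a rectangular detector aperture, and the ETF as a first-order low-pass stand-in; the pitch and bandwidth values are assumptions.

```python
import numpy as np

def detector_mtf(f, pitch):
    # Ideal rectangular detector aperture: |sinc(f * pitch)|.
    # np.sinc is the normalized sinc, sin(pi*x)/(pi*x).
    return np.abs(np.sinc(f * pitch))

def electronics_mtf(f, f_3db):
    # First-order low-pass magnitude response as a simple ETF stand-in.
    return 1.0 / np.sqrt(1.0 + (f / f_3db) ** 2)

def system_mtf(f, otf, pitch, f_3db):
    # System MTF = OTF x DTF x ETF (scanning direction).
    return otf * detector_mtf(f, pitch) * electronics_mtf(f, f_3db)

f = np.linspace(0.0, 50.0, 51)  # spatial frequency, cycles/mm (up to Nyquist)
mtf = system_mtf(f, otf=1.0, pitch=0.01, f_3db=30.0)  # 10 um pitch, assumed
```

Because the system MTF is a product of factors each bounded by 1, it is always at or below the weakest factor, which is why a low DTF or ETF dominates the overall response.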
The CMOS active pixel sensor (APS) technology is of interest for space-borne instruments such as low-power imagers for micro-spacecraft, star trackers and machine vision for space micro-rovers. It is significant to evaluate image quality after the performance parameters of an imaging system using a CMOS APS are validated. The most fundamental measure of imaging quality is the modulation transfer function (MTF). The system MTF is the product of the optics transfer function, the detector transfer function and the electronics transfer function. In CMOS APS arrays, the pixel area consists of two functional parts. The first, which has a certain geometrical shape, is the sensing element itself: the active area that absorbs the illumination energy falling on it and converts that energy into charge carriers. The second is the control circuitry required to read out this charge. The ratio between the active area and the total pixel area is referred to as the fill factor (FF), which in an APS is less than 100% (in contrast to CCDs, where the FF can approach 100%). The preferred shape of the pixel active area is a square; however, designing the active area as a square can reduce the FF. Since the FF influences the signal and the signal-to-noise ratio (SNR), it is preferable to keep the FF as high as possible. A theoretical analysis of the MTF is performed for an L-shaped active area (the shape most commonly used), and the effects of pixel active-area shape on the imaging quality of a CMOS active pixel sensor are analyzed.
Proc. SPIE. 6833, Electronic Imaging and Multimedia Technology V
KEYWORDS: Digital signal processing, Aerospace engineering, Imaging systems, Sensors, Imaging technologies, Signal processing, Charge-coupled devices, Analog electronics, Digital electronics, Prototyping
This paper demonstrates a novel visible nephogram imaging technology for a polar-orbit platform that can operate under illumination ranging from quarter-moon light to noon sunlight. The critical technologies of the novel nephograph, and their solutions, are: (i) low-light-level imaging capability, achieved by combining a time delay and integration charge-coupled device (TDI CCD) with the push-broom imaging method; (ii) a large field of view, implemented by combining three imaging modules, each with a smaller field of view; (iii) a wide dynamic range, achieved by combining the TDI CCD with a gradient neutral density filter (NDF). On the basis of system-design analysis and trade-offs, a prototype of the novel visible nephograph for the polar-orbit platform was developed. The results of ground-demonstration experiments and tests are satisfactory, and the nephograph prototype largely meets the customer's requirements. Finally, several problems of the novel technology for space application, and their solutions, are also mentioned.
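The low-light benefit of TDI can be illustrated with a small Monte Carlo sketch: N TDI stages accumulate the signal N times while photon shot noise grows only as sqrt(N), and the read noise is added once at readout. The electron counts below are assumed values, not nephograph parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
SIGNAL_E = 50.0    # mean photoelectrons collected per stage (hypothetical)
READ_NOISE = 10.0  # read-noise electrons per readout (hypothetical)

def snr(n_stages, trials=20000):
    """Simulated SNR for a pixel after n_stages of TDI accumulation:
    Poisson shot noise on the accumulated signal, one Gaussian readout."""
    electrons = rng.poisson(SIGNAL_E * n_stages, trials).astype(float)
    electrons += rng.normal(0.0, READ_NOISE, trials)  # single readout
    return electrons.mean() / electrons.std()
```

With these numbers, 64 TDI stages raise the SNR well above the single-stage case; once shot noise dominates the read noise, the gain approaches the ideal sqrt(N).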
FY-2C is a geostationary satellite researched and developed by China. The primary advantage of a geostationary satellite is the ability to characterize radiance by obtaining numerous views of a specific earth location at any time of day. This allows the production of composite images to better monitor short-term weather. This paper describes a technique that uses multi-spectral infrared composite images from FY-2C to estimate particle emission and recognize fog at night. The radiation from particles detected by FY-2C at different wavelengths is analyzed in combination with the solar spectral irradiance. Having several spectral bands makes the analysis algorithms more complex and inefficient, so it is important to choose the most representative bands. The infrared images are analyzed by applying the Karhunen-Loeve transform to the raw FY-2C data. Comparing the eigenimages of these infrared images with the visible image from the same batch shows that the IR3 data contribute most to the first eigenimage, which indicates that the newly added IR3 channel of FY-2C has greatly improved the ability to distinguish short-term weather phenomena. Producing composite images by calculation and analysis over sequential periods of time can clearly show changes in fog coverage. The improvements in geostationary satellite instruments that have come to pass will encourage more widespread use of these derived products in the coming years.
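The Karhunen-Loeve transform used above is, for discrete multispectral data, an eigen-decomposition of the inter-band covariance matrix; each eigenimage is a projection of the bands onto one eigenvector. A minimal sketch, with synthetic correlated bands standing in for the FY-2C channels:

```python
import numpy as np

def klt(bands):
    """Karhunen-Loeve transform of a (n_bands, H, W) image stack.
    Returns the eigenimages (principal components, same shape) and the
    eigenvalues (component variances) in descending order."""
    X = bands.reshape(len(bands), -1).astype(float)  # one row per band
    X -= X.mean(axis=1, keepdims=True)               # remove band means
    cov = X @ X.T / X.shape[1]                       # inter-band covariance
    w, V = np.linalg.eigh(cov)                       # eigh: ascending order
    order = np.argsort(w)[::-1]                      # sort descending
    comps = V[:, order].T @ X                        # project onto eigenvectors
    return comps.reshape(bands.shape), w[order]

# Synthetic stack: three bands sharing one correlated structure plus noise,
# so the first eigenimage should capture most of the variance.
rng = np.random.default_rng(2)
base = rng.normal(size=(32, 32))
bands = np.stack([2.0 * base + 0.1 * rng.normal(size=(32, 32)),
                  1.0 * base + 0.1 * rng.normal(size=(32, 32)),
                  0.5 * base + 0.1 * rng.normal(size=(32, 32))])
comps, w = klt(bands)
```

Ranking the channels by their weight in the leading eigenvector is the kind of analysis that identifies which band (IR3, in the paper's data) contributes most to the first eigenimage.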
Geostationary satellites play a key role in observing the spatial and temporal variations of surface and atmospheric features, which are important for monitoring short-term weather. While limited in spatial resolution compared with low-earth-orbiting satellites, the geostationary platform provides a significant advantage: the ability to characterize radiance by obtaining numerous views of a specific earth location at any time of day. This allows the production of composite images to better monitor short-term weather. It will be shown that only 14% of days were missed in retrievals, compared with a likely miss rate of 60-70% for a polar-orbiting satellite. Mie theory is applied to calculate the scattering characteristics of dust particles and to analyze the dust-channel selection of the meteorological satellite. The visible channel is the primary channel available for dust-aerosol observation, and the shortwave channel also has some sensitivity to aerosols; optically thick dust can likewise have a significant effect on infrared radiation. The radiative temperature of dust differs between the shortwave infrared and the longwave infrared because emissivity is a function of wavelength. We compare the emissivity of rock, soils, vegetation, water and ice in the shortwave and longwave channels, referring to the paper of John W. Salisbury; the difference in emissivity is most obvious for soils. We therefore apply the difference of two wave bands to distinguish dust storms, and the practical difference images prove the point. The geostationary satellite is a main tool for distinguishing short-term weather phenomena such as dust storms because of its observational frequency and coverage.
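The two-band difference test described above reduces, in its simplest form, to thresholding the per-pixel brightness-temperature difference between two infrared channels. A hedged sketch; the threshold value and band roles are illustrative assumptions, not values from the paper:

```python
import numpy as np

def dust_mask(bt_a, bt_b, threshold=-0.5):
    """Flag candidate dust pixels: return a boolean mask where the
    band-A minus band-B brightness-temperature difference (in kelvin)
    falls below the threshold, exploiting the wavelength-dependent
    emissivity of dust relative to the underlying surface."""
    return (bt_a - bt_b) < threshold

# Tiny synthetic scene: one pixel where band A is depressed relative to
# band B, as over an optically thick dust plume.
bt_b = np.full((4, 4), 290.0)
bt_a = bt_b.copy()
bt_a[0, 0] = 288.0
mask = dust_mask(bt_a, bt_b)
```

In practice the threshold would be tuned against scenes with known dust coverage, and the mask combined with the visible-channel observation that the paper identifies as primary.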
The CMOS process is the mainstream technique in VLSI and offers a high level of integration. SE402 is a multifunction microcontroller that integrates image-data I/O ports, clock control, exposure control and digital signal processing into one chip, reducing the chip count and the board area required on the PCB. This paper focuses on the USB video image controller used with a CMOS image sensor and presents its application in a digital still camera.