We are at the beginning of a new chapter in the history of printing. But first, let's look back: I would like to share with you Indigo's perspective on the past. Throughout the history of printing, the common denominator of virtually all printing processes has been liquid ink. Liquid ink yields images of high resolution and brilliant color at low cost.
Bitmap printing from digital sources is commonly termed "digital printing"; however, machine retrieval of information content from printed images is typically inaccurate and unreliable by the standards of digital information systems. For example, good optical character recognition systems provide error rates on the order of 1%. In another example, digital color copying, the loss of information compared to the original digital print file is much higher. Fundamentally, hardcopy printed for the needs of human readers does not provide a reliable digital data channel; in this sense these hardcopies are analog rather than digital documents. Embedded data technologies provide the means for encoding and retrieving digital information marked on hardcopy documents; they enable reliable information exchange between paper and electronic domains. A distinctive embedded data technology called self-clocking glyph codes has been developed by Xerox. Glyph codes provide digital data recording on hardcopy documents with high data density, robustness, pleasing appearance, and allowance for natural graphical integration with other printing. A good example of this code is composed of a rectangular lattice of mark centers, with a linear or elliptical mark on each center oriented in one of two directions for recording one bit per mark, or one of four directions for recording two bits per mark. This code has the property of uniform visual texture and appears as a uniform gray or uniform color when printed at a suitable scale for the viewing distance; for example, 5x5 pixel glyph cells at 300 dpi. Two-dimensional codes for data, synchronization, error correction, and other functions provide remarkable digital system robustness and functionality. Arbitrary digital data may be encoded to represent a vast variety of information.
Examples include data representing visible printed content such as text files or color information and nonprinted document aspects such as formatting, revision histories, mathematical relations in spread sheets, and network file references. Thus, glyph codes provide high capacity digital data channels enabling document systems to bridge electronic and hardcopy domains with the functionality and reliability that users have come to expect of digital technologies. The format and appearance of glyph codes facilitate their graphical integration in documents. Together these features enable truly digital hardcopy documents that combine excellence in appearance and function.
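The two-bits-per-mark scheme described above can be sketched in code: each data byte is split into four 2-bit symbols, and each symbol selects one of four mark orientations. This is a minimal illustration only; the orientation angles and symbol order are assumptions, not the Xerox glyph specification, which additionally defines synchronization and error-correction structure.

```python
# Hypothetical orientation set: one angle per 2-bit symbol.
ORIENTATIONS = (0, 45, 90, 135)  # degrees; illustrative assumption

def encode_glyphs(data: bytes) -> list[int]:
    """Return one mark orientation (degrees) per 2-bit symbol, MSB first."""
    angles = []
    for byte in data:
        for shift in (6, 4, 2, 0):
            symbol = (byte >> shift) & 0b11
            angles.append(ORIENTATIONS[symbol])
    return angles

def decode_glyphs(angles: list[int]) -> bytes:
    """Invert encode_glyphs: pack the 2-bit symbols back into bytes."""
    out = bytearray()
    for i in range(0, len(angles), 4):
        byte = 0
        for angle in angles[i:i + 4]:
            byte = (byte << 2) | ORIENTATIONS.index(angle)
        out.append(byte)
    return bytes(out)
```

A rendering stage would then draw a short line segment at each lattice point with the given angle; at 5x5-pixel cells and 300 dpi the result reads as a uniform gray field.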
A computer was used to examine the surface colors and compositions of Piero della Francesca's frescoes The Resurrection of Christ, St. Julien, and a section from the panel La morte di Adamo from the Leggenda della Vera Croce to electronically analyze Piero's palettes, repair cracks in his works, replace damaged areas, and identify sections in which earlier restorers changed the original work. Electronic restoration now serves as a road map for art restorers and provides the public with a closer representation of works as the artist originally intended them. The analysis also reveals surprising new insights into Piero della Francesca's The Resurrection of Christ, including the thorn-based structure of a symbolic tree, and resolves a historical controversy.
The paper explains how to use color as an effective tool to improve communication, especially in technical and business documents. The paper presents a collection of practical recommendations for color selection, semantic references, and pragmatic issues. The text also discusses a process for achieving effective use of color. Researchers and developers can use these recommendations in typical document design projects and can incorporate them into development strategies for hardware and software.
Modern radiation therapy planning depends heavily on computer-generated graphics for the display of anatomical images and for the design and display of beam placement. Hardcopy output of the planning results is a vital part of the patient's permanent record. The types of graphics involved include gray-scale images acquired from computerized x-ray tomography and other imaging modalities. These images are used to aid in localizing the volume to be treated as well as neighboring anatomy. This presentation will provide a summary of the types of images used for radiation therapy planning and an analysis of current and near-future hardcopy requirements.
Students at Michigan State University design postcard-size maps to learn about the nature of design and to gain experience in executing design decisions. Student maps demonstrate the use of color for its various functions. Both students and faculty have also engaged in a variety of color-related cartographic research projects. Research results include a printed approximation of the Munsell Student Charts for use in selecting colors for maps, a better understanding of how to design maps for people with color- vision impairments, an understanding of the effects of color on scale perception, a model for assessing the adequacy of color selections relative to problems caused by surround, and systematization of color schemes for mapping. A current project is attempting to find a set of computer monitor RGB values that approximates the characteristics of the OSA Uniform Color Scales. A review of these works serves to illustrate the role and importance of color in the art and science of mapping and also the interplay of art and science in the area of color in cartography.
The color scheme typology that I propose matches the organization of the perceptual dimensions of color to the logical orderings in data sets that are displayed graphically. The appropriate use of color in complex data displays, such as thematic maps, allows patterns to be easily observed. In this paper, ten color scheme types are matched with systematic paths through HVC perceptual color space to provide guidance on color selection for a complete range of data visualization challenges. I also describe errors in the perceptual structuring of HLS, HSV, and HSB color models commonly available for computer graphics.
A fluorescent light source (FLS) addressed by a CMOS polysilicon thin-film transistor (poly-Si TFT) driver has been fabricated for the first time. The FLS has 800 pixels at a resolution of 200 dpi. A brightness of 5000 fL has been achieved with a driving voltage of only 40 V because each pixel is driven in static mode. The feasibility of the FLS has been confirmed in a print head utilizing it. The poly-Si TFT driver has been fabricated by a low-temperature process below 600 degrees C, utilizing excimer-laser annealing. This enables the use of low-cost glass substrates and allows the driver to be integrated within the FLS. This technology therefore yields compact, low-cost print heads.
The products and technologies employed to produce hardcopy documents, and short-run reproductions of these documents, have changed dramatically in the last two decades. Impact printing devices such as typewriters, daisy wheel printers, and dot matrix printers have been replaced to a significant extent by devices employing nonimpact printing technologies such as laser electrophotography, ink jet, and thermal transfer printing processes. The dramatic growth in the use of nonimpact printers can be attributed to a variety of factors. This paper examines these factors, with an emphasis on image enhancement techniques that have been incorporated in these products and which have facilitated the broad market acceptance of nonimpact printing.
Since the invention of thermal ink jet in 1978, Hewlett-Packard has introduced a series of products based on this technology that have evolved from low-resolution monochrome all the way to medium/high-resolution color and black-and-white printers. The first HP product, introduced in 1984, had a resolution of 96 dots per inch. The latest HP offering is the DeskJet 1200C, with a 300 by 600 dots per inch resolution for black and white and 300 dpi for color. The 1200C also utilizes HP's resolution enhancement technology to provide high-quality text printing. Given this range of development, the question that arises is "how far can this technology be taken?" Certainly the capability of thermal ink jet to deliver excellent color quality at an affordable cost is hard to challenge. Expectations in this area are still very high, and even better color print quality is anticipated in the future. Although lasers deliver superb black-and-white text quality, the color capability of thermal ink jet makes the technology extremely attractive. There are two ways to look into the future of thermal ink jet. One would be to analyze the engineering issues, along with the manufacturing challenges, that will determine the practical limits of the technology; this obviously involves a discussion of product plans for the future. A second approach, taken in this summary paper, is to look at the physical limits and contrast them with the current state of the technology. To do this, four subsystems common to all printing technologies will be defined. The first, ink storage and delivery, is directly dependent on the issues that affect the life of the pen and overall print speed. Secondary concerns in this area are user re-supply and serviceability. The second subsystem is that of transfer. The transfer process involves the separation of a defined amount of colorant and the mechanism whereby that colorant is deposited on the substrate.
Fundamental questions in this area focus on ultimate dot size, i.e., resolution, and print speed. The third subsystem is that of addressing, i.e., receiving and electronically converting print data into the drive voltages for the thermal ink jet resistors. Limitations in addressing technology will have a direct bearing on the size of the printhead swath for scanning printheads, or on the cost of a page-wide array. The final, and possibly most complex, subsystem is fixing of the image to the substrate. This brings into play all of the chemistry involved in ink and paper interactions and the demand for the devices to print on a wide variety of substrates. This is a situation unlike any in commercial printing, where inks are selected for the appropriate paper stock. Technology issues here will have a strong bearing on overall print quality, with particular emphasis on thermal ink jet's ability to print color graphics and images. This paper provides a brief summary of the issues involved in these four subsystems from the point of view of the physical limits that constrain them.
'Solid' or 'hot melt' ink jet printing has become known as a true plain paper imaging process. This is because the image quality is not only excellent, but it is excellent on virtually any substrate. This results because the spot size is well-controlled on all substrates. This paper identifies and discusses those critical parameters affecting the final spot size. The bottom line is that the limiting phenomenon is freezing of the ink, which is dominated by thermal characteristics of the ink and substrate.
This paper is concerned with experimental studies on process control for driving the heating elements of bubble-jet printheads. Process control plays a key role in print quality as well as in the lifetime of bubble-jet printers. An experimental set-up was built which allows the automatic registration of a series of video images. With this set-up it is possible to study the influence of the driving signal on drop formation, from bubble nucleation to printing. The pulse width of a square reference signal was split into a burst consisting of multiple pulses with smaller pulse widths, and the delay and number of the individual pulses were varied. An optimal drive signal was determined and applied to a commercial printhead. As a consequence of this optimization, a reduction of the input energy and an increase of the droplet speed of approximately 30 percent compared to the single square pulse were achieved while the mass of the droplet remained unchanged. As a result, burst pulses with optimally modulated pulse width contribute to a substantial increase in printhead lifetime. Additionally, where satellite droplets occur, their size has been reduced and they merge with the main droplet faster, which improves print quality.
In this paper an improved three-dimensional model of a bubble-jet printhead is presented. It is based on the governing physical phenomena of bubble growth and subsequent collapse. It contains the complete heat conduction through the different layers and a realistic geometry of the firing chamber and nozzle. Furthermore, the temperature dependence of various thermodynamic parameters, such as the densities of liquid and vapor, the heat of vaporization, and the thermal conductivity of most of the substances, is included. Numerous parameter studies have been carried out; water was chosen as the test liquid to allow comparison with other models. The complicated properties of ink are also included, on the basis of empirical formulae for mixtures. For this improved model, first water/diethylene glycol mixtures with various concentration ratios and then different dyes are included in the simulations. The results are supported by experimental studies carried out in our inkjet laboratory.
The bleed of one color into another is detrimental to perceived print quality of color-printed images, and is one of the problems encountered in ink-jet color printing. Rapid absorption of ink dye and vehicle into the paper acts to prevent coalescence of color droplets, but too strong an absorption of the vehicle along the paper fibers causes spreading and feathering of the image boundary. The process is therefore very delicate and sensitive to the physical and chemical characteristics of the paper surface. In this work, color bleed of characters printed on experimental sheets by an HP 500C DeskJet printer was measured quantitatively by image analysis. The effects of variation of internal sizing on color bleed and color optical density were measured, as well as effects resulting from surface treatments with different levels of starch and polymeric surface size. Results were compared with analogous measurements for printing without an adjacent color, and also for black ink printing on the same paper. The level of starch in the surface treatment was most important in controlling color bleed, whereas surface size was most helpful in preventing image spread in black ink printing, and in increasing the optical density of both black and composite black images.
Xerox thermal ink jet print cartridges utilize a print element fabricated via VLSI technology. This fabrication process enables high productivity and precise dimensional control as well as the required electronic functionality. However, modifications to the device geometry involve a complex cycle of mask generation, VLSI fabrication, and assembly. Development of the print element is therefore assisted by experimental wafers containing TIJ devices with variations in many important geometrical parameters. Analysis of data from measurements on these experimental geometries guides subsequent designs.
This paper presents first principles simulations of xerographic imaging where photoreceptors are exposed and discharged to result in a spatial image charge distribution on the surface. The exposure stage involves the solution of the Maxwell equations for wave propagation through the lossy transport layer. The real component of the Poynting vector is then calculated to estimate the intensity of illumination. A direct boundary integral equation method (BIEM) is used to formulate and solve the coupled Helmholtz equations for the tangential component of electric field and its normal derivative. The discharge stage requires consideration of both charge conservation and current continuity. A hybrid BIEM-MOC (boundary integral equation method - method of characteristics) algorithm is used to solve for the electrostatic fields, and to track the space charge migration. Both steady- state and time-transient cases may be treated. Electron-hole pair generation is controlled by a field-dependent quantum generation efficiency. Computed results include: discharged surface voltage, contours of potential through the photoreceptor cross-section, and adjacency effects at the edges of exposed lines. This simulation is implemented within a distributed computing environment that supports interactive steering, interactive browsing and visualization, concurrent processing, iconic assembly of dataflow networks, and multi-level (process-device) simulation. Parallel Virtual Machine, a public domain parallel programming software toolkit from Oak Ridge National Laboratory, is used to attain desktop supercomputing level performance by harnessing the computational power of a heterogeneous cluster of Unix workstations into a single loosely coupled parallel computing resource.
The photoinduced discharge curve (PIDC), which represents the photoconductor surface potential as a function of the input exposure, is a critical characteristic in xerographic processes. In general, the PIDC defines a suitable exposure energy to achieve the desired photoconductor surface potential under different operating conditions. To reach the same surface potential, one can increase either the exposure energy or the flux of incident light. Two experimental methods were used to measure the PIDC of an organic photoconductor drum. One method fixes the exposure spot size and changes the exposure energy. The other fixes the flux of incident light while changing both spot size and exposure energy.
Printers based on various nonimpact printing methods are in general use. We have investigated an imaging method using the Toner Jet, in which a visible image is formed directly by using an electrostatic field to make the toner stick to the paper. Images obtained from the Toner Jet are of quality equal to those obtained by an electrophotographic method. We have confirmed that the Toner Jet method will be an effective nonimpact printing method in the near future.
The present paper attempts to generate an understanding of some of the processes involved in soft roll fusing and the analytical tools needed to describe them. Specifically, a fusing and material process model will be described and its elements discussed. With such capabilities at hand, we are in a good position to synthesize and design the fusing process and optimize performance.
ANSI/IT8, ANSI/CGATS and ISO/TC130 are actively working on standards that help define color for graphic arts applications. This paper describes the standards already in place as well as a summary of the current status of the work underway and/or planned. This includes standards that define colorimetric measurement, targets and data sets for calibration of input devices such as scanners as well as color hard copy output (including printing) devices and standards for the definition of printing processes. The work to develop characterization data sets for use by color management systems and common data formats for the exchange and presentation of this data are also discussed. Key additional areas include the definition of default three-component color data spaces, their practical encoding in 8- and 16-bit formats, and procedures for definition of ink colors in a printing independent manner.
An instrument for measuring and characterizing the color of reflective, transmissive, and emissive samples has been developed, and its application to electronic prepress and desktop publishing is described. Total color control throughout the process requires the measurement of color at each step. Reflection or transmission spectrophotometry is needed to accurately measure the color of the original. Spectral radiometry is used to measure the color of the CRT to assure that the scanned color is displayed correctly on the monitor. The printed proof is measured with a reflection spectrophotometer to assure that its color matches the original, or, if changes have been made on the CRT, that the printed output will appear the same as the CRT. Spectral radiometric measurements of the controlled lighting are needed to assure that the ambient lighting is the same when evaluating originals against the CRT display and the printed copy.
This paper is concerned with the effect of spectral range and resolution on spectral colorimetric measurements in graphic arts. The Committee for Graphic Arts Technologies Standards (CGATS.5) specifies CIE measurement conditions for graphic arts color: the data shall be measured from at least 400 nm to at least 700 nm at intervals not greater than 20 nm, and the reference for spectral data shall be based on computed data at 10 nm intervals. The values used for weighting spectral reflectance data, representing the product of CIE illuminant D50 and the 2 degree standard observer data, shall be those given in the 10 nm and 20 nm tables of ASTM E303. These values are based on triangular bandpass characteristics with 10 and 20 nm bandwidths, respectively, at the half-power point. The weighted values must be adapted if the measured spectral data begin at a wavelength greater than 360 nm or end at a wavelength less than 780 nm. Instrumentation with different spectral ranges and different intervals will produce different results. This paper compares these results for several printing and proofing systems.
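The weighted-summation colorimetry the standard prescribes reduces, per tristimulus component, to a dot product of the measured spectral reflectance with the tabulated illuminant-observer weights. A minimal sketch follows; the weight values are placeholders, not the actual ASTM D50/2-degree tables:

```python
def tristimulus(reflectance: list[float], weights: list[float]) -> float:
    """One tristimulus component: sum of R(lambda) * W(lambda) over the
    sampled wavelengths. Both tables must use the same range and interval,
    which is exactly why mismatched instrument ranges give different results."""
    if len(reflectance) != len(weights):
        raise ValueError("reflectance and weight tables must align")
    return sum(r * w for r, w in zip(reflectance, weights))
```

Comparing the result of the same reflectance curve summed against 10 nm and 20 nm weight tables illustrates the interval effect the paper measures.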
Portable spectrophotometers have been developed with experience gained from producing portable densitometers and have proven to be practical instruments for measuring color in almost any application including graphic arts. The geometry allows for calculation of density and correlation to portable densitometers, while the sphere geometry is preferred in the package printing industry for measuring printing on metallized foil substrates. This paper provides a brief review of instrument design and operation. Considerations in optical geometry, wavelength discrimination and error compensation techniques required to produce a rugged reliable instrument are discussed. Instrument performance testing is discussed with emphasis on ensuring instrument repeatability and inter-instrument agreement. Proper care and maintenance for portable instruments are also discussed including field monitoring techniques for verifying long-term performance.
A grading system for the quality of hardcopy color prints is suggested. It uses two index numbers: one describes the size of the available color space in terms of psychooptical discriminability; the other describes the size of the digitization error with respect to the detection threshold. Several examples illustrate the grading system, developed at the Lehrstuhl Feingeratebau of the Technical University of Munich. In the CIELAB color space the number of printable and distinguishable colors is determined; it can be approximately calculated from the gamut values and the contrast of black and white, and can be used as a color quality number for comparisons between hardcopy devices. A typical color inkjet printer now available on the market, using the recommended paper, reaches about half the size of the color solid available in offset printing. Digitization errors are related to the detection threshold. Contouring, texture, and positioning errors are examined separately in the frequency domain; the overall print quality is determined by the largest error beyond the detection threshold. The psychooptical basics of assessing digitization errors are summarized. The influences of the dither method and of the halftoning cell are described, and the connections between halftoning method, print resolution, and visibility of digitization errors are shown. Orthogonal halftoning cells are compared to hexagonal cells, and improvements from using different dot sizes and presentation modes are discussed.
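The first index number can be roughly illustrated as a volume count: divide the CIELAB gamut volume by the volume of a just-noticeable-difference cell. The cube-shaped JND cell below is an illustrative assumption, not the Munich group's exact formula:

```python
def distinguishable_colors(gamut_volume: float, jnd: float = 1.0) -> int:
    """Approximate count of discriminable colors in a gamut: CIELAB gamut
    volume divided by a JND-sized cube (jnd in Delta-E units)."""
    if gamut_volume < 0 or jnd <= 0:
        raise ValueError("volume must be nonnegative and jnd positive")
    return int(gamut_volume / jnd ** 3)
```

On this count, a device reaching half the offset-printing color solid scores half the index, matching the comparison quoted in the abstract.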
An analysis of why people are willing to spend more money to buy color systems versus monochrome systems shows that the colorimetric methods used in today's color management systems are insufficient. To fulfill the user's requirements, it is necessary to preserve the appearance of color when an electronic image is reproduced. After proposing formal definitions for color perception and for color appearance, I will present two problems requiring an appearance model to solve: the color selection problem, and gamut mapping.
The printing process for textile materials uses an ink set dependent on the image to be printed, referred to as the primary color palette. The colors of the printed textile material depend on the printing sequence of the ink masks and are referred to as the secondary color palette. A single primary color palette may lead to different secondary color palettes as a function of printing sequence. This paper provides an analysis of the mechanism of color appearance on printed textile materials. The analysis leads to a model that simulates, on the computer display, the appearance of the printed textile colors as a function of a number of parameters. The simulation includes a generalized Neugebauer model. A hierarchical structure is introduced for the colors of the secondary palette in order to provide the coefficients of the Neugebauer model. For a given textile material, the color hierarchy depends on the ink set and the printing sequence. The color hierarchy is established as the result of a color calibration process; printed samples are used in the calibration procedure.
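The Neugebauer model at the core of such a simulation predicts, per spectral or color channel, an area-weighted mix of the printed primaries' reflectances. A minimal single-channel sketch, assuming the fractional area coverages of the primaries (including overprints) are already known from calibration:

```python
def neugebauer(area_weights: list[float], primary_refl: list[float]) -> float:
    """Neugebauer mixing for one channel: R = sum_i a_i * R_i,
    where a_i are fractional area coverages summing to 1 and R_i are
    the measured reflectances of the corresponding printed primaries."""
    if abs(sum(area_weights) - 1.0) > 1e-9:
        raise ValueError("area coverages must sum to 1")
    return sum(a * r for a, r in zip(area_weights, primary_refl))
```

In the generalized form used here, the R_i coefficients come from the calibrated secondary-palette hierarchy, so the same primaries printed in a different sequence yield different R_i and hence different predicted colors.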
Low-cost, color input and output are being increasingly adopted for desktop color applications. Many of these input devices (scanners) share common design attributes and objectives. Output devices (printers) are more diverse in their print engine technology and design intents. As would be expected, the image quality produced by the devices varies considerably from manufacturer to manufacturer. Fortunately, principles of image science and color science can be applied to quantify the image quality performance of these devices and summarize these characterizations with several simple figures of merit.
Color image quality prediction models for two typical documents used as input for color copying machines have been developed to relate subjective image quality ratings to physical image quality metrics using stepwise multiple regression analysis. The typical documents consist of a colored map and a portrait image. The models were consistent with technical knowledge and achieved high correlation between predicted ratings and measured subjective image quality ratings. By utilizing the models, subjective color image quality can be estimated from instrumental measurements, and a color imaging system with preferred image quality can be designed from physical image quality metrics, leading to effective image quality design.
Print samples of continuous-tone images produced by halftone and error-diffusion methods are compared, and their image quality is assessed by an opinion test. The error-diffusion method yields better quality than the halftone method when optical density increases linearly with the number density of dots.
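Error diffusion can be illustrated with a simplified one-dimensional version (practical printers use two-dimensional kernels such as Floyd-Steinberg): the quantization error at each pixel is carried forward to the next, so the average dot density tracks the continuous-tone input, which is what distinguishes it from fixed-threshold halftoning.

```python
def error_diffuse_row(row: list[float], threshold: float = 0.5) -> list[int]:
    """Binarize a row of gray values in [0, 1], diffusing each pixel's
    quantization error onto the next pixel (1-D sketch of error diffusion)."""
    out = []
    err = 0.0
    for value in row:
        v = value + err          # add error carried from previous pixel
        bit = 1 if v >= threshold else 0
        out.append(bit)
        err = v - bit            # residual carried forward
    return out
```

For a constant 50% gray input the output alternates dots, preserving the mean density exactly, whereas a fixed threshold would print all-on or all-off.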
The technology and architecture for using device-independent color in documents is presented. The expectations and needs of the desktop users of color are examined and contrasted with the technical realities of digital color systems. Several major areas of color research are identified to address the problems of device independent color.
In order to achieve an ideal colorimetric calibration for CMYK (Cyan, Magenta, Yellow, and Black) printers, we propose a novel CMYK determination technique under the restrictions of a colorimetric match, use of the entire color gamut, and a smooth gradation of each primary color. The algorithm proposed here includes the following steps: (1) obtain the two extreme conditions for CMYK combinations that utilize the entire CMYK gamut, namely the Maximum Black and Minimum Black techniques; (2) determine an arbitrary black amount between the two conditions; (3) apply a smoothing technique to the black amount in a uniform color space; (4) determine the remaining colors, CMY, for a colorimetric match. An iterative smoothing technique in a uniform color space is introduced to obtain visually "smoothed" black gradations. The gradation quality for each primary color is evaluated with a hypothetical unstable printer. The smoothed CMYK technique eliminates sudden changes in each primary color, so that a printer using this technique becomes robust against changes in its characteristic curves, such as dot gain.
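Step (2) above can be sketched as a single blend parameter between the two black extremes. The naive UCR-style limits below (zero black, and black equal to the gray component) are illustrative stand-ins for the printer-model extremes the paper actually derives:

```python
def black_limits(c: float, m: float, y: float) -> tuple[float, float]:
    """Illustrative extremes: Minimum Black = 0, Maximum Black = the
    gray component min(c, m, y) that black could fully replace."""
    return 0.0, min(c, m, y)

def choose_black(c: float, m: float, y: float, blend: float) -> float:
    """Interpolate a black amount between the two extremes; blend in [0, 1],
    where 0 selects Minimum Black and 1 selects Maximum Black."""
    k_min, k_max = black_limits(c, m, y)
    return k_min + blend * (k_max - k_min)
```

The paper's smoothing step would then adjust these per-color black amounts in a uniform color space before solving for the matching CMY.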
A new approach to gray component replacement (GCR) has been developed. It employs color mixing theory to model the spectral fit between 3-color and 4-color prints. To achieve this goal, we first examine the accuracy of the models with respect to experimental results by applying them to prints made by a Canon Color Laser Copier-500 (CLC-500). An empirical halftone correction factor is used to improve the data fitting. Among the models tested, the halftone-corrected Kubelka-Munk theory gives the closest fit, followed by the halftone-corrected Beer-Bouguer law and the Yule-Nielsen approach. We then apply the halftone-corrected Beer-Bouguer law to GCR. The main feature of this GCR approach is that it is based on spectral measurements of the primary color step wedges together with a software package implementing the color mixing model. The software determines the amount of the gray component to be removed, then adjusts each primary color until a good match of the peak wavelengths between the 3-color and 4-color spectra is obtained. Results indicate that the average ΔEab between cmy and cmyk renditions of 64 color patches is 3.11, and eighty-seven percent of the patches have ΔEab less than 5 units. The advantage of this approach is its simplicity; there is no need for the black printer and under color addition. Because this approach is based on spectral reproduction, it minimizes metamerism.
This paper describes GamOpt, a tool for visualization and optimization of color gamuts. In GamOpt, a gamut may be viewed on a computer display, manipulated interactively, or optimized based on constraints. A gamut may be visualized for geometric and color intuition. Gamut points specified in L*a*b* color space may be plotted in projected 3-space. The display may be interactively manipulated to obtain insight about distribution patterns. The gamut may also be color coded in a variety of ways. Multiple gamuts may be visualized at the same time using color or geometrical cues to differentiate them. Two optimization schemes are provided, interactive and analytical. In interactive optimization, a gamut may be modified either interactively or by arbitrary user-defined functions, changing the shape and orientation of the gamut. The analytic optimization approach is based on defining numerical metrics for the goodness of gamuts. The gamut is transformed analytically to optimize these metrics. Using algebraic and neural techniques, we have implemented algorithms to estimate the transfer function between input parameters such as pigment concentrations and the L*a*b* coordinates they generate in a palette. We can then generate the values of the input parameters required to produce an optimized gamut.
When a full-color document is printed on a black-and-white printer (such as when producing black-and-white handouts for a talk that uses colored slides) there can be a substantial loss of information. Usually the colors are mapped to shades of gray according to their luminance. This is fairly effective in the case of pictorial images where most of the useful information is in the luminance channel, but can be less successful for graphical images where hue and saturation may play a more important role. One way to preserve more of the color information is to map colors into visible textures. This paper describes an algorithmic method of performing such a mapping. The technique is an extension of halftoning and produces a binary bitmap for the image. The method accepts and processes any color value, mapping similar colors to similar textures. The texture patterns can be designed so as to preserve the luminance of their corresponding color. Decisions are made locally on a pixel basis, allowing the textured binary image to be incrementally constructed or modified.
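The default luminance mapping the paper improves upon can be sketched as follows, assuming the common Rec. 601 luma weights for the RGB-to-gray collapse; this makes plain what is lost, since any two colors with equal luma map to the same gray:

```python
def luminance(r: float, g: float, b: float) -> float:
    """Collapse an RGB color in [0, 1] to gray via Rec. 601 luma weights,
    discarding hue and saturation entirely."""
    return 0.299 * r + 0.587 * g + 0.114 * b
```

A saturated red and a mid-gray of matching luma become indistinguishable under this mapping, which is exactly the information the color-to-texture technique is designed to preserve.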
To enable optimal reproduction of a compound document containing both text and halftones, it is necessary to identify the halftone regions of the document and process them separately from the text and line-art portions. A technique for halftone detection using the autocorrelation of the video stream has been improved to reduce the false classification of small kanji and to enable detection of non-45-degree screens. Typical examples of the detection map and the processed images are presented.
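The core signal behind such a detector is that a halftone screen produces strong periodic peaks in the autocorrelation of a scanline, while text does not. The sketch below shows that core idea only, with a hypothetical peak-threshold rule; the paper's refined method, which also handles small kanji and non-45-degree screens (and suppresses false peaks from flat regions), is not reproduced.

```python
def autocorrelation(signal, max_lag):
    """Normalized autocorrelation of a 1-D scanline for lags 1..max_lag."""
    n = len(signal)
    mean = sum(signal) / n
    var = sum((s - mean) ** 2 for s in signal) or 1.0  # guard flat input
    return [sum((signal[i] - mean) * (signal[i + lag] - mean)
                for i in range(n - lag)) / var
            for lag in range(1, max_lag + 1)]

def looks_halftoned(scanline, max_lag=16, peak_thresh=0.4):
    """Flag a scanline as halftone-like when its autocorrelation shows
    a strong peak at some lag (a hypothetical detection rule for
    illustration, not the paper's classifier)."""
    return max(autocorrelation(scanline, max_lag)) > peak_thresh
```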
DCT-based coding of full-color images is standardized by JPEG, and the JPEG method is widely applied, for example in color facsimile. The quantization table in JPEG coding strongly influences image quality; however, detailed research on quantization tables is not readily available, so we study the relationship between the quantization table and image quality. We first examine the influence of the quantization table on image quality: the table is grouped into four bands by frequency, and as the value of each band is changed, the merits and demerits for the color image are examined. We then analyze the deterioration components of the color image and study the relationship between the quantization table and the restored image. Since a color image is composed of continuous-tone levels, we evaluate the deterioration components both visually and numerically. An analysis method using the 2-D FFT can capture the change in color image data caused by a change in the quantization table. On the basis of these results, we suggest an approximately optimal distribution of quantization table coefficients.
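The banded-table experiment can be made concrete: group the 64 coefficients of an 8x8 quantization table into four bands by frequency and vary one band's value at a time, observing how much detail in that band survives quantization. The band boundaries below (by the sum of row and column indices) are an assumption for illustration; the paper does not specify its exact grouping.

```python
def banded_quant_table(band_values):
    """Build an 8x8 quantization table from four per-band values.

    Coefficients are grouped into four frequency bands by i + j
    (an assumed grouping, not necessarily the paper's):
      band 0: i + j <= 3, band 1: 4..7, band 2: 8..10, band 3: 11..14.
    """
    def band(i, j):
        s = i + j
        if s <= 3:
            return 0
        if s <= 7:
            return 1
        if s <= 10:
            return 2
        return 3
    return [[band_values[band(i, j)] for j in range(8)] for i in range(8)]

def quantize_roundtrip(dct_block, table):
    """Quantize then dequantize an 8x8 DCT block, returning the
    reconstructed coefficients; larger table entries in a band
    discard more detail in that band."""
    return [[round(dct_block[i][j] / table[i][j]) * table[i][j]
             for j in range(8)] for i in range(8)]
```

Comparing reconstructed blocks (or whole images, via the 2-D FFT as in the paper) across band settings reveals which bands most affect perceived quality.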
A digital sharpening algorithm which operates in the luminance, hue and saturation (LHS) color space was applied to simulated images and digital images scanned on a CMYK scanner. The spatial algorithm was the same as the one used in normal operations. Edge detection was performed in the HS space on the luminance (L) image and the sharp signal derived from it was added to the original unsharpened L image. The effect of sharpening the saturation (S) image was also tested. The resulting images were transformed back to the CMYK space, printed and compared to images sharpened in the RGB space with various conventional algorithms. Results show that sharpening in the LHS space avoids contours with color different from that of the object and maintains the original color of very fine details, in contrast to the conventional RGB sharpening algorithm. Some aspects of this method, however, should still be improved before its advantages can justify the changes necessary for implementation in the color separation scanner.
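The key property of LHS-space sharpening is that the edge signal is added to the luminance channel only, leaving hue and saturation untouched, which avoids the color fringes of RGB sharpening. A minimal 1-D unsharp-mask sketch of that idea follows; the paper's actual spatial filter and the CMYK-to-LHS transforms are not reproduced.

```python
def sharpen_luminance(L, amount=1.0):
    """Unsharp-mask a 1-D luminance signal, values in [0, 1].

    Hue and saturation channels would be passed through unchanged,
    so only lightness contrast increases at edges. The 3-tap box
    blur and 'amount' gain are illustrative assumptions.
    """
    out = []
    for i in range(len(L)):
        left = L[max(i - 1, 0)]
        right = L[min(i + 1, len(L) - 1)]
        blurred = (left + L[i] + right) / 3.0
        sharp = L[i] + amount * (L[i] - blurred)  # add back the edge signal
        out.append(min(1.0, max(0.0, sharp)))     # clamp to valid range
    return out
```

On a step edge the output overshoots on both sides (darker just before, lighter just after), which reads as increased sharpness without any hue shift.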
Today, the desktop digital scanner has become a common office peripheral, with applications as diverse as clip-art acquisition, character recognition, and document management. With this increase in acceptance, the understanding of computer imaging has left the realm of black magic, known only to a select group of scientists and engineers, and entered the mainstream of computer literacy. Competitive benchmarking articles in popular computer magazines no longer look at just the price of the scanner, how clearly the user manual describes the bundled software, and whether the manufacturer's technical support department picks up the phone. Today, simple tests based on sophisticated imaging concepts are employed to compare scanners whose published specifications would suggest that identical results should be achieved when the same target is scanned. These tests are all designed to provide a measure of the relative image quality between scanners. Manufacturers of desktop scanners need to be concerned not only with the image quality of their scanners compared to their competitors' but also with unit-to-unit consistency from their own production lines. With the increase in end-user understanding of imaging comes an increase in expectations regarding scanner imaging performance.
Photoelectrographic printing is a technology which utilizes photoelectrographic masters and conventional electrophotographic toners for short-run printing applications. Masters based on onium salt acid photogeneration have many desirable attributes. One shortcoming, however, is their sensitivity to changes in relative humidity. We have previously reported on a class of polymeric binders which largely overcome this problem. We have now found a class of non-ionic acid photogenerators which further enhance the performance of such masters with respect to changes in relative humidity and which enable the use of a broader spectrum of polymeric binders. We examined representative compounds from several classes of non-ionic acid photogenerators. The best results were obtained with sulfonate esters of N-hydroxyimides. Standard polycarbonate- or polyester-based formulations containing these compounds along with a near-UV sensitizer exhibit contrast potentials near 90% of the initial surface potential upon exposure with a 500-W mercury arc lamp. This contrast potential remains nearly constant over the range of 30-70% relative humidity.