Humans and other living organisms have evolved various known and unknown photoreceptors and responses under the full-spectrum optical radiation of sun and sky at the earth's surface. The trend toward indoor living over the past 100 years has produced a self-exile from this radiation environment, resulting in disease conditions such as rickets and erythema. Concern for these diseases has drawn attention to the delicate balance between the beneficial and detrimental effects of full-spectrum optical radiation and has confounded the concepts of illumination and radiation safety criteria. Further knowledge must be obtained to better evaluate the known and unknown effects of full-spectrum optical radiation in a modern society.
We are increasingly being warned of the possible effects of so-called "polluted" light, that is, light that differs in spectral content from sunlight. We should be concerned, we are told, because all animals and plants have evolved under natural daylight, and therefore any difference between that illuminant and the artificial illuminants on the market today is suspect. The usual presentation of the differences between sunlight and artificial illuminants is as shown in Figure 1. Here we are shown the spectral power distributions of sunlight and Cool White fluorescent light. The spectral power distribution of each has been normalized to some convenient wavelength so that the two can be seen and easily compared on the same figure. But this presentation is misleading, for one does not experience artificial illuminants at the same intensity as one experiences sunlight. Sunlight intensities are ordinarily in the 8000 to 10,000 footcandle range, whereas artificial illuminants are rarely experienced at levels greater than 100 footcandles. A representative difference between the two types of illumination is therefore more accurately shown in Figure 2. Thus, if evolutionary adaptations require that humans and other animals be exposed to sunlight to ensure wellbeing, it is clear that one must be exposed at sunlight intensities. It is not feasible to expect that artificially illuminated environments will be lit to the same intensity as sunlight.
Ocular effects of ultraviolet radiation, 200-400 nm, are reviewed. Depending upon the exposure parameters involved, UV radiation may be harmful to the cornea, lens, and/or retina. The ranges of exposure parameters (wavelength, exposure duration, etc.) for which each of these tissues is susceptible are specified, and the nature of the tissue damage is described. Present understanding of the thermal and photochemical damage mechanisms operative under various conditions of exposure is discussed. Ocular damage thresholds for wide ranges of exposure parameters are summarized and compared to existing safety standards.
The various types of UV effects on the skin are classified according to the part of the spectrum involved and their beneficial or deleterious nature. Some hazardous ultraviolet sources used in industrial processes are described, and examples of photoallergy, phototoxicity, and photosensitization resulting from UV exposures are given. The incidence of skin cancer as a function of geographical location and exposure to sunlight is discussed in relation to natural and artificial exposures to long- and short-wavelength UV, especially in connection with tanning booths. The conclusion is reached that there is enough ultraviolet in the normal environment to pose a hazard; additional ultraviolet exposure from industrial or consumer sources is unnecessary and should be eliminated wherever possible.
The potential value of medical surveillance examinations for laser workers and other workers potentially exposed to high intensity optical radiation is evaluated. A review of the known adverse biological effects leads to the conclusion that most effects are related to acute and subacute exposures which do not lend themselves to effective medical surveillance. In addition, surveys of thousands of laser workers conducted since 1965 demonstrated that routine periodic medical surveillance has been unnecessary and/or impractical using currently available ophthalmic screening methods. Examination techniques to detect early changes in cataract formation or retinal degeneration (two potential chronic effects of optical radiation) are not sufficiently reliable or specific to be of value as routine screening tests. Epidemiologic studies of large worker groups are suggested to evaluate the potential for these chronic conditions to develop as the result of job exposure; however, specific recommendations for routine medical surveillance are limited to preplacement and termination examinations with appropriate evaluation of individuals following acute injury.
The present concern for the so-called nonionizing radiations arises partly from the proliferation of man-made devices that emit electromagnetic energy in the ultraviolet, visible, infrared, radiofrequency, and microwave portions of the electromagnetic spectrum. A detailed enumeration of these sources is unnecessary other than to say that they range from UV-emitting lamps to high-intensity light sources, including lasers, to a wide variety of heating and communications devices.
Recent research has shown that blue light exposure is an important factor in certain types of retinal injury. The mammalian ocular media transmit the spectral band 400-1400 nm to the retina. The short wavelengths (400-550 nm) produce a photochemical, or actinic, type of damage, while the longer wavelengths (550-1400 nm) produce thermal damage. The distinction between the two types of retinal damage is discussed briefly, and the importance of the blue-light effect for solar retinitis and eclipse blindness is emphasized. The significance of blue-light retinal injury is summarized for various environmental and occupational exposures.
During the last 6 months of 1976, the foveas of two trained rhesus monkeys were exposed to a 10 nm bandwidth source, centered at 441 nm, at energy levels representing up to three times that required in the paramacula to cause minimal, ophthalmoscopically visible lesions 48 hours postexposure (30 J/cm2 to 90 J/cm2, 1000 sec). After the acute period, a visual decrement lasting beyond 30 days postexposure occurred only for the 90 J/cm2, 1000 second foveal exposure; the statistically significant criterion for recognition of the 20/20 (1.0 min of arc) Landolt ring target was not met, although the criterion for the 20/30 (1.5 min of arc) target was continuously met after 33 days of recovery. The animals are still regularly tested under the original protocol for visual acuity, and have additionally been examined for spectral sensitivity and for changes in reaction time to the visual stimulus. During the last 6 months of 1979, only the eye originally exposed at the 90 J/cm2 level failed at any test session to meet the recognition criterion for 1.0 min targets, and all eyes met the criterion for 1.5 min targets at every session. This indicates no long-term changes in visual acuity, ruling out physiologic repair mechanisms operating beyond the first 30 to 60 days postexposure, or long-term degenerative changes accumulating after the initial recovery period. Additionally, the spectral sensitivity of the subject exposed at the higher levels shows a mild red, but no blue, deficit for 20/50 (2.5 min) targets compared to the CIE photopic function. At 1.5 min, however, the spectral sensitivity would seem to be better fit by a single photopigment curve centered at 535 nm.
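The radiant exposures quoted above relate to average irradiance through the elementary dosimetric identity H = E · t. A minimal illustrative sketch (the arithmetic is standard; the function name is ours, not from the study):

```python
def average_irradiance(radiant_exposure_j_per_cm2, duration_s):
    """Average irradiance (W/cm2) that delivers a given radiant exposure (J/cm2)
    over the stated duration, via E = H / t."""
    return radiant_exposure_j_per_cm2 / duration_s

# The exposures above, 30 to 90 J/cm2 delivered over 1000 s, correspond to
# average irradiances of 0.03 to 0.09 W/cm2.
```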
Although present laser safety standards are based on an adequate data base for acute viewing situations, they are limited in predicting the type of change in visual function that might be induced from prolonged or repetitive viewing of laser sources. Viewing requirements in holography, laser display systems, and, in general, repeated exposure to low levels of laser radiation require a more complete data base for optimizing the environmental protection of individuals who will be required to work in such environments. In these studies, we have simulated very low-level radiation environments and determined the effects of repetitive prolonged exposure on the visual function of the Rhesus. Our data suggest that prolonged viewing of such sources, even though they are well below present laser safety standards, can produce permanent changes in visual processes that underlie normal human day (photopic) and night (scotopic) vision, although preliminary studies of morphology have shown no morphological correlate. The coherency of laser light is implicated as a significant factor in inducing these effects. It is recommended that individuals required to work in these situations be frequently evaluated for changes in visual function by presently available clinical instruments for assessment of visual function. Further confirmation of these studies will determine the impact of these research findings on present laser safety standards.
Selective loss of sensitivity to the blue and green parts of the spectrum following intermittent, repeated exposures to intense spectral lights persists longer than three years following blue lights and between 18 and 40 days following green lights. The "blue-blindness" involves complete and sole loss of the response of the short-wavelength-sensitive cones; the "green-blindness," complete and sole loss of the response of the middle-wavelength-sensitive cones. Histopathology of cones in a "blue-blinded" retina, in comparison with cytochemical labeling of short-wavelength cones, reveals that they follow a similar distribution: they are sparse in the foveola, reach a peak of about 16 percent of the cones near 1°, and fall to 8-12 percent of the cones at 7°. Continuous, as distinct from intermittent, exposure to similar blue lights produces a wholly different picture of gross pigment-epithelial damage with little photoreceptor degeneration.
This paper explains the techniques used by the Laser Branch, Laser Microwave Division, US Army Environmental Hygiene Agency, to evaluate non-laser optical sources. The present hazard criteria and spectroradiometric data reduction techniques are presented. The methods of radiometric measurement are not included.
Ocular damage resulting from exposure to intense light has been a long-standing concern, with solar eclipse burns, snow blindness, and glassblowers' cataracts being examples. The development of intense light sources by man, culminating to date with lasers, has increased the possibility of accidental exposures. Systematic laboratory study of ocular damage was initiated in the early 1950's and has continued more or less without interruption ever since. Probably the most thoroughly understood mechanism of injury is that described as thermal. This mechanism has been thoroughly modeled, and the model has been validated reasonably well within the limits of its applicability. However, other mechanisms of injury, such as acoustical shock waves and photochemical interactions, have been identified and have received considerable attention in the past decade. The results of the research efforts of many investigators over a considerable span of time have been incorporated into numerous laser safety standards, typified by the American National Standards Institute Z136.1 Standard for the Safe Use of Lasers. These standards, although carefully conceived and based upon a large body of empirical information, are neither complete nor final and should be updated as additional information is uncovered.
The National Institute for Occupational Safety and Health (NIOSH) is a major component of the Center for Disease Control in the Department of Health, Education, and Welfare (DHEW). Headquarters for NIOSH are located in Rockville, Maryland, with laboratory facilities in Morgantown, West Virginia and Cincinnati, Ohio. NIOSH has approximately 900 employees with about 60% located in Cincinnati.
Knowledge of both acute and chronic biological effects is currently used to evaluate lamp safety. In some cases, a quantitative basis for avoiding exposures greater than a certain value can be stated. In other cases, however, only a qualitative estimate of the hazard is available. In a discussion that uses mercury vapor lamps, tanning booths, and sodium vapor lamps as examples, the interplay between the two types of data leading to an evaluation of lamp safety is described.
Optical radiation hazard evaluation for broad band sources must usually be performed by a highly trained specialist with the use of sophisticated instrumentation. While some direct-reading ultraviolet radiation hazard evaluation instruments have been developed, no similar instrumentation for retinal hazard evaluation is available. The general principles of optical radiation hazard evaluation are reviewed along with the quantities, units, and measurement systems used. A new prototype ultraviolet radiation hazard monitor utilizing a spectrograph and a spectral-weighting mechanical mask, which provides a direct reading of the effective irradiance according to the ACGIH Threshold Limit Values, has recently been developed. The sensitivity of this prototype instrument is 10^-7 W/cm2 (effective irradiance), with an uncertainty of less than 30%.
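The spectral weighting that such an instrument performs mechanically can be sketched in software. The formula, E_eff = Σ E(λ) · S(λ) · Δλ, is the standard effective-irradiance sum; the wavelength samples and weighting values below are illustrative placeholders, not the published ACGIH S(λ) table:

```python
def effective_irradiance(spectral_irradiance, relative_effectiveness, step_nm):
    """Spectrally weighted (effective) irradiance, W/cm2.

    spectral_irradiance:     E(lambda) samples in W/(cm2*nm) on a uniform grid
    relative_effectiveness:  dimensionless S(lambda) samples on the same grid
    step_nm:                 grid spacing, nm
    """
    return sum(e * s for e, s in
               zip(spectral_irradiance, relative_effectiveness)) * step_nm

# Illustrative numbers only (not measured data, not the real S(lambda) table):
E = [2e-7, 5e-7, 3e-7]   # W/(cm2*nm) at three UV wavelengths
S = [0.5, 1.0, 0.3]      # relative spectral effectiveness
E_eff = effective_irradiance(E, S, 5)   # about 3.45e-06 W/cm2
```

The weighted result, rather than the raw irradiance, is what gets compared against the TLV, which is why a direct-reading instrument must build S(λ) into its optics.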
The exposure limits for lasers and for other high-radiance optical sources have a common biological basis. Hazard evaluation for broad band 400-1400 nm radiation may be based on the laser safety standard ANSI Z136.1-1976 or on the ACGIH occupational exposure standard for light and near infrared radiation (TLV 1979), and the results should be similar or closely related. The compatibility of the two standards has been examined for black body radiation and other continuous spectral radiation distributions. The influence of the source parameters on the exposure limits has been investigated. Results are presented for two cases: short-time exposure to intense light sources, and long-time exposure to near infrared sources. Retinal burn hazard exposure limits obtained from the two standards show reasonable general agreement for both cases. Discrepancies between them can be attributed to the different models of the dependence of the exposure limits on the angular subtense of the source.
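For the black body comparison mentioned above, the continuous spectral distributions come from Planck's law. A minimal sketch (SI units, rounded physical constants; this is the textbook relation, not code from the paper):

```python
import math

def planck_spectral_radiance(wavelength_m, temp_k):
    """Blackbody spectral radiance, W / (m2 * sr * m), from Planck's law."""
    h = 6.626e-34   # Planck constant, J*s
    c = 2.998e8     # speed of light, m/s
    k = 1.381e-23   # Boltzmann constant, J/K
    return (2.0 * h * c**2 / wavelength_m**5) / \
        math.expm1(h * c / (wavelength_m * k * temp_k))

# Weighting such a distribution by a standard's spectral hazard function and
# integrating over 400-1400 nm gives the effective radiance to compare
# against the ANSI or ACGIH limit.
```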
Ocular dose-response relationships were experimentally determined by ophthalmoscopy and biomicroscopy for selected exposure conditions at the following laser wavelengths: 1.064 μ (neodymium), 1.318 μ and 1.338 μ (neodymium), 1.54 μ (erbium), and 2.06 μ (holmium). The ocular responses were observed in Rhesus monkey eyes. Corneal effects were produced at 1.3 μ, 1.54 μ, and 2.06 μ, and no retinal or lenticular effects were observed for the conditions tested. Both the dose required to produce a minimal corneal lesion and the depth of the response exhibit a wavelength dependence. The corneal damage thresholds were indicative of the relative absorption properties of the cornea. These results suggest that current permissible exposure limits for wavelengths in this region should be elevated to reflect the relative absorption properties of the ocular media. The 1.3 μ neodymium laser appears to offer an advantage in ocular safety, an important consideration in system applications.
The cataracts resulting from IR exposure are compared with those linked to UV exposure. IR exposure produces changes in the lens proteins, the crystallins, while UV exposure seems to attack specific amino acids. Gel electrophoresis of lens proteins has been used to detect the earliest changes possible in cataract formation following exposure to IR from broad band and laser source irradiation. Cataracts can easily be formed in rabbit lenses in vivo when the laser radiation is restricted to the lens alone at power levels above 1 W for 1 minute. Lower power levels do not produce immediate cataracts, but changes in lens proteins can be detected by thin-layer isoelectric electrophoresis on plain polyacrylamide gels and with sodium dodecyl sulfate (SDS) or 6 M urea. The plain gels (pH 3.5 to 10) showed a decrease in the α crystallins, indicating a possible change of soluble α crystallin to an insoluble high-molecular-weight (HM) form. However, small amounts of β and γ crystallins may also be involved in the formation of an HM insoluble aggregate. Soluble HM crystallins were often detected as the α crystallin disappeared. This HM soluble fraction may be an intermediate step in the process of forming insoluble α crystallin. Following higher laser power levels, the crystallin has a markedly decreased mobility, which might also be a precursor to the insolubilization of all crystallins. Similar changes in the lens proteins are seen following broadband IR exposure in vivo or in vitro. Lenses incubated in vitro at various temperatures showed some, but not all, of the same changes. IR exposure can be considered an acceleration of the aging process.
It is well known that for about ten years after it was invented the laser was a solution looking for an application. The laser was used only when the processing operation could not reasonably be done in any other manner. Consequently, controlled bursts of high-energy-density laser beams were first used to machine hard abrasive materials or to weld in heat-sensitive areas. Current implementation has broadened so that the laser provides economically competitive technology. In addition, components are now being designed so that they can only be processed by a laser. This evolution will be illustrated, with emphasis on safety features.
Methods are presented for determining the retinal exposures produced by broad-band ophthalmic light sources and for comparing these results with safety standards developed for the safe use of lasers (ANSI Z136.1). Measurements of exposures produced by fundus cameras and indirect ophthalmoscopes are compared with both Z136.1 maximum permissible exposure levels (MPE) and experimental retinal damage thresholds. It is found that while the retinal exposures used in fundus photography are below MPEs, the irradiances used in indirect ophthalmoscopy may actually equal or exceed current MPEs. Methods are suggested for reducing retinal irradiance while maintaining retinal image luminance, and emphasis is placed on the importance of both acquainting the users of ophthalmic devices with any potential retinal hazard and providing the users and designers of ophthalmic devices with realistic protection standards.
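Retinal-exposure determinations of this kind rest on the standard paraxial relation between source radiance and retinal irradiance, E_r ≈ (π/4) · L · τ · (d/f)², where d is the pupil diameter, f the effective focal length of the eye, and τ the ocular transmittance. A sketch under those textbook assumptions (the numeric defaults are typical adult-eye values, not figures from this paper):

```python
import math

def retinal_irradiance(radiance_w_cm2_sr, pupil_diameter_cm,
                       transmittance=0.9, focal_length_cm=1.7):
    """Approximate retinal irradiance (W/cm2) for an extended source.

    Uses the paraxial relation E_r = (pi/4) * L * tau * (d/f)^2.
    Default transmittance and focal length are typical values, assumed here.
    """
    return (math.pi / 4.0) * radiance_w_cm2_sr * transmittance \
        * (pupil_diameter_cm / focal_length_cm) ** 2
```

Because E_r grows with the square of pupil diameter, the dilated pupil used in indirect ophthalmoscopy is one reason such instruments can approach or exceed MPE levels.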
Recent consumer awareness of previously unquestioned devices, the misuse of products, and developments in optical technology have created a demand for a better understanding of the potentially adverse effects of intense optical sources. The commercial application of lasers and the various standards for laser product performance have also accelerated this demand for an understanding of the effects of broad band optical sources. The potential for adverse eye changes while viewing the sun during a solar eclipse has been recorded since early times, and the reddening of skin and sunburn have been experienced by most people. Biomedical studies to quantify the effects of spectral emission and dose relationships from manmade sources have only recently been undertaken. It is only within the last several decades that the need for understanding the biological effects of optical radiation upon man has become of interest. Much of the early work in understanding these effects resulted from interest in therapeutic applications; it is only within the last decade that research has considered the non-therapeutic effects. To our knowledge, only a few organizations have thus far established exposure criteria for broad band optical sources to prevent the harmful effects of overexposure. This paper details the techniques used by the Xerox Corporation to evaluate broad band optical sources for use in office machines. While the specific examples will be for fluorescent lamp sources, the limits and procedures are equally valid for other broad band optical sources such as the arc, flash, and lamp arrays found in most industrial, commercial, and military applications.
From time to time during the past several years, questions have been raised concerning the potential health implications of nonionizing electromagnetic energy emitted from video display terminals (VDTs). For example, so-called "editor's cataract" has been attributed (by one investigator) to the use of these devices, presumably caused by the emission of microwave energy. Because of these questions and allegations, a study was undertaken to characterize the electromagnetic emissions from a number of VDTs considered representative of those commonly used. For each VDT, a band of frequencies from 10 kHz to 18 GHz and a band of wavelengths from 200 to 800 nm were examined under normal operating conditions. In addition, the various controls associated with each device were adjusted in an attempt to maximize the emissions. In all cases, the sweep frequencies and their first fifty or so harmonics, and the digital clock frequencies and their harmonics, were detected, but in no case did the individual levels or the sum of all levels even remotely approach any exposure standards or guidelines used in the United States or by any other nation. No levels of electromagnetic energy at frequencies normally considered microwave (greater than 1 GHz) were detected that could be directly associated with any terminal. Although complaints of fatigue and eye strain may occur, these problems can generally be traced to local factors such as ambient lighting, glare, poor brightness and contrast, extended viewing time, poor biomechanical posture, and level of job interest and motivation. Based on current medical knowledge, there is no evidence to indicate, nor is it even a subject of speculation, that the emission levels associated with VDTs will have any deleterious effects on the health of those persons using such devices.
An analysis of the electromagnetic emissions of an IBM Model 32772 visual display unit showed no hazardous levels in any portion of the spectrum. The actual level of emission was measured throughout the spectrum from low-frequency radio waves through x-radiation: from 10 kHz through 10 GHz, then 0.2 to 10 μm, and from 5 to over 40 keV. In many parts of the spectrum, the level of emission was below the sensitivity of available instrumentation. In the radio frequency range, including the microwave region, measurements were also made on black-and-white and color television sets for purposes of comparison.