Today an important convergence is taking place between video and computing. Fifteen years ago, computers and television shared little, if any, common technology. Consumer television receivers employed analog signal processing and were the only volume market for video displays. Computers, on the other hand, were the primary market for digital technology and were used with various forms of paper input and output media. In the late 1970s and early 1980s, the computer industry went through a revolution. In addition to gains in memory and processing capability and the widespread adoption of microprocessors, video displays developed for television became output devices for computers for the first time. These developments spawned a plethora of new concepts and products, including personal computers, word processing, and computer graphics.

Computer displays have now become a significant factor in the display industry, once dominated by television. Computer graphics applications, such as computer-aided design and computer-aided manufacturing, are now the technology drivers of display resolution, although consumer television remains the technology driver of display brightness. Furthermore, virtually every television receiver now employs digital circuitry and incorporates at least one microprocessor. In the future, television receivers will be even more heavily dependent upon digital technology.