Monitors

A monitor or display is an electronic visual display for computers. The monitor comprises the display device, circuitry, and an enclosure. The display device in modern monitors is typically a thin-film-transistor liquid-crystal display (TFT-LCD) panel, while older monitors used a cathode ray tube (CRT) about as deep as the screen size.

Originally, computer monitors were used for data processing while television receivers were used for entertainment. From the 1980s onward, computers (and their monitors) have been used for both data processing and entertainment, while televisions have implemented some computer functionality. The common aspect ratio of televisions, and then computer monitors, has also changed from 4:3 to 16:9 (and 16:10).
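To give a rough sense of what the change in aspect ratio means physically, the sketch below (an illustration, not from the text, using a hypothetical 24-inch diagonal) computes a screen's width and height from its diagonal and aspect ratio: at the same diagonal, a 16:9 screen is wider but noticeably shorter than a 4:3 screen.

```python
# Illustrative sketch: derive screen width and height (in the same units as
# the diagonal) from a diagonal size and an aspect ratio, via Pythagoras.
from math import hypot

def screen_dimensions(diagonal, aspect_w, aspect_h):
    """Return (width, height) for a given diagonal and aspect ratio."""
    unit = diagonal / hypot(aspect_w, aspect_h)
    return aspect_w * unit, aspect_h * unit

print(screen_dimensions(24, 4, 3))    # ~ (19.2, 14.4) inches
print(screen_dimensions(24, 16, 9))   # ~ (20.9, 11.8) inches
```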

Early electronic computers were fitted with a panel of light bulbs where the state of each particular bulb indicated the on/off state of a particular register bit inside the computer. This allowed the engineers operating the computer to monitor the internal state of the machine, so this panel of lights came to be known as the 'monitor'. As early monitors were capable of displaying only a very limited amount of information and were very transient, they were rarely considered for program output. Instead, a line printer was the primary output device, while the monitor was limited to keeping track of the program's operation.

In time this array of light bulbs was replaced by a cathode ray tube which could display the equivalent of several dozen light bulbs with greater reliability.

As technology developed, it was realized that the output of a CRT display was more flexible than a panel of light bulbs. Eventually, by giving the program itself control of what was displayed, the monitor became a powerful output device in its own right.

Technologies

Further information: Comparison of CRT, LCD, Plasma, and OLED and History of display technology

Multiple technologies have been used for computer monitors. Until the 21st century most used cathode ray tubes, but these have largely been superseded by LCD monitors.

Cathode ray tube

The first computer monitors used cathode ray tubes (CRTs). Prior to the advent of home computers in the late 1970s, it was common for a video display terminal (VDT) using a CRT to be physically integrated with a keyboard and other components of the system in a single large chassis. The display was monochrome and the image was far less sharp and detailed than on a modern flat-panel monitor, necessitating the use of relatively large text and severely limiting the amount of information that could be displayed simultaneously.

The capabilities of the CRT and the computer systems driving it increased progressively over the years, in step with the general advance of the computer industry's technological capability. Color displays became more common in the late 1970s, with the industry generally being led by Apple and Atari. Lagging several years behind Apple's 1977 introduction of color display capability in the Apple II, IBM introduced the Color Graphics Adapter (CGA) in 1981, which could display four colors at a resolution of 320 by 200 pixels, or two colors at 640 by 200 pixels. In 1984 IBM introduced the Enhanced Graphics Adapter (EGA), which was capable of producing 16 colors at a resolution of 640 by 350.
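To make the trade-off between those modes concrete, the sketch below (an illustration, not part of the original text) estimates the framebuffer size each mode implies, assuming pixels are packed at the minimum whole number of bits needed for the stated color count; both CGA modes come out to the same 16,000 bytes of video memory, trading color depth for horizontal resolution.

```python
# Illustrative only: estimate the framebuffer size implied by the modes
# mentioned above. Resolutions and color counts come from the text; packing
# each pixel into the minimum whole number of bits is an assumption.
from math import ceil, log2

def framebuffer_bytes(width, height, colors):
    """Bytes needed to store one frame at the given resolution and palette size."""
    bits_per_pixel = ceil(log2(colors))
    return width * height * bits_per_pixel // 8

print(framebuffer_bytes(320, 200, 4))    # CGA 4-color mode  -> 16000 bytes
print(framebuffer_bytes(640, 200, 2))    # CGA 2-color mode  -> 16000 bytes
print(framebuffer_bytes(640, 350, 16))   # EGA 16-color mode -> 112000 bytes
```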

CRT technology remained dominant in the PC monitor market into the new millennium partly because it was cheaper to produce and offered viewing angles close to 180 degrees.

Liquid crystal display

Main articles: Liquid crystal display and TFT LCD

There are multiple technologies that have been used to implement liquid crystal displays (LCDs). Throughout the 1990s, the primary use of LCDs as computer displays was in laptops, where their lower power consumption, lighter weight, and smaller physical size justified the higher price versus a CRT. Commonly, the same laptop would be offered with an assortment of display options at increasing price points: (active or passive) monochrome, passive color, or active-matrix color (TFT). As volume and manufacturing capability improved, the monochrome and passive-color technologies were dropped from most product lines.

TFT-LCD is a variant of LCD which is now the dominant technology used for computer monitors.

The first standalone LCD displays appeared in the mid-1990s, selling at high prices. As prices declined over a period of years they became more popular, and by 1997 they were competing with CRT monitors. Among the first desktop LCD computer monitors were the Eizo L66 in the mid-1990s, the Apple Studio Display in 1998, and the Apple Cinema Display in 1999. In 2003, TFT-LCDs outsold CRTs for the first time, becoming the primary technology used for computer monitors. The main advantages of LCDs over CRT displays are that LCDs consume less power, take up much less space, and are considerably lighter. The now common active-matrix TFT-LCD technology also flickers less than CRTs, which reduces eye strain. On the other hand, CRT monitors have superior contrast and response time, can display multiple screen resolutions natively, and show no discernible flicker if the refresh rate is set to a sufficiently high value. LCD monitors now have very high temporal accuracy and can be used for vision research.

Organic light-emitting diode

Organic light-emitting diode (OLED) monitors provide higher contrast and better viewing angles than LCDs, but they require more power when displaying documents with white or bright backgrounds. In 2011, a 25-inch (64 cm) OLED monitor cost $7,500, but prices were expected to drop.