A framebuffer (frame buffer, or sometimes framestore) is a portion of RAM〔"What is frame buffer? A Webopedia Definition", Webopedia.〕 containing a bitmap that drives a video display; the memory buffer contains a complete frame of data. The information in the buffer typically consists of color values for every pixel (point that can be displayed) on the screen. Color values are commonly stored in 1-bit binary (monochrome), 4-bit palettized, 8-bit palettized, 16-bit high color and 24-bit true color formats. An additional alpha channel is sometimes used to retain information about pixel transparency. The total amount of memory required to drive the framebuffer depends on the resolution of the output signal, and on the color depth and palette size.

Framebuffers differ significantly from the vector displays that were common prior to the advent of faster graphics hardware (and, consequently, of the framebuffer concept itself). With a vector display, only the vertices of the graphics primitives are stored; the electron beam of the output display is then commanded to move from vertex to vertex, tracing an analog line across the area between these points. With a framebuffer, the electron beam (if the display technology uses one) is commanded to trace a left-to-right, top-to-bottom path across the entire screen, the way a television renders a broadcast signal. At the same time, the color information for each point on the screen is pulled from the framebuffer, creating a set of discrete picture elements (pixels).
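That memory requirement is easy to make concrete. The following is a minimal sketch, not from the article: the function name, the 640x480 example resolution, and the assumption of 3-byte palette entries are chosen here purely for illustration. It computes the storage each of the formats above needs, counting the lookup table for indexed-color modes:

<syntaxhighlight lang="c">
#include <stdio.h>

/* Bytes of framebuffer memory needed for a given mode.
 * bits_per_pixel: color depth of each pixel (1, 4, 8, 16, 24, ...).
 * palette_entries / palette_entry_bytes: extra storage for an
 * indexed-color lookup table; zero for direct-color modes. */
static unsigned long framebuffer_bytes(unsigned width, unsigned height,
                                       unsigned bits_per_pixel,
                                       unsigned palette_entries,
                                       unsigned palette_entry_bytes)
{
    unsigned long pixel_bits  = (unsigned long)width * height * bits_per_pixel;
    unsigned long pixel_bytes = (pixel_bits + 7) / 8;   /* round up to whole bytes */
    return pixel_bytes + (unsigned long)palette_entries * palette_entry_bytes;
}

int main(void)
{
    /* The formats named above, at an illustrative 640x480 resolution.
     * The 256 x 3-byte palette is an assumption, not a fixed standard. */
    printf("1-bit monochrome: %lu bytes\n", framebuffer_bytes(640, 480,  1,   0, 0));
    printf("8-bit palettized: %lu bytes\n", framebuffer_bytes(640, 480,  8, 256, 3));
    printf("24-bit true color: %lu bytes\n", framebuffer_bytes(640, 480, 24,   0, 0));
    return 0;
}
</syntaxhighlight>

The output (38,400 bytes, 307,968 bytes, and 921,600 bytes respectively) shows why indexed-color formats were attractive when memory was expensive: they trade color range for a much smaller buffer.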
== History ==

Computer researchers had long discussed the theoretical advantages of a framebuffer, but were unable to produce a machine with sufficient memory at an economically practicable cost. In 1969, A. Michael Noll of Bell Labs implemented a scanned display with a frame buffer.〔Noll, A. Michael, "Scanned-Display Computer Graphics," Bell Telephone Laboratories, Technical Memorandum, TM69-1234-8, November 21, 1969.〕〔Noll, A. Michael, "Scanned-Display Computer Graphics," Communications of the ACM, Vol. 14, No. 3 (March 1971), pp. 145-150.〕 The Bell Labs system was later expanded to display an image with a color depth of three bits on a standard color TV monitor. An even earlier scanned display had been implemented at the Brookhaven National Laboratory.〔Ophir, S., S. Rankowitz, B. J. Shepherd, and R. J. Spinrad, "BRAD: The Brookhaven Raster Display," Comm. ACM, Vol. 11, No. 6 (June 1968), pp. 415-416.〕

Advances in integrated-circuit memory in the 1970s made it economically practical to create framebuffers capable of holding a standard video image. In 1972, Richard Shoup developed the SuperPaint system at Xerox PARC. This system had 311,040 bytes of memory and was capable of storing 640 by 480 pixels of data with 8 bits of color depth. The memory was scattered across 16 circuit boards, each loaded with multiple 2-kilobit shift-register chips. While workable, this design required that the total framebuffer be implemented as a 307,200-byte shift register that shifted in synchronization with the television output signal. The primary drawback of this scheme was that the memory was not random access: a given position could be accessed only when the desired scan line and pixel time rolled around, giving the system a maximum latency of 33 ms (the duration of one full video frame) for writing to the framebuffer.

Shoup was also able to use the SuperPaint framebuffer to create an early digital video-capture system. By synchronizing the output signal to the input signal, he could overwrite each pixel of data as it shifted in. Shoup also experimented with modifying the output signal using color tables, which allowed the SuperPaint system to produce a wide variety of colors outside the range of the limited 8-bit data it contained. This scheme would later become commonplace in computer framebuffers.
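The color-table scheme works by treating each stored pixel value as an index into a small table of display colors, consulted as the image is scanned out. A minimal sketch, assuming 8-bit pixels and a 256-entry table; the types and names here are illustrative, not details of SuperPaint's hardware:

<syntaxhighlight lang="c">
#include <stdint.h>

/* An indexed-color (palettized) framebuffer: each stored pixel is a
 * small index, and a color lookup table (CLUT) expands it to a full
 * RGB value as the display is scanned out. */
typedef struct { uint8_t r, g, b; } rgb_t;

/* Produce the on-screen color of pixel (x, y) from an 8-bit
 * framebuffer `fb` of `width` pixels per scan line, through the
 * 256-entry color table `clut`. */
static rgb_t scan_out_pixel(const uint8_t *fb, unsigned width,
                            unsigned x, unsigned y, const rgb_t clut[256])
{
    uint8_t index = fb[(unsigned long)y * width + x]; /* stored 8-bit value */
    return clut[index]; /* the table maps it into a far wider color space */
}
</syntaxhighlight>

Because the table, not the pixel storage, defines the output colors, reloading it changes the entire palette without touching the framebuffer contents.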
In 1974, Evans & Sutherland released the first commercial framebuffer, costing about $15,000. It was capable of producing resolutions of up to 512 by 512 pixels in 8-bit grayscale, and became a boon for graphics researchers who did not have the resources to build their own framebuffer. The New York Institute of Technology would later create the first 24-bit color system using three of the Evans & Sutherland framebuffers.〔"History of the New York Institute of Technology Graphics Lab".〕 Each framebuffer was connected to an RGB color output (one for red, one for green and one for blue), with a Digital Equipment Corporation PDP 11/04 minicomputer controlling the three devices as one.

In 1975, the UK company Quantel produced the first commercial full-color broadcast framebuffer, the Quantel DFS 3000. It was first used in TV coverage of the 1976 Montreal Olympics to generate a picture-in-picture inset of the Olympic flaming torch while the rest of the picture featured the runner entering the stadium.

The rapid improvement of integrated-circuit technology made it possible for many of the home computers of the late 1970s (such as the Apple II) to contain low-color framebuffers. While initially derided for poor performance in comparison with the more sophisticated graphics devices used in computers like the Atari 400, framebuffers eventually became the standard for all personal computers. Today, nearly all computers with graphical capabilities use a framebuffer to generate the video signal.

Framebuffers also became popular in high-end workstations and arcade system boards throughout the 1980s. SGI, Sun Microsystems, HP, DEC and IBM all released framebuffers for their workstation computers; these were usually of much higher quality than those found in most home computers, and were regularly used in television, printing, computer modeling and 3D graphics. Sega likewise used framebuffers in its high-end arcade boards, which also exceeded home-computer quality.

Because of their design emphasis on graphics performance, Amiga computers created a vast market of framebuffer-based graphics cards in the 1980s. Notable among these was the graphics card of the Amiga A2500 Unix, which in 1991 made it the first computer to run an X11 server for hosting graphical environments and the Open Look GUI in high resolution (1024x1024 or 1024x768 at 256 colors). The card, the A2410 (Lowell TIGA Graphics Card), was an 8-bit graphics board based on the Texas Instruments TMS34010 clocked at 50 MHz, a complete intelligent graphics coprocessor, and was co-developed with Lowell University.

Other noteworthy Amiga framebuffer-based cards were the Impact Vision IV24 graphics card from GVP, an integrated video suite capable of mixing a 24-bit framebuffer with genlock, chroma key, TV signal pass-through and TV-in-a-window capabilities; the DCTV, an external graphics adapter and video-capture system; the Firecracker 32-bit graphics card; the Harlequin card; the Colorburst; and the HAM-E external framebuffer. The Graffiti external graphics card is still available on the market.

Most Atari ST (Mega STE model) and Atari TT framebuffers were created for the VME rear connector slot of Atari machines, which was dedicated to video expansion cards: the Leonardo 24-bit VME graphics adapter, the CrazyDots II 24-bit VME graphics card, the Spektrum TC graphics card, and the NOVA ET4000 VME SVGA graphics card (capable of resolutions up to 1024x768 at 256 colors or 800x600 at 32768 colors), whose design came from the ISA/PC world (it was effectively an ATI Mach32 with 1 MB of video RAM).
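Cards like these present the framebuffer to software as ordinary memory-mapped storage, so drawing reduces to computing a byte offset from the pixel coordinates. A minimal sketch, assuming a linear 8-bits-per-pixel mode such as 1024x768 at 256 colors; the base address and pitch are illustrative assumptions, not details of any card above:

<syntaxhighlight lang="c">
#include <stdint.h>

#define FB_BASE  ((volatile uint8_t *)0xA0000UL) /* illustrative mapping, not a real card's address */
#define FB_PITCH 1024u  /* bytes per scan line in a 1024x768, 8 bpp mode */

/* Plot one pixel: a linear framebuffer is a 2-D array of color values
 * laid out row by row, so (x, y) maps to a single byte offset. */
static void put_pixel(unsigned x, unsigned y, uint8_t color_index)
{
    FB_BASE[(unsigned long)y * FB_PITCH + x] = color_index;
}
</syntaxhighlight>

At 8 bits per pixel, 1024x768 requires 786,432 bytes of storage, which is why 1 MB of video RAM suffices for that mode.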