Fill Rate

Fill rate refers to the number of pixels that a video card can render or write to memory every second. It is measured in megapixels or gigapixels per second and is commonly estimated by multiplying the clock frequency of the graphics processing unit (GPU) by the number of render output units (ROPs), also called raster operation pipelines. GPUs with higher fill rates can render video at higher resolutions and frame rates than GPUs with lower fill rates.
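As a rough sketch of that arithmetic, the snippet below multiplies an assumed core clock by an assumed ROP count; the figures are illustrative only and do not describe any particular card.

```python
# Rough fill-rate estimate: GPU core clock (Hz) multiplied by the number of ROPs.
# Both values below are assumed example figures, not real hardware specs.

def pixel_fill_rate(core_clock_hz: float, rop_count: int) -> float:
    """Theoretical pixel fill rate in pixels per second."""
    return core_clock_hz * rop_count

clock_hz = 1.5e9   # assumed 1.5 GHz core clock
rops = 64          # assumed 64 render output units

rate = pixel_fill_rate(clock_hz, rops)
print(f"{rate / 1e9:.1f} gigapixels per second")  # 96.0 gigapixels per second
```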

There is no standard for calculating and reporting the fill rate, so manufacturers have devised their own methods. Some multiply the clock frequency by the number of texture units, while others multiply it by the number of pixel pipelines. Whatever the method, the calculation produces a theoretical value that may not fully represent real-world performance.
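The sketch below shows how those two common methods diverge for the same hypothetical GPU; the unit counts are assumptions chosen for illustration, and the texture-unit figure is usually quoted in texels rather than pixels per second.

```python
# Two common ways a fill rate is quoted, using the same made-up GPU:
#   pixel fill rate   = core clock x pixel pipelines (or ROPs)
#   texture fill rate = core clock x texture units
clock_hz = 1.5e9       # assumed 1.5 GHz core clock
pixel_pipelines = 64   # assumed pixel pipelines / ROPs
texture_units = 160    # assumed texture mapping units

pixel_rate = clock_hz * pixel_pipelines   # 96.0 Gpixels/s
texel_rate = clock_hz * texture_units     # 240.0 Gtexels/s

print(f"Pixel fill rate:   {pixel_rate / 1e9:.1f} Gpixels/s")
print(f"Texture fill rate: {texel_rate / 1e9:.1f} Gtexels/s")
```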

The fill rate is a GPU performance rating that reflects its ability to render pixels and produce high-quality video. The fill rate actually achieved depends on many factors, including the rest of the system's hardware and even the drivers. Fill rate was a prominent performance indicator in the past, but as GPU technology shifts, so do the metrics used to judge performance.

Scene complexity increases with overdraw, which happens when one object is drawn over another, covering it up. That work is wasted because the obscured object's pixels are overwritten and never seen. When a scene demands more pixel writes per second than the fill rate can supply, the frame rate drops and the visuals stutter.
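As a back-of-the-envelope sketch of that trade-off, the example below compares the pixel throughput a scene needs (resolution times frame rate times an average overdraw factor) against an assumed fill rate; every figure is an illustrative assumption, not a measurement.

```python
# Check whether a scene's overdraw exceeds the fill-rate budget.
# All numbers are illustrative assumptions for a modest GPU and display.

width, height = 2560, 1440   # assumed display resolution
target_fps = 144             # assumed target frame rate
overdraw = 4.0               # assumed: each pixel is drawn ~4 times on average
gpu_fill_rate = 1.2e9        # assumed fill rate in pixels per second

pixels_needed = width * height * target_fps * overdraw
if pixels_needed > gpu_fill_rate:
    # The GPU can only sustain a lower frame rate at this level of overdraw.
    achievable_fps = gpu_fill_rate / (width * height * overdraw)
    print(f"Fill-rate bound: ~{achievable_fps:.0f} fps instead of {target_fps}")
else:
    print("Fill rate is not the bottleneck for this scene")
```

With these assumed numbers the scene needs roughly 2.1 gigapixels per second but the GPU can supply only 1.2, so the frame rate falls to around 81 fps; reducing overdraw or resolution brings the demand back under the fill-rate budget.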
