The process of determining how client application performance varies
with low-level resources is often labor-intensive and
error-prone. Except in very special circumstances, programmers are unlikely to know “exactly” how low-level
resources affect their performance goals. For example,
the programmer may want a given frame rate but have
no idea how much memory bandwidth is required to meet
that rate. This conundrum is a form of “impedance mismatch” between the units of hardware resources and the
programmer-specified QoS requirements.
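To make the mismatch concrete, consider the kind of manual sweep a programmer would otherwise perform to discover how much bandwidth a frame-rate goal requires. The sketch below (in Python) is illustrative only: set_bandwidth_limit and measure_fps are hypothetical stand-ins for platform-specific mechanisms, and the response curve inside them is invented solely so the example runs.

    _alloc_gbps = 0.0

    def set_bandwidth_limit(gbps):
        # Placeholder: a real system would program a hardware partitioning knob.
        global _alloc_gbps
        _alloc_gbps = gbps

    def measure_fps():
        # Placeholder: a real system would run the workload and time its frames.
        # This invented curve simply saturates at 60 fps as bandwidth grows.
        return min(60.0, 12.0 * _alloc_gbps)

    def calibrate(target_fps, candidates_gbps):
        # Return the smallest candidate allocation that meets target_fps, else None.
        for gbps in sorted(candidates_gbps):
            set_bandwidth_limit(gbps)
            if measure_fps() >= target_fps:
                return gbps
        return None

    print(calibrate(30.0, [1.0, 2.0, 4.0, 8.0]))  # -> 4.0 under the invented curve

Each trial requires running the real workload under a new allocation, which is precisely why the process is labor-intensive and must be repeated whenever the application, its inputs, or the platform changes.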