Real-time reverse transcription PCR (real-time RT-PCR) is an established technique for quantifying mRNA in biological samples. Benefits of this procedure over conventional methods for measuring RNA include its sensitivity, large dynamic range, and the potential for high throughput as well as accurate quantification. Its enhanced specificity is particularly useful for immunological research, which frequently involves analysis of proteins derived from different splice variants of the original transcript.1 Furthermore, many of the key proteins (eg cytokines and transcription factors) are found in such low abundance that real-time RT-PCR quantification of their mRNAs represents the only technique sensitive enough to reliably measure their expression in vivo.2,3 Although real-time RT-PCR is widely used to quantitate biologically relevant changes in mRNA levels, a number of problems remain associated with its use. These include the inherent variability of RNA, variability of extraction protocols that may copurify inhibitors, and differing reverse transcription and PCR efficiencies.4 Consequently, it is important that an accurate method
of normalisation is chosen to control for this error. Unfortunately, normalisation remains one of real-time RT-PCR's most difficult problems.5 Several strategies have been proposed for normalising real-time RT-PCR data. These range from ensuring that a similar sample size is chosen to using an internal housekeeping or reference gene (Table 1). These approaches are not mutually exclusive and can be incorporated into a protocol at many stages (Figure 1). Here we discuss the respective advantages and disadvantages of each technique.
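To illustrate what reference-gene normalisation looks like in practice, the sketch below implements the widely used comparative Ct (2^-ΔΔCt) calculation. The gene names, Ct values, and function name are illustrative assumptions, not taken from this article, and the calculation assumes approximately 100% amplification efficiency for both assays.

```python
# Hypothetical sketch of the comparative Ct (2^-DDCt) method, one common way
# to normalise real-time RT-PCR data to an internal reference gene.
# All Ct values and gene names below are illustrative, not from the article.

def fold_change(ct_target_sample, ct_ref_sample, ct_target_calib, ct_ref_calib):
    """Fold change of target mRNA in a sample relative to a calibrator,
    normalised to a reference (housekeeping) gene.

    Assumes ~100% amplification efficiency for both target and reference
    assays; efficiency-corrected models exist but are omitted here.
    """
    delta_ct_sample = ct_target_sample - ct_ref_sample   # normalise sample
    delta_ct_calib = ct_target_calib - ct_ref_calib      # normalise calibrator
    delta_delta_ct = delta_ct_sample - delta_ct_calib
    return 2 ** (-delta_delta_ct)

# Illustrative values: a low-abundance cytokine transcript vs. a reference
# gene, in stimulated (sample) vs. unstimulated (calibrator) cells.
print(fold_change(25.0, 20.0, 28.0, 20.0))  # 8.0-fold up-regulation
```

Note that the accuracy of this calculation depends entirely on the reference gene being stably expressed across the conditions compared, which is precisely the difficulty the strategies in Table 1 attempt to address.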