This paper analyses theoretically and experimentally the temperature dependence of
metal–oxide–semiconductor field-effect transistors (MOSFETs) with the aim of using them as
temperature sensors in on-chip thermal testing applications. The proposed analysis provides rules for
the selection of the MOSFET dimensions and bias current in order to achieve a high sensitivity
to on-chip thermal variations generated by the circuit under test (CUT). These theoretical predictions
are then contrasted with simulations and experimental data obtained from MOSFETs fabricated in a
commercial 0.35 µm CMOS technology. Simulations and experimental results with MOSFETs are also
compared with those obtained using a parasitic bipolar junction transistor (BJT). Such a comparison
shows that MOSFET-based temperature sensors offer, in the context of on-chip thermal testing, the
following advantages: full compatibility with the fabrication process, a smaller area required around
the CUT, and a higher sensitivity to the on-chip thermal variations caused by the CUT.
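To make the bias-selection argument concrete, the following minimal sketch (not taken from the paper; all device parameters are hypothetical, illustrative values) estimates the temperature sensitivity dV_GS/dT of a diode-connected MOSFET held at a constant drain current, assuming a simple square-law model with a linear threshold-voltage temperature coefficient and a power-law mobility dependence:

```python
# Illustrative sketch (not from the paper): temperature sensitivity of the
# gate-source voltage of a diode-connected MOSFET biased at constant current.
# Assumes V_TH(T) = V_TH0 - alpha*(T - T0) and mu(T) = mu0*(T/T0)**(-m);
# all parameter values below are hypothetical.

import numpy as np

V_TH0 = 0.5       # threshold voltage at T0 [V] (hypothetical)
alpha = 2e-3      # threshold-voltage temperature coefficient [V/K] (typical order)
mu0_Cox = 170e-6  # mu0 * C_ox [A/V^2] (hypothetical)
m = 1.5           # mobility temperature exponent (typically 1.2-2)
T0 = 300.0        # reference temperature [K]
W_over_L = 10.0   # aspect ratio (hypothetical)

def vgs(i_d, T):
    """Gate-source voltage of a saturated square-law MOSFET at drain current i_d."""
    beta = mu0_Cox * W_over_L * (T / T0) ** (-m)
    return V_TH0 - alpha * (T - T0) + np.sqrt(2.0 * i_d / beta)

# Numerical sensitivity dV_GS/dT around T0 for several bias currents:
for i_d in (1e-6, 10e-6, 100e-6):
    dT = 0.1
    s = (vgs(i_d, T0 + dT) - vgs(i_d, T0 - dT)) / (2 * dT)
    print(f"I_D = {i_d * 1e6:6.1f} uA -> dV_GS/dT = {s * 1e3:+.2f} mV/K")
```

With these illustrative numbers, the sensitivity approaches the threshold-voltage coefficient (about -2 mV/K) at low bias currents, while at high currents the positive mobility contribution cancels part of it; this dependence of sensitivity on bias and geometry is the kind of tradeoff that selection rules of this sort address.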