The indirect assessment model would quickly produce a more comprehensive set of measures than the direct model. Further, as the quote highlights, involving managers in developing performance measures is likely to have pedagogical effects (Oakes et al., 1998). As managers struggled with attempts to measure performance, they would learn how and what to measure, and become committed to the measures. The auditors, meanwhile, could focus on their pre-existing expertise – for example, demanding better evidence about attendance records for Provincial cultural and historical sites, ensuring that departments use consistent definitions of water and air quality, and requiring that departments accurately transpose and calculate measures from one source to their own records. The indirect assessment model enabled auditors to comment on non-financial measures using the same techniques that had been developed in financial audits. At the same time, auditors used inscriptions and guidance notes from other jurisdictions to learn about the assessment of customer surveys, ensuring not simply that surveys were consistent over time, but also that the questions asked and the analysis conducted (e.g., to form scales) seemed consistent with “best practice”. Thus, in 1998 the Office produced its own guidance on conducting client surveys (Office, 1998), and more recently has issued guidance for “Preparing an Integrated Results Analysis”.