Explicitly study change strategies.
There is a lack of research on the study and improvement of change strategies in STEM. We believe
that this dearth of research has allowed the development and dissemination model of reform to remain
as the implicit or explicit model behind STEM change strategies despite its lack of proven success.
Some may argue that a poor record of success is no reason to conclude that the development and
dissemination change model is ineffective; rather, change is simply difficult and slow, and we should
not expect large impacts from our change efforts. The argument that change is just slow
is problematic for several reasons. First, there are examples of sweeping change in education, such as
the move toward high‐stakes testing, which have not been slow. Under the right circumstances, change
can and does take place much faster than the current change in STEM toward less lecture‐based
teaching. Evidence that educational change can be fast indicates that the slow rate of reform in STEM
instructional practices is likely a result, at least in part, of ineffective change strategies and models
rather than the inherent difficulties of change. The "change is slow" argument is also problematic in
that it encourages a passive attitude. Change in STEM instructional practices has been difficult and
much slower than we would like, but we should not use this as an excuse to avoid critically evaluating
our current change strategies and models and seeking to improve them. Insanity has been defined as
“doing the same thing, over and over again, but expecting different results”.1 It is not clear why one
should expect the development and dissemination model to work now when it has so far failed to yield
the desired results.
We find it ironic that STEM reformers who chastise more traditional STEM faculty for not treating
their teaching “scientifically” (see, for example, Ref. 1) will, in turn, not take a scientific approach to
their reform efforts. As part of the interdisciplinary literature review mentioned earlier we examined
claims of success or failure of the change strategy investigated [22]. Most of these articles (67%)
claimed that the strategy studied was successful and few (12%) claimed that the strategy studied was
not successful. We found that only 21% of the articles presented strong evidence to support their
claims of success or failure. An additional 28% presented adequate evidence, 39% presented poor evidence, and 12% presented no evidence. These results suggest that the standards for publishing
research about educational change efforts in peer reviewed journals are not particularly high.
That so few of the articles in the literature review carefully document the impact of the change
efforts described is particularly striking given that most STEM reformers do not write articles about
their change efforts at all. The articles that STEM reformers most often publish are typically designed
to describe and disseminate information about a new instructional strategy. Many articles of this type
did not meet the inclusion criteria of the literature review (i.e., they did not explicitly discuss change
models or strategies). Had they been included, the percentage of articles judged not to present
evidence of success or failure of a change strategy would almost certainly be much higher.