This work-in-progress paper outlines the design, implementation, evolution, and methodological approach of an evaluation study set in train to monitor and report on the National Mental Health in Schools Program. The first section explores the nature of the evaluation brief and the factors that most influenced the design. The bulk of the paper is devoted to the design itself, its strengths and limitations, and how threats to validity are being met, given that the program being evaluated has many complexities and presents constantly emerging challenges. Many of the challenges facing the evaluation team arise from the fact that each of the twenty-four volunteer participant schools is involved in a different way, has different characteristics, and will use the products and advice at the centre of the evaluation (the Mind Matters curriculum materials being piloted in those schools) to suit its own needs. This lack of uniformity of implementation has necessitated a mixed-method and extremely flexible evaluation design. As might be expected, a key design consideration has been how to obtain data that will allow for meaningful comparison of pilot sites and of data collected by different methods.