Using the Atlas to Formulate Questions for Program Review

… one way of Using the Atlas

Illustrative review questions on objectives, curriculum, assessment, and comparators

As described in the Background section below, program reviews typically involve a self study and an external review, and they typically address a number of areas, from program design and student success to faculty quality and research output. The examples from two universities highlighted below illustrate that exactly what is to be reviewed can be described in different ways and organized under different headings. We identify four broad areas of review for which Atlas resources might prove useful: a) articulation of program mission and learning objectives; b) curriculum design; c) assessment of student learning; and d) selection of appropriate program comparators. The following questions under these four areas are illustrative; such questions could be posed during the self study and also by the external reviewers.

A – Questions on learning objectives
  1. Does the program’s statement of mission address the special requirements of a professional Master’s program as outlined in Professional Master’s?
  2. How does the program’s statement of mission compare with the expectations of the relevant accrediting body, such as the expectations listed in the mission-based standards of the Canadian Association of Programs in Public Administration (CAPPA), which can be found at CAPPA Standards for Student Competencies?
  3. How does the program’s description of learning outcomes compare with best practice recommendations such as those in Describing Learning Outcomes, and illustrated in ANU Crawford Learning Outcomes and Melbourne MPPM Learning Outcomes?
B – Questions on curriculum design
  1. How well does the required-course curricular content contribute to the student competencies expected by the relevant accreditation body, such as those specified in the universal standards for student competencies described in CAPPA Standards for Student Competencies?
  2. How well does the required-course curricular content align with the competencies expected of practitioners, such as the competencies set out in the Policy Profession Competencies Table?
  3. How well does the required-course curricular content align with the subjects and topics generally thought to constitute the MPP/MPA core, such as those set out in Table 1 of Atlas Topics?
  4. How does the conceptual rigour of the required courses compare with that in the required courses of comparable programs as illustrated, for example, in Top 50 Concept Readings for Toronto MPP Courses?
  5. To what extent do courses use live cases and real clients, and how does their use compare with that in highly regarded programs, such as those described in Live Cases and Real Clients in MPP/MPA Courses?
  6. How do the hours-of-study demands in each required course compare with those in highly regarded programs, as illustrated, for example, in Comparing Course Workload – Quantitative Methods?
C – Questions on assessment of student learning
  1. Does the program have a mechanism for assessing student comprehension of the key concepts taught in its courses, along the lines of Concept Comprehension Questions?
  2. How well would students score on multiple choice quizzes that assess concept comprehension, for example, Quiz 7 – Fifty Concepts in Governance and Institutions, Quiz 8 – Twenty Concepts in Policy Analysis and Process, or Quiz 9 – Forty Concepts in Implementation and Delivery?
D – Questions on appropriate comparators
  1. Is the program appropriately categorized in MPP/MPA Curricular Types, and which of the other programs in that category would be appropriate comparators?
  2. Which other MPP or MPA programs listed in Programs are appropriate comparators?
  3. Are there design elements in any of the four Atlas Reference Programs that might provide useful models?

Background on program review, using two examples

In many universities, all graduate programs are required to undertake a program review on a periodic basis, typically every seven years. Program reviews involve a self study followed by an external review by persons from outside the program. The program review process is described below for two Canadian universities.

Examples – University of Toronto and University of Manitoba

The University of Toronto Quality Assurance Process (UTQAP) is described on the university’s website. For ease of reference, three key documents have been converted to pdf and uploaded to the Atlas.

The University of Toronto’s MPP program can be found at Toronto Munk.

The University of Manitoba Graduate Program Review process is described on the university’s website, and the key document has been converted to pdf and uploaded to the Atlas.

The University of Manitoba’s MPA program can be found at Manitoba-Winnipeg.

Professional programs, program objectives, and learning outcomes

Programs that deliver the MPP, MPA, and similar degrees are often characterized as professional Master’s programs. The review processes described in most universities do not distinguish between professional and non-professional programs. Nevertheless, the characteristics of Professional Master’s reproduced in Exhibit 1 below could be expected to be incorporated into the program objectives and learning outcomes.

Exhibit 1: Characteristics of Professional Master’s Programs
  1. Students acquire knowledge and skills relevant to defined career areas.
  2. The program provides exposure to a profession through one or more of the following: co-op/intern opportunities; case studies; interaction with members of professions; applied research opportunities.
  3. A requirement of the program is that students apply what they have learned to “real life” situations or problems, through theses, projects or other examinable media; the intent of such a requirement is to develop critical, inquiring attitudes in those pursuing a profession.
  4. Acquisition of a professional graduate degree is likely to enhance employment opportunities and/or salary levels.

For many professional programs, there exist professional associations that provide guidance on quality assurance, including accreditation. For Canadian MPP and MPA programs that body is CAPPA, the Canadian Association of Programs in Public Administration, which operates a voluntary accreditation process. For American MPP and MPA programs the professional association is NASPAA, which likewise operates a voluntary accreditation process.

Program design, curriculum, assessment, and comparators

Program reviews typically include a review of the program design and the curriculum, including student assessment. This is illustrated by the bullets in Exhibit 2 below, reproduced from the program evaluation criteria in University of Toronto, UTQAP Cyclical Review Report Template. Program reviews also usually call for a comparative assessment. For example, the Template calls for an “Assessment of the division/unit and the program(s) under review relative to the best in Canada/North America and internationally, including areas of strength and opportunities.”

Exhibit 2: University of Toronto Program Evaluation Criteria
  • Consistency of the program with the University’s mission and Faculty/unit’s academic plans
  • Program requirements and learning outcomes are clear, appropriate and align with the relevant undergraduate and/or graduate Degree Level Expectations
Admission requirements
  • Appropriateness of admission requirements for the learning outcomes established for completion of the program
Curriculum and program delivery
  • Curriculum reflects the current state of the discipline or area of study and is appropriate for the level of the program
  • Appropriateness and effectiveness of the program’s structure, curriculum, length and mode(s) of delivery to its learning outcomes and degree level expectations; clarity with which these have been communicated
  • Evidence of innovation or creativity in the content and/or delivery of the program relative to other such programs
  • Opportunities for student learning beyond the classroom
  • Opportunities for student research experience
Assessment of learning
  • Appropriateness and effectiveness of the methods used for assessing student achievement of the defined learning outcomes and degree-level expectations, especially in the students’ final year of the program
Quality indicators
  • Assessment of program against international comparators
  • Quality of applicants and admitted students
  • Student completion rates and time to completion
  • Quality of the educational experience, teaching and graduate supervision
  • Implications of any data (where available) concerning post-graduation employability
  • Availability of student funding
  • Provision of student support through orientation, advising/mentoring, student services
  • Program outreach and promotion
Additional graduate program criteria
  • Monitoring and management of students’ time to completion in relation to the program’s defined length and program requirements
  • Quality and availability of graduate supervision
  • Faculty commitment to student mentoring
  • Student quality, including for example grade level for admission, scholarly output, success rates in provincial and national scholarships, competitions, awards and commitment to professional and transferable skills
  • Evidence of a program structure and faculty research that will ensure the intellectual quality of the student experience
  • Sufficient graduate-level courses that students will be able to meet the requirement that all course requirements be met through courses at the graduate level
Quality enhancement
  • Initiatives taken to enhance the quality of the program and the associated learning and teaching environment
  • Extent to which initiatives have been undertaken to enhance the program’s accessibility (i.e., for students requiring physical or mental health accommodations) and diversity

Self study and external review

At the University of Toronto, the self study process is set out in University of Toronto, UTQAP Cyclical Review Self-Study Template and the external review process is set out in University of Toronto, UTQAP Cyclical Review Report Template.

At the University of Manitoba, the Faculty of Graduate Studies document, Periodic Review of Graduate Programs, calls on the self study to provide:

  1. Program description (objectives, areas of specialty, innovative features, particular strengths – with evidence, contributions to the university’s reputation and to the needs of the province and country, and the program requirements such as admission and course requirements)
  2. Human resources (faculty, support staff, and other)
  3. Physical resources (space, equipment, computer, and library)
  4. Graduate students (enrolment, completion, entrance GPA, employment on graduation, student support, publications)

It calls on the external review (The Review Committee) to: “assess the quality of the graduate program(s) and comment on the program(s) in relation to the stated strategic directions of the unit and the parent Faculty” and provides 15 review headings as reproduced in Exhibit 3.

Exhibit 3: University of Manitoba Review Headings

The Review Committee may be guided by, although need not be restricted to, the following headings. It is requested that the committee conclude its report by classifying the program(s) in one of the stated categories and provide justification for the category chosen. The Review Committee must articulate clear recommendations and/or priorities of choice where appropriate to do so.

  1. Strategic importance of the program(s) in relation to the strategic directions of the budget Faculty.
  2. Whether the concerns raised in the first-cycle review have been adequately addressed.
  3. Comparisons of related program(s) with which the review committee is familiar.
  4. Quality of graduate student supervision.
  5. Quality of students.
  6. Critical mass of students – mix of Masters vs. PhD, and Canadian vs. International.
  7. Time(s) to completion of degree.
  8. Excellence of the faculty and breadth of expertise.
  9. Impact of research done in the unit.
  10. Adequacy of facilities, space, and other resources.
  11. Strengths and weaknesses of the program(s).
  12. Extent to which program objectives are met.
  13. Advertising to prospective students – publications, website, events.
  14. Any recommendations for improvement.
  15. Classification of program(s) in one of the stated categories:

I – Continue as is; OR

II – Requires minor revision or restructuring to enhance effectiveness or appeal; OR

III – Major change, restructuring or amalgamation required to continue.

The purposes of program review

The University of Manitoba, Faculty of Graduate Studies, Periodic Review of Graduate Programs document provides a compelling description of the purposes of program review:

“There are many reasons why institutions conduct reviews or participate in evaluations of their graduate programs. The primary purpose of a program review is the improvement of graduate programs, as measured by the quality of the faculty, the students, library and other educational resources, the curriculum, available facilities, and the academic reputation of the program among its peers. Institutions of higher education, like individuals, require regular scrutiny and self-examination to improve, and the systematic review of academic programs is an integral part of this process of improvement. In the face of the many external pressures on institutions to review programs – from government, public interest groups, and accrediting societies – and the many internal pressures in the form of budget adjustments, space needs, and organizational restructuring, it is imperative that this primary purpose be kept in mind.

“In addition to the improvement of graduate programs, program review, whether at the provincial or institutional level, has several associated objectives or goals. For the individual university, program review helps in long-range planning and in setting both institutional and departmental priorities. It gives administrators and academic leaders critical information about the size and stability of a program, its future faculty resources and student market, its equipment and space needs, its strengths and weaknesses, and its contribution to the mission of the institution. It helps set goals and directions for the future, and ensures that overall academic plans and budget decisions are based on real information and agreed-upon priorities, not vague impressions or theoretical schemes.

“Program review also provides a mechanism for change. Graduate programs, like all social structures, evolve slowly; intellectual differences, bureaucracy, time pressures, vested interests, concern for survival, and simple inertia all make change difficult. By creating a structured, scheduled opportunity for a program to be examined, program review provides a strategy for improvement that is well-reasoned, far-seeing, and as apolitical as possible. Changes in graduate programs which are made in the heat of the moment or in response to a particular action (e.g., annual budget decisions, turnover in administrators, individual faculty promotions, student admissions decisions, or new course approvals) seldom contain the kind of solid information, broad collegial involvement, and careful thought which a program review promotes, and which is necessary for lasting program improvement.

“From an external point of view, program review has two very important purposes. First, it provides a mechanism whereby universities are accountable to society for their activities and for the quality of their programs. Provincial governments, funding agencies, private donors, taxpayers, and tuition-paying students can be reassured through the program review process that the institutions, which receive their support have graduate programs of high quality, are regularly reviewed and revised, and are responsive to the needs of the society and consistent with the aims and objectives of the universities involved. Second, program review assists the universities in their efforts to garner financial, philosophical, and political support from provincial government, federal funding agencies, and other constituencies. The information gathered in the review process, and the assessment of program strengths and needs, provide strong and compelling evidence of the quality of graduate programs, the areas of greatest need, and the foundation on which future improvements should be built. This information can and should support decisions about resource allocation, enrollments, special initiatives, research grants, and even private gifts. The stronger and more careful the program review process, the more persuasive the results.”


Office of the Dean of Graduate Studies, Simon Fraser University, Professional Master’s Programs, accessed 17 April 2019.

University of Toronto, University of Toronto Quality Assurance Process (UTQAP), accessed 16 April 2019.

University of Manitoba, Faculty of Graduate Studies, Periodic Review of Graduate Programs, accessed 16 April 2019.

Page created by: Ian Clark, last modified 2 May 2019.

Image: J&S CPAs, accessed 16 April 2019.