Using the Atlas to Formulate Questions for Program Review
… one way of Using the Atlas
Illustrative review questions on objectives, curriculum, assessment, and comparators
As described in the Background section below, program reviews typically involve a self-study and an external review, and they address a number of areas, from program design and student success to faculty quality and research output. The examples from two universities highlighted below illustrate that exactly what is to be reviewed can be described in different ways and organized under different headings. We identify four broad areas of review for which Atlas resources might prove useful: a) articulation of program mission and learning objectives; b) curriculum design; c) assessment of student learning; and d) selection of appropriate program comparators. The questions under these four areas are illustrative; they could be posed during the self-study and also by the external reviewers.
A – Questions on learning objectives
- Does the program’s statement of mission address the special requirements of a professional Master’s program as outlined in Professional Master’s?
- How does the program’s statement of mission compare with the expectations of the relevant accrediting body, such as the expectations listed in the mission-based standards of the Canadian Association of Programs in Public Administration (CAPPA), which can be found at CAPPA Standards for Student Competencies?
- How does the program’s description of learning outcomes compare with best practice recommendations such as those in Describing Learning Outcomes, and illustrated in ANU Crawford Learning Outcomes and Melbourne MPPM Learning Outcomes?
B – Questions on curriculum design
- How well does the required-course curricular content contribute to the student competencies expected by the relevant accreditation body, such as those specified in the universal standards for student competencies described in CAPPA Standards for Student Competencies?
- How well does the required-course curricular content align with the competencies expected of practitioners, such as the competencies set out in the Policy Profession Competencies Table?
- How well does the required-course curricular content align with the subjects and topics generally thought to constitute the MPP/MPA core, such as those set out in Table 1 of Atlas Topics?
- How does the conceptual rigour of the required courses compare with that in the required courses of comparable programs as illustrated, for example, in Top 50 Concept Readings for Toronto MPP Courses?
- To what extent do courses use live cases and real clients, and how does their use compare with that in highly regarded programs, such as those described in Live Cases and Real Clients in MPP/MPA Courses?
- How do the hours-of-study demands in each required course compare with those in highly regarded programs, as illustrated, for example, in Comparing Course Workload – Quantitative Methods?
C – Questions on assessment of student learning
- Does the program have a mechanism for assessing student comprehension of the key concepts taught in its courses, along the lines of Concept Comprehension Questions?
- How well would students score on multiple-choice quizzes that assess concept comprehension, for example, Quiz 7 – Fifty Concepts in Governance and Institutions, Quiz 8 – Twenty Concepts in Policy Analysis and Process, or Quiz 9 – Forty Concepts in Implementation and Delivery?
D – Questions on appropriate comparators
- Is the program appropriately categorized in MPP/MPA Curricular Types, and which of the other programs in that category would be appropriate comparators?
- Which other MPP or MPA programs listed in Programs are appropriate comparators?
- Are there design elements in any of the four Atlas Reference Programs that might provide useful models?
Background on program review, using two examples
In many universities, all graduate programs are required to undertake a program review on a periodic basis, typically every seven years. Program reviews involve a self-study followed by an external review conducted by persons from outside the program. The program review process is described below for two Canadian universities.
Examples – University of Toronto and University of Manitoba
The University of Toronto Quality Assurance Process (UTQAP) is described at https://www.vpacademic.utoronto.ca/reviews-academic-plans/. For ease of reference, three key documents have been converted to PDF and uploaded to the Atlas:
- University of Toronto, UTQAP Cyclical Review Terms of Reference
- University of Toronto, UTQAP Cyclical Review Self-Study Template
- University of Toronto, UTQAP Cyclical Review Report Template
The University of Toronto’s MPP program can be found at Toronto Munk and at https://munkschool.utoronto.ca/publicpolicy/.
The University of Manitoba Graduate Program Review process is described at http://umanitoba.ca/faculties/graduate_studies/admin/123.html and the key document has been converted to PDF and uploaded to the Atlas:
- University of Manitoba, Faculty of Graduate Studies, Periodic Review of Graduate Programs
The University of Manitoba’s MPA program can be found at Manitoba-Winnipeg and at http://umanitoba.ca/faculties/arts/departments/political_studies/master_pa/index.html.
Professional programs, program objectives, and learning outcomes
Programs that deliver the MPP, MPA, and similar degrees are often characterized as professional Master’s programs. The review processes at most universities do not distinguish between professional and non-professional programs. Nevertheless, the characteristics of Professional Master’s programs reproduced in Exhibit 1 below could be expected to be incorporated into a program’s objectives and learning outcomes.
Exhibit 1: Characteristics of Professional Master’s Programs
For many professional programs, there exist professional associations that provide guidance on quality assurance, including accreditation. For Canadian MPP and MPA programs that body is CAPPA, the Canadian Association of Programs in Public Administration (https://cappa.ca/en/). CAPPA operates a voluntary accreditation process, described at https://cappa.ca/en/what-we-do/accreditation/. For American MPP and MPA programs the professional association is NASPAA (https://www.naspaa.org/), and its voluntary accreditation process is described at https://www.naspaa.org/accreditation.
Program design, curriculum, assessment, and comparators
Program reviews typically include a review of the program design and the curriculum, including student assessment. This is illustrated by the bullets in Exhibit 2 below, reproduced from the program evaluation criteria in University of Toronto, UTQAP Cyclical Review Report Template. Program reviews also usually call for a comparative assessment. For example, the Template calls for an “Assessment of the division/unit and the program(s) under review relative to the best in Canada/North America and internationally, including areas of strength and opportunities.”
Exhibit 2: University of Toronto Program Evaluation Criteria
- Objectives
- Admission requirements
- Curriculum and program delivery
- Assessment of learning
- Quality indicators
- Additional graduate program criteria
- Quality enhancement
Self-study and external review
At the University of Toronto, the self study process is set out in University of Toronto, UTQAP Cyclical Review Self-Study Template and the external review process is set out in University of Toronto, UTQAP Cyclical Review Report Template.
At the University of Manitoba, the Faculty of Graduate Studies, Periodic Review of Graduate Programs document calls on the self-study to provide:
- Program description (objectives, areas of specialty, innovative features, particular strengths – with evidence, contributions to the university’s reputation and to the needs of the province and country, and the program requirements such as admission and course requirements)
- Human resources (faculty, support staff, and other)
- Physical resources (space, equipment, computer, and library)
- Graduate students (enrolment, completion, entrance GPA, employment on graduation, student support, publications)
It calls on the external review (the Review Committee) to “assess the quality of the graduate program(s) and comment on the program(s) in relation to the stated strategic directions of the unit and the parent Faculty” and provides 15 review headings as reproduced in Exhibit 3.
Exhibit 3: University of Manitoba Review Headings
The Review Committee may be guided by the following headings although not be restricted to them. It is requested that the committee conclude its report by classifying the program(s) in one of the stated categories and provide justification for the category chosen. The Review Committee must articulate clear recommendations and/or priorities of choice where appropriate to do so.
I – Continue as is; OR II – Requires minor revision or restructuring to enhance effectiveness or appeal; OR III – Major change, restructuring or amalgamation required to continue.
The purposes of program review
The University of Manitoba, Faculty of Graduate Studies, Periodic Review of Graduate Programs document provides a compelling description of the purposes of program review:
“There are many reasons why institutions conduct reviews or participate in evaluations of their graduate programs. The primary purpose of a program review is the improvement of graduate programs, as measured by the quality of the faculty, the students, library and other educational resources, the curriculum, available facilities, and the academic reputation of the program among its peers. Institutions of higher education, like individuals, require regular scrutiny and self-examination to improve, and the systematic review of academic programs is an integral part of this process of improvement. In the face of the many external pressures on institutions to review programs – from government, public interest groups, and accrediting societies – and the many internal pressures in the form of budget adjustments, space needs, and organizational restructuring, it is imperative that this primary purpose be kept in mind.
“In addition to the improvement of graduate programs, program review, whether at the provincial or institutional level, has several associated objectives or goals. For the individual university, program review helps in long-range planning and in setting both institutional and departmental priorities. It gives administrators and academic leaders critical information about the size and stability of a program, its future faculty resources and student market, its equipment and space needs, its strengths and weaknesses, and its contribution to the mission of the institution. It helps set goals and directions for the future, and ensures that overall academic plans and budget decisions are based on real information and agreed-upon priorities, not vague impressions or theoretical schemes.
“Program review also provides a mechanism for change. Graduate programs, like all social structures, evolve slowly; intellectual differences, bureaucracy, time pressures, vested interests, concern for survival, and simple inertia all make change difficult. By creating a structured, scheduled opportunity for a program to be examined, program review provides a strategy for improvement that is well-reasoned, far-seeing, and as apolitical as possible. Changes in graduate programs which are made in the heat of the moment or in response to a particular action (e.g., annual budget decisions, turnover in administrators, individual faculty promotions, student admissions decisions, or new course approvals) seldom contain the kind of solid information, broad collegial involvement, and careful thought which a program review promotes, and which is necessary for lasting program improvement.
“From an external point of view, program review has two very important purposes. First, it provides a mechanism whereby universities are accountable to society for their activities and for the quality of their programs. Provincial governments, funding agencies, private donors, taxpayers, and tuition-paying students can be reassured through the program review process that the institutions, which receive their support have graduate programs of high quality, are regularly reviewed and revised, and are responsive to the needs of the society and consistent with the aims and objectives of the universities involved. Second, program review assists the universities in their efforts to garner financial, philosophical, and political support from provincial government, federal funding agencies, and other constituencies. The information gathered in the review process, and the assessment of program strengths and needs, provide strong and compelling evidence of the quality of graduate programs, the areas of greatest need, and the foundation on which future improvements should be built. This information can and should support decisions about resource allocation, enrollments, special initiatives, research grants, and even private gifts. The stronger and more careful the program review process, the more persuasive the results.”
Sources
Office of the Dean of Graduate Studies, Simon Fraser University, Professional Master’s Programs, at https://www.sfu.ca/dean-gradstudies/administration/curriculum-planning/creating-new-programs/professional-master-s-programs.html, accessed 17 April 2019.
University of Toronto, University of Toronto Quality Assurance Process (UTQAP), at https://www.vpacademic.utoronto.ca/reviews-academic-plans/, accessed 16 April 2019.
University of Manitoba, Faculty of Graduate Studies, Periodic Review of Graduate Programs, at http://umanitoba.ca/faculties/graduate_studies/admin/123.html, accessed 16 April 2019.
Page created by: Ian Clark, last modified 2 May 2019.
Image: J&S CPAs, at http://www.jscpaapc.com/audit-review-and-compilation/, accessed 16 April 2019.