Answers to questions about the Academic Portfolio Review can be found in the accordion menu below. For updates on the implementation of the Academic Portfolio Review or to submit a question, please visit the OIRE Projects & Initiatives webpage.

As an additional reference, please see the Academic Portfolio Review Task Force's report, Recommendations for a Comprehensive Portfolio Review of Academic Programs at the University of Illinois Springfield.

Academic Portfolio Review FAQs

Last updated on March 21, 2025

Q1 - Why aren’t global experience courses included?

Global experience courses can be included in the Contribution to UIS Service Courses metric.

Q2 - Why aren’t student feedback, scholarly activity, and diversity metrics included?

The Academic Portfolio Review Task Force discussed these metrics, along with others that were not included in the final report. Reasons for exclusion include incomplete data, low response rates, and overlap with other university review processes that already cover these metrics. Programs may choose to discuss their performance on these metrics in their responses during the fall semester. These metrics may be included in future reviews.

Q3 - What is the definition of curriculum revision?

Curriculum revision is defined as a revision that required governance approval at the Campus Senate level. Any curriculum changes that did not go through governance at the Campus Senate level can be included in the Ongoing Program Assessment metric.

Q4 - Why does catalog maintenance bear weight in this evaluation process?

According to Criterion 2.B of the HLC’s Criteria for Accreditation, “The institution presents itself clearly and completely to its students and to the public...The institution ensures the accuracy of any representations it makes regarding academic offerings...” Including courses that are not being taught in our course catalog risks presenting our course offerings inaccurately to students and the public.

Q5 - What are the criteria for the evaluation of the assessment plans?

For this initial portfolio review, if the plan is current and shows evidence of thoughtful consideration, it will receive full marks.

Q6 - How do Lightcast data account for the highly varied career paths of students in liberal arts programs?

Lightcast data are based on the program’s 6-digit Classification of Instructional Programs (CIP) code. The CIP code is mapped to Standard Occupational Classification (SOC) data using the National Center for Education Statistics’ CIP-to-SOC mapping. As an example of the results we get when using Lightcast, the top occupations for students graduating from a program with the same 6-digit CIP code as our BA in Sociology/Anthropology are: (a) Managers, All Other, (b) Social & Human Service Assistants, (c) Compliance Officers, (d) Labor Relations Specialists, (e) Social Science Research Assistants, (f) Anthropologists & Archeologists, (g) Geographers, (h) Sociologists.

Q7 - Who is going to score each program? What are the mechanisms to reduce bias in reporting?

A data source has been identified for each metric. Depending on the nature of the metric, at least two individuals will be involved in collecting and/or validating the data to help ensure accuracy and completeness. As an additional validation check, each UEO or identified program contact will receive and review their data so that any questions or concerns about the data values can be addressed before programs are scored across the portfolio. This step will also help ensure that credit for grant work, community engagement, etc., is applied appropriately to programs.

Q8 - How will the data be determined for students who double-major?

The graduation rate metric depends on the major(s) declared at the time of entry to UIS. If a student declared a double major at that time, they will be counted in both programs. The program completion rate and time-to-degree completion metrics will include double majors.

Q9 - What is the consequence if our program does not complete the UEO survey?

Programs will receive a score of 0 for those metric items for which data are not received by the deadline.

Q10 - Why are a third of the programs going to be sunsetted or reformed?

Categorizing programs in this way helps ensure that the review is significant, thoughtful, and leads to decision-making. Reforming a program can take many forms, including reevaluating curricular offerings and requirements, marketing approaches, application processes, etc. Sunsetting or reforming programs can be part of a strategic effort to ensure that resources, both financial and non-financial, are allocated to the programs that best meet current needs.

Q11 - What role does the UPCEA report play in the APR process?

Programs that were included in the UPCEA portfolio review will receive their UPCEA review results alongside the UIS APR results.

Q12 - Which programs will receive feedback?

All programs will receive a dashboard report from the Provost’s Office, providing detailed feedback on their performance across various metrics.

Q13 - How should virtual and hybrid events be counted in the Community Engagement metrics?

Virtual events should be included in the Broad-Reaching category since they can be attended by people outside of the greater Springfield and central Illinois region. Hybrid events that have a local component can be included in both the Local category and the Broad-Reaching category.

Q14 - Doesn’t this scoring system create competition between programs?

Across the portfolio: Currently, programs are competing for limited resources. By reviewing all programs holistically, decision-makers will be better positioned to reallocate resources or identify programs that could make the best use of additional resources.

Within metrics: The Academic Portfolio Review Task Force agreed to this comparative scoring system within several metrics after identifying challenges in determining threshold benchmarks.