Upon granting unified accreditation to the University of Maine System in July 2020, our regional accreditor, the New England Commission of Higher Education (NECHE), asked us to prepare a self-study in advance of a Fall 2022 visit by a NECHE-appointed evaluation team. Standard Eight: Educational Effectiveness can be found on this page.

Description

Assessment of student learning outcomes

Each University of Maine System (UMS) university measures learning in its academic programs to improve student outcomes and the curriculum. Examples and descriptions of assessment processes and findings are shared throughout this Standard.

Assessment organization

Each university has an office or committee responsible for collecting and reporting data on academic program learning outcomes. Typically, the Provost’s office oversees academic program assessment efforts in coordination with university and/or UMS institutional research staff, university assessment staff, and the academic deans.

University of Maine (UM) and University of Maine at Machias (UMM) have an Office of Institutional Research and Assessment with an Assessment Coordinator and an Assessment Data Analyst, while University of Southern Maine (USM) has an Office of Academic Assessment. University of Maine at Presque Isle (UMPI)’s assessment work is housed with the Executive Director of Academic Development and Compliance. University of Maine at Farmington (UMF)’s education programs are assessed through a Teacher Education Accreditation office. University of Maine at Fort Kent (UMFK), UMF’s non-education programs, and the Law School use assessment committees comprising faculty and staff. At the University of Maine at Augusta (UMA), assessment is the responsibility of academic programs with assistance from an Office of Institutional Research and Assessment.

Student learning outcomes

Every university publishes learning outcomes for its academic programs on departmental, college, or university websites. Additionally, UMF, UMPI, and UMFK publish their outcomes in their course catalogs, and UMFK, UMA, and the Law School also publish course outcomes in course syllabi.

Graduate programs at UM participate in a program learning outcomes assessment process as defined in the UM assessment plan. A three-year phased approach has been taken for onboarding graduate programs. All programs are expected to have created program learning outcomes by May 2022. In the 2022-23 academic year, programs will map curricula and draft assessment plans. Reporting of annual assessment results and cohort-based three-year assessment reporting will start in 2023-24. Review of three-year assessment reports by the UM Office of Institutional Research and Assessment and the UM Assessment Advisory Board—with subsequent feedback to programs—will commence in 2024-25.

Assessment data collection

Tools and processes used to collect program assessment data vary across UMS, but all of the universities collect learning artifacts to measure outcomes. The most commonly reviewed artifacts are portfolios, papers, exams, projects, and capstones, evaluated alongside rubrics, student course evaluations, and surveys of students, graduates, and employers.

In UMF’s teacher education program, data collected every semester includes a dispositions assessment, an essential areas of teaching assessment, end-of-program surveys, a classroom management observation checklist, a teacher work sample/contextual factors analysis, and unit-wide lesson plans and rationales. Additional data is collected on an annual or biannual basis, including focus group data and employer and graduate surveys.

Some UMS universities have centralized and standardized their collection and storage of assessment data. For example, UMPI uses MaineStreet to collect students’ program learning outcomes scores and course grades, Google Drive to collect all faculty course evaluations and faculty-generated course assessments, and Google Forms to administer student surveys. UM uses Google Sheets and Forms for all assessment reporting. USM uses an in-house form—the Assessment of Student Learning Plan—that all academic programs complete at the end of each academic year. TK20/Watermark is used to collect and track student data in teacher education programs at UM, UMA, and USM.

As described in the university’s E Series data, UM’s kinesiology and physical education program evaluates student learning through an extensive set of assessments, including candidacy portfolio, pre-service teacher dispositions, lesson plans, seminar evaluation, Praxis test scores, student teaching reviews, student teaching portfolio review, and exit review. These are in addition to the assessment of key course artifacts (such as projects, tests, and essays).

Review of assessment data

At most UMS universities (e.g. UMPI, UMFK, USM, UMA, and UMF), the Provost’s office and the academic deans review program assessment data. At some universities, additional offices and staff review the data: at UMPI, for example, the Executive Director of Academic Development, academic program coordinators, and program faculty also review it, and at USM, the Office of Academic Assessment joins the Provost’s office and the deans. UM, UMM, and the Law School use committees. For example, at UM an Assessment Advisory Board evaluates all programs’ assessment reports using a detailed rubric and holds a feedback meeting with each program.

Although there is variation in the timelines for reviewing program assessment reports across the universities, most encourage or require annual reporting. UM, UMM, USM, UMA, and UMF all require annual reporting, with some exceptions, and UMA’s annual reports culminate in a five-year program review. UM also requires a cumulative report every three years. UMPI requires a program report for existing academic programs every seven years and for new programs every three years. The Law School reports on select outcomes each year, culminating with a complete report of all program outcomes every three years.

Examples of closing the loop

Examples abound of how assessment data has led to programmatic or curricular changes. UMPI has changed course requirements and sequencing based on student learning outcome data, while UMFK’s nursing program made curricular revisions based on the licensure pass rates of its graduates. The Law School added required courses after mapping its curriculum to its stated outcomes.

At UMF, the Division of Psychology determined that the writing proficiency of students in its Research Methods course did not consistently meet faculty expectations, despite students’ ability to describe and interpret their research findings accurately. In response, the division created a Sophomore Seminar to improve the writing skills of all psychology majors prior to enrollment in Research Methods. UMF has continued to use student essays to refine and improve the design and implementation of its Sophomore Seminar.

General education outcomes in USM’s undergraduate programs are reviewed annually by a Core Curriculum Committee that uses the university’s new course proposal process and related curriculum and policy changes to improve student learning. As USM’s E Series data indicates, factors considered include “[s]ignificant changes in curricular design (e.g. implementation of a new writing sequence), curriculum policies, transfer policies and equivalencies (e.g. establishment of new writing requirement transfer equivalents), and assessment approaches (e.g. writing faculty focus groups and plans for direct assessment of student writing products at three levels).”

General education

Each UMS university has established general education learning outcomes and processes for assessing those outcomes. For example, UMA gathers student artifacts for assessment from a small number of classes and plans to expand collection once its methodology is finalized. UM’s Faculty Senate recently voted to explore ways to modify and improve assessment. UMFK is working on a holistic approach and is augmenting its rubrics. General education assessment is a relatively recent undertaking across UMS: UMA began in 2013, UMPI in 2015, and UM in 2018.

UM assesses one general education category (out of nine) each semester using rubrics based on the Association of American Colleges and Universities (AAC&U) VALUE rubrics. UMPI uses in-class assessment, and UMFK uses e-portfolios. UMA and USM use different methods to assess different types of general education requirements. UMA uses outcome-based assessment, tasking each program with developing its own assessment; programs then use writing samples from English 101 to evaluate written communication, standardized tests (e.g. the California Critical Thinking Test) to assess critical thinking, and in-class quizzes to assess cultural diversity and ethics. USM uses tracking studies, syllabus/assignment review, student National Survey of Student Engagement (NSSE) responses, student focus groups, and other course-level surveys.

In 2014, UMPI embarked on an ambitious general education revision, converting its program over several years to an outcomes-based model. In the revised program, courses provide students with instruction and experience in five general outcomes: 1) Effective Written and Oral Communication, 2) Critical and Creative Thinking, 3) Quantitative and Scientific Reasoning, 4) Information Literacy, and 5) Global Consciousness and Intercultural Awareness. Together, the five outcomes are divided into 22 sub-outcomes.

UMPI faculty use the sub-outcomes to clarify what students will know, understand, and do upon completing their general education program. Faculty have designed formative and summative assessments that use General Learner Outcomes (GLO) rubrics to gauge student proficiency by the end of a course. Students who do not meet proficiency but are still developing their skills and knowledge can earn an NP (not sufficiently proficient) and are given additional time; students who do not meet proficiency in that time, or who did not earn an NP, do not pass the course.

Block transfer

While each UMS university operates its general education and assessment programs separately, there have been encouraging efforts to increase coordination. For example, UMM and UMPI recently changed parts of their general education requirements to bring them into closer alignment with UM’s requirements. In 2015, UMS established block transfer agreements between its universities and the Maine Community College System, facilitating student transfer within and into UMS.

The block description includes only existing common outcomes, with the understanding that each local general education program is more extensive and includes other outcomes. A working group of faculty and staff from the seven UMS universities used the LEAP Essential Learning Outcomes as a common framework and language for describing and forming common outcomes in general education programs across UMS.

Block transfer has been useful in helping UMS deliver degree programs across two or more of its universities, such as UMA’s partnership with UMF to offer the UMA nursing program to UMF students. UMF nursing students satisfy UMF’s general education requirements and then transfer them as a block to UMA, a move that frees those students from fulfilling some UMA-specific requirements.

Co-curricular assessment

Advising

In addition to program and general education assessment, UMS universities assess related functions vital to student learning and success. One of the most critical is advising. The universities follow broadly similar advising practices, but how they assess advising differs. In 2011, UMS developed a System-wide Advising Group (SWAG) to provide faculty and staff with a forum for sharing best practices and methods for assessing and improving academic advising.

Experiential learning

Each UMS university offers extensive opportunities for experiential learning, including internships, service learning, and field experiences. Assessment of these programs is conducted through student course evaluations and end-of-program intern and supervisor evaluations. At USM, Engaged Learning is included and assessed as one of thirteen general education outcomes. UMM requires a service learning experience in each major. The UM Experiential Programs Innovation Central (EPIC) program offers in-depth learning and skills in research, interdisciplinary experiences, new technologies, innovation, and design and prototyping through participating units: the Center for Undergraduate Research; the Advanced Manufacturing Center; the Center for Innovation in Teaching and Learning; the Foster Center for Innovation; and the Innovative Media Research and Commercialization Center.

Each UMS university offers study abroad, faculty-led travel courses, international internships, and student teaching experiences. International Programs offices make global education accessible through partnerships that permit students to pay tuition at their home universities (with the benefit of all scholarships and financial aid); through travel courses that are less expensive and shorter (for students who need to work during the academic year and/or are enrolled in programs with extensive academic requirements); and through scholarships that target low-income and first-generation students.

Community partnerships

Community-engaged scholarship, teaching, and learning are central to fulfilling USM’s commitment to the success of the region and the state. For example, USM’s Community Engagement and Career Development Office (CECD) works with community partners, faculty, and students to develop meaningful and impactful community-based learning experiences. To enhance its programs, USM is committed to ongoing assessment of community engagement as reflected in its annual CECD report.

The 9,000 square-foot UMF community garden serves as an educational center that integrates coursework, research, student clubs, campus events, and outreach in surrounding communities. Through the garden, growing community and growing food take place side by side. The vegetables produced are donated to on-campus food closets or pantries and off-campus food banks to help address food insecurity in Maine.

A recent example of both experiential learning and community partnerships is UMA’s co-taught Garden Seminar, which applies theory and research in organizational sociology and community psychology to the practice of growing a community garden and maintaining a successful student organization to support it. In 2019, UMA and its partners harvested 695 pounds of vegetables delivered to food-insecure families through the Augusta Food Bank.

Student retention, course completion, and graduation

UMS Institutional Research maintains public student success metrics for all universities, in addition to metrics tracked and reported by university Institutional Research (IR) and assessment staff. These resources allow UMS to track enrollment and retention over time at all seven universities and the Law School and study student progress, course completion rates, graduation rates, and transfer activity.

The universities collect and analyze data at multiple points in the year and for different purposes. Participating in surveys such as the NSSE helps UMS and university leaders assess the overall quality of the academic environment and the intellectual and personal engagement of students.

There has been strong collaboration and administrative coordination among UMS universities to respond to the challenges posed by fluctuating retention rates. Additionally, universities have developed a suite of analytical reports to guide academic departments in evaluations of their curriculum and course offerings for enrollment planning purposes. Further planning resources are expected in phase II of the Education Advisory Board (EAB) Navigate implementation as more data become available for predictive analysis.

UMS has several mechanisms for awarding undergraduate credit based on prior learning, each governed by policies outlined in the undergraduate catalogues, including the College-Level Examination Program® (CLEP), Advanced Placement (AP), DANTES Subject Standardized Tests, ACE review of certifications or credentials, military training, and challenge exams. In addition, prior learning assessment (PLA) offices and transfer officers across UMS work with academic departments in leading students through a portfolio review process as needed.

Appraisal

Update on assessing academic programs across UMS

UMS continues to develop and refine its processes for the systematic review of academic programs. To maintain a cycle of continuous improvement, academic program evaluation occurs at all levels, from UMS reporting on strategic priorities to local program review and assessment. Currently, academic review falls under three distinct processes: the Annual Academic Program Report (AAPR), the Academic Practice Letter (APL) on academic program review, and external accreditations at the program level. Taken together—and conducted in conjunction with qualitative measures such as student evaluations, stakeholder group feedback, and benchmark data—these processes provide a thorough and effective review of the academic program portfolio.

Originally intended as a mechanism for fostering broader collaborative discussions among faculty and academic administrators, the AAPR has evolved since 2018 to serve as an annual assessment of the health and sustainability of academic programs. The AAPR provides a set of metrics for academic leaders at each UMS university to use to identify challenges, opportunities for collaboration, and strategic academic goals.

Cumulatively, the universities’ annual assessments create a System-wide structure addressing critical state needs, with a newly established System-wide assessment group taking responsibility for evaluating the AAPR and confirming the efficacy of its methodology and data. The assessment group will use benchmark data and metrics from peer institutions and NECHE to develop robust performance metrics. Those metrics will be combined with data collected from student and stakeholder groups to inform the review process and continuously improve individual programs and the overall portfolio.

The AAPR has undergone two major revisions since 2018, and UMS has reconfigured its data collection format to add new metrics based on two years of collected data. The data guides program approvals and decisions to suspend or eliminate programs at all UMS universities. For example, USM recently added a faculty position to its Tourism program, while other reviews have prompted improvements in marketing, curricular redesign, and shifts in (or additions to) course modalities.

The APL process assesses programs at the university level. These five-year reviews help program leaders evaluate curriculum and research offerings and gauge overall program vitality in relation to the rest of a university’s academic portfolio. University-level review is further supported by evaluation requirements and findings of professional program accreditors, and by feedback from NECHE. The results of external reviews provide longitudinal checks on programs as benchmarked against national best practices.

General education assessment

General education assessment continues to develop and be embedded as institutional practice. Some UMS universities are further along than others.

For example, UMPI’s initial assessment of its general education efforts shows that, from fall 2015 to spring 2018, a high percentage of students achieved proficiency or advanced proficiency in the 22 general education learning outcomes, as evidenced by assessment scores. At UM and UMA, general education assessment cycles are progressing well but not rapidly. Although considerable efforts have gone into designing and administering assessment tools and gathering assessment data, some programs have been slow to close the loop and enact curricular changes based on their review of student learning outcomes.

Assessing student demographics and academic changes

Maine’s high school graduation rate of 87% is comparable to the New England average of 88% (2019). College enrollment rates, however, reveal a widening gap: Maine’s college enrollment rate decreased from 62% in 2011 to 58% in 2018, while the New England rate increased from 62% to 66% over the same period.

From 2015-20, there was an 11% decrease in total credit hours taken by in-state undergraduate students, while out-of-state undergraduate (40%), New England Board of Higher Education (NEBHE) Regional Student Program (9.4%), and Early College (95.8%) students saw increases in credit hours attempted. There was an overall 11.7% increase in credit hours taken by graduate students across all groups and universities. Total credit hours (undergraduate and graduate) taken from 2015-20 grew 0.4%. First-year average credits taken and passed improved on average across UMS between 2013 and 2019: a 1.1% increase in credits taken and a 1.3% increase in credits passed.

Rates of low and failing grades improved across UMS from 2015-20. UMF historically had the lowest rates of any UMS university; its rates have ticked upward while others have begun to decline. UM’s efforts had a significant impact in improving its rates from 2018-19 to 2019-20. From 2014-19, cumulative UMS fall-to-spring return rates revealed a decline in retention of first-year and second-year students, but there were overall gains in fall-to-fall return rates among all student groups in that period.

At universities that used the EAB Navigate tool for at least two semesters and scheduled student appointments through it, students who interacted with advisors in that medium persisted at higher rates in both semesters than those who did not. This is notable: these were students whose faculty and professional advisors had concerns about them in the first place. This data is helping UMS emphasize the value of participation in structured progress reports and the importance of time-intensive, high-touch interventions.

Early College outcomes

A strong majority of UMS Early College participants go on to enroll in postsecondary institutions following graduation. Across five recent cohorts (2014-19), between 70.5% and 74.2% of UMS Early College students later enrolled in college. This stands in contrast to the overall college-going rate among all Maine high school graduates, which has averaged 62.0% over the last seven graduating classes.

UMS Early College participants increasingly go on to enroll in a UMS university. Of the fall 2014-summer 2015 cohort, 31.8% of participants went on to enroll in UMS. This figure has increased year over year, with the fall 2018-summer 2019 cohort showing the highest level of subsequent UMS enrollment at 36.4%.

The average UMS-going rate among all Maine high school graduates was 19.7% for the last seven graduating classes. Of those who go on to enroll in UMS, the majority enroll at UM (42.5-53.8% of the last five cohorts) or USM (10.3-21.4%). Of UMS Early College participants who subsequently enrolled outside UMS, most chose to attend an in-state university (between 38.4% and 52.0% over the last five cohorts).

Early College (EC) students are more likely to persist in their first year of college. Males and underrepresented students benefit most, with retention rates 12% and 14% higher, respectively, than those of peers who do not participate in EC.

Course evaluations

Course evaluations are given to students in all classes in accordance with the Associated Faculties of the Universities of Maine (AFUM) contract. Evaluations are delivered through a commercial platform (Explorance Blue) or a university’s campus portal. Course evaluation questions vary slightly by university and program. Questions typically use a Likert scale and also provide students with space to offer comments, giving faculty quantitative and qualitative data about students’ experiences and views on instructional effectiveness.

The online course evaluation process has the advantages of not absorbing class time, saving on resources required for paper surveys, and allowing for quicker analysis and sharing of results.

Most UMS universities did not actively track response rates for paper-based course evaluations, but anecdotally, completion rates were high. In general, response rates are lower for electronic completion. Some faculty and instructional support staff attribute that to instructors not providing in-class time for students to complete evaluations. (UM did a small study in Fall 2019 comparing response rates when in-class time was provided versus when it was not. When it was allotted, the mean response rate was 64%, compared to 49% when it was not.)

UM did see a dip in response rates in fall 2020 compared to fall 2019, but attributed that to the pandemic. UM’s response rate in spring 2021 was 44.4%, while UMM’s was 51.6%. UMM’s response rates rose from previous semesters, an increase attributed to the adoption of a new instrument containing fewer questions.

Evaluating student completion trends

A review of UMS completion reports (2011-20) reveals several common themes. Across UMS, women consistently represented over 60% of graduates, and the highest number of degrees and certificates conferred by discipline was in health professions, followed by education and business disciplines.

During that same span, UMS universities saw an increase in completion of certificate and master’s degree programs and a decline in associate degrees, law degrees, and doctorates. UM and UMF produced high graduation rates compared to other UMS universities, and USM steadily improved its graduation rates over the last four years.

Projection

Growing the culture of assessment

Each UMS university has worked to grow its culture of assessment. These efforts will continue as a UMS assessment identity is developed. In support of effective assessment oversight and practices, a UMS Assessment Committee advisory to the Vice Chancellor for Academic Affairs (VCAA) will consider ways to build System-level reporting and develop and share assessment best practices and tools. Consistent with goals identified in the June 2020 UMS substantive change request, the Committee will also work with the VCAA on a common language for assessment and student success, identify ways to help academic programs measure student learning, and advise the VCAA and Chief Academic Officers (CAOs) on assessment reporting processes.

Improving data-sharing and communication about data

Strengthening communication between UMS and the universities and Law School about data sharing will be an ongoing priority for the UMS Office of Institutional Research, Data Governance staff, and university IR and other functional areas. UMS recognizes that shared definitions of student success must be developed and then supported by consistent and reliable data definitions, services, and communication. UMS will seek to integrate different areas of assessment (e.g. student affairs assessment and academic assessment) to develop a clearer picture of how UMS universities and the Law School are preparing students. This work was interrupted during the pandemic, compelling some universities, such as UM and UMM, to defer assessment reporting for a year. While university-level decisions about how to assess students and programs remain important, it is likely that establishing a centralized means of reporting assessment data will enhance accountability and lead to analyses that serve all.

Data and tables for Standard Eight: Educational Effectiveness are available in the campus Data First Forms linked below.
