How to Make Sure an AoL System Is Working
- Too often, schools treat assurance of learning simply as an accreditation exercise. They collect too much data or the wrong data, or they fail to use their data to implement meaningful changes.
- Administrators must identify and address the pain points in AoL systems that make faculty reluctant to engage with and take ownership of the process.
- Seminars, conferences, and videos created by Ó£ÌÒµ¼º½ provide schools with resources for implementing and improving AoL processes to drive meaningful improvements.
Developing a robust assurance of learning (AoL) process is one of the most critical tasks a school can undertake to ensure continuous improvement. AoL processes measure what students have learned by the time they complete their programs, determine where student learning is deficient, and outline methods for improving student learning and performance.
Despite the importance of AoL, many business school administrators and faculty feel overwhelmed by the work of implementing an effective AoL system or modifying one that is not delivering the desired results. This fact is illustrated by the number of schools that were cited for AoL concerns in the 2020–21 and 2021–22 academic years as they were seeking Ó£ÌÒµ¼º½ accreditation. Among those that required a second year of continuous improvement review, almost 68 percent were cited by their peer review teams (PRTs) for deficiencies in assurance of learning.
However, if administrators can pinpoint the problems in their AoL processes, they can correct the way they collect and use data—and return the focus to student learning.
A Little Background
It’s useful to begin by reviewing the purpose of AoL. Most programs have two types of goals and objectives: those that identify content knowledge that students should master, and those that identify skill sets students should acquire. Of course, goals should be appropriate to the level of the program—those for a bachelor’s degree would differ greatly from those of a doctoral program.
Schools can gather information through both quantitative and qualitative measures, depending on their objectives. For example, a test can assess whether students have mastered program content, but a survey of employers can more accurately determine whether employers are satisfied with the skills that graduates have acquired. The key is for assessment stewards—administrators and faculty responsible for the AoL process—to conscientiously consider which source is the best measure of student learning.
Once assessment results are gathered, schools can use them to drive meaningful curriculum improvement in two ways. The first is to make changes within courses, such as adjusting assignments and reading materials, adding more tutorials and tutoring sessions, giving students more opportunities to practice their skills, or providing enhanced feedback. It is important for stewards to understand that improvements should be implemented not only in the courses where data have been collected, but also in other appropriate courses.
The second is to make changes to a program's curriculum. Improvements sparked by AoL at this level require navigating the curriculum management process and possibly creating new courses, changing entrance requirements, or modifying prerequisites. On most campuses, such curricular improvements take substantial time to implement, so they tend to occur less frequently.
Division of Labor
One of the first questions any school should ask when it is developing an AoL process is who should be responsible for organizing and maintaining it. The answer really depends on the context of each institution. A school that is assessing a few small programs will have a much different process than a school that has large enrollments in multiple programs.
In any case, certain individuals or groups should be assigned responsibility for specific assessment tasks such as setting goals and objectives, designing measures, mapping curricula, developing data collection plans, keeping the data collection process running, analyzing and disseminating results, and ensuring implementation of recommended improvements. When schools don’t have a clear division of labor, the AoL process is dysfunctional and inefficient; there are stops and starts in the assessment process, and balls get dropped along the way.
For example, problems can arise if the school isn’t intentional about determining competency goals and learning objectives. Generally speaking, the faculty who deliver the program should develop the program’s goals, although they often seek input from other stakeholders, such as employers, advisory boards, alumni, and students. However, if each program or department separately develops its own goals, creates its own measures, and collects its own data, there will be considerable duplication of effort, inconsistent data, and an inefficient process. Unfortunately, these problems will likely become evident only over time.
Therefore, stewards should consider the AoL process from a systems perspective, making sure that program goals and objectives are well aligned across a program’s courses so that students have a meaningful learning experience. If they approach the challenge that way, stewards can be strategic and efficient in how they make decisions about systems mechanics (including goals, objectives, and measures) and how they determine who should be responsible for what tasks.
Typical Missteps
There are several common mistakes schools make in their AoL processes:
They confuse course assessment with program assessment. Course assessment (grading) is about gathering data to determine whether students are learning in a single course. Program assessment is about gathering data to determine whether students are meeting competency goals and learning objectives across multiple learning experiences and courses throughout the program.
For example, it takes more than a single course for students to develop skills such as critical thinking, communication, and teamwork. Specified courses in the program are designed to advance students’ abilities from introductory-level learning to mastery as they progress through the program. If course assessment is masquerading as program assessment, meaningful program assessment is unlikely to be occurring.
They view AoL as a data collection process rather than a data use process. There are two purposes behind gathering AoL data: to identify where students need help and to determine if modifications driven by AoL data have improved student learning and performance. If schools collect mountains of unnecessary data, they’re just wasting faculty time and distracting faculty from what’s really important—changing the curriculum in ways that benefit students.
They view AoL as something they must implement to meet accreditation standards. When schools focus solely on achieving accreditation, they tend to set performance targets very low. But if schools are not actually using AoL activities to make improvements to their programs, they still will not comply with Ó£ÌÒµ¼º½ standards. Conversely, if AoL results in data-driven improvements that benefit students, the continuous improvement process is working as intended.
They fail to act upon the data. AoL measures often show that schools have not met certain goals and objectives, but stewards don’t always make meaningful improvements in response. Sometimes the problem arises because their AoL measures have yielded only muddy data that are not granular or specific enough to allow them to identify deficiencies and devise beneficial interventions.
In some cases, stewards find it difficult to take disparate course-level assessment measures and aggregate them into data that can be used at the program level. In other cases, they might view data at only the aggregate level, such as the average score on a test or a rubric, and thereby miss areas where students need assistance and improvement.
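To make the granularity problem concrete, here is a minimal sketch, in Python, of how a steward might disaggregate course-embedded rubric scores by learning objective instead of relying on one overall average. The file name, column names, four-point rubric scale, and 70 percent target are hypothetical placeholders, not a prescribed format.

```python
import pandas as pd

# Hypothetical export of course-embedded rubric scores.
# Assumed columns: student_id, objective, score (1-4 rubric scale).
scores = pd.read_csv("rubric_scores.csv")

# The overall average can look acceptable while hiding weak objectives.
print(f"Overall mean score: {scores['score'].mean():.2f}")

# Disaggregate by learning objective: mean score and the share of
# students meeting a hypothetical target of 3 or higher.
TARGET = 3
by_objective = scores.groupby("objective")["score"].agg(
    mean_score="mean",
    pct_meeting_target=lambda s: (s >= TARGET).mean() * 100,
)

# Flag objectives where fewer than 70 percent of students meet the target,
# so improvement efforts can focus on specific deficiencies.
print(by_objective[by_objective["pct_meeting_target"] < 70])
```

The same breakdown can be pushed one level further, to individual rubric traits within an objective, when stewards need to pinpoint exactly which skill components require intervention.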
By approaching AoL as a data-driven student improvement process, stewards can address mistakes quickly or prevent them altogether. When they do that, they will make strategically sound decisions as they design a process that is efficient and simple and has a meaningful impact on student learning.
The Major Pain Point
If a school wants to improve its AoL processes, the first thing stewards should do is examine what’s not functioning well so they can determine where the process is dysfunctional or inefficient.
Are there too many goals and objectives that make data collection onerous? Is the school focusing only on collecting data, rather than on incorporating results into the curriculum management process? Is the process overly complicated? Are faculty resistant to engaging in and taking ownership of the AoL process? By examining the pain points, administrators can determine how the process can be improved.
Faculty resistance to AoL can be one of the major obstacles. Even the best-designed AoL process won’t function if faculty don’t understand how it works, why it’s important, and how they can contribute. Moreover, faculty resistance can occur at any point in the process—whether the school is developing program goals and learning objectives, developing measures, collecting data, reviewing results, or implementing improvements.
The key is for stewards to discover why faculty are hesitant about participating in assessment activities. Do they resist because the AoL process is burdensome, time-consuming, or inefficient? Do faculty understand the basics of AoL, why it’s important, and how it can be used as a tool to help manage the curriculum? Do they understand how their participation in assessment counts in terms of their workloads and how it is a valued activity for instructors at all levels? Are faculty members afraid that administrators will use course-embedded program measures to evaluate teaching effectiveness (which is something that schools should never do)?
Administrators should listen to faculty concerns and then either dispel any mistaken notions faculty may hold or make adjustments to their systems.
A Role for Technology
Another common mistake is to seek technological solutions to problems that can’t be solved by technology. For instance, technological solutions can’t ensure that the AoL process is well-developed and smoothly implemented, nor can they ensure faculty engagement.
In fact, in some instances, technology actually acts as an impediment to meaningful AoL. Faculty often are unwilling to learn new software systems, and schools that rely on such systems only reinforce the mistaken notion that AoL is just a check-the-box data collection exercise.
That being said, after a school has designed and implemented a sound AoL process, stewards might find it advantageous to explore the possibilities of technology. For instance, once a school has developed rubrics for collecting data and a curriculum map to determine where measures might be embedded in courses, stewards can use course management systems (CMS) or learning management systems (LMS) to support data collection. Since most faculty already use a CMS or LMS to deliver their courses, they face only a minimal learning curve when extending these systems to AoL.
As another example, data visualization tools such as Power BI and Tableau can help stewards create engaging, high-impact graphics that highlight opportunities for improvement and suggest the types of improvements that are most likely to be effective.
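For stewards who prefer a scripted alternative to tools such as Power BI or Tableau, a comparable graphic can be sketched in Python with matplotlib. The objective names and percentages below are invented illustration data, and the 70 percent target line is an assumed threshold rather than any standard.

```python
import matplotlib.pyplot as plt
import pandas as pd

# Hypothetical summary table: share of students meeting the target for
# each program learning objective (e.g., output of an earlier aggregation).
summary = pd.DataFrame({
    "objective": ["Critical thinking", "Communication", "Teamwork", "Ethics"],
    "pct_meeting_target": [62, 81, 74, 58],
})

# Color the bars so objectives below the assumed 70 percent target stand out.
colors = ["tab:red" if pct < 70 else "tab:blue"
          for pct in summary["pct_meeting_target"]]

fig, ax = plt.subplots(figsize=(7, 4))
ax.bar(summary["objective"], summary["pct_meeting_target"], color=colors)
ax.axhline(70, linestyle="--", color="gray", label="Target: 70%")
ax.set_ylabel("Students meeting target (%)")
ax.set_title("Program learning objectives: assessment results")
ax.legend()
fig.tight_layout()
fig.savefig("aol_results.png")  # A file that can be shared or archived later.
```

Saving the chart as a file also makes it easy to circulate through whatever communication platform the school already uses.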
Likewise, once meaningful results have been produced from data analyses and have been captured in high-impact graphics, technological tools such as MS SharePoint or MS Teams can be used to disseminate results that could spark improvements. Furthermore, such communication technologies allow schools to archive results, which contributes to the sustainability of the AoL process over time.
Where to Find Help
If administrators and faculty want to improve AoL processes, but don’t know where to start, they can consider attending various events hosted by Ó£ÌÒµ¼º½. For instance, the Assurance of Learning I and II seminars cover subjects such as assessment terminology, how to design an AoL system, and how to gather data efficiently. The Innovative Curriculum Conference and the Accreditation Conference provide excellent opportunities for administrators to learn from Ó£ÌÒµ¼º½ experts and representatives from other schools.
Administrators also can watch Ó£ÌÒµ¼º½’s new video series about common mistakes that schools make in the AoL process. The series includes five modules that address specific areas of the assessment process: the development cycle, the measurement cycle, the closing-the-loop cycle, the efforts to build AoL culture and faculty engagement, and the development of indirect measures. Each module contains a video, slides, and a diagnostic exercise. Discussions cover the typical causes of common mistakes, their likely consequences, and potential solutions.
There are also other helpful resources. Indiana University–Purdue University Indianapolis (IUPUI) offers its annual Assessment Institute, and Drexel University holds an annual conference on teaching and learning assessment. In addition, administrators can consult the substantial scholarship on teaching and learning, although they should make sure that the resources focus on program-level rather than course-level assessment.
When stewards understand how to design and implement efficient AoL systems, they will be able to continuously improve programs and help ensure that students have the best possible learning environment to prepare them for their futures.