
AQIP Category 7

MEASURING EFFECTIVENESS

 


Context for Analysis

Category Seven Contents: 7P1 | 7P2 | 7P3 | 7P4 | 7P5 | 7P6 | 7P7 | 7R1 | 7R2 | 7R3 | 7I1 | 7I2


The University of Saint Mary has made significant changes to its processes and improvements for Measuring Effectiveness since the 2008 portfolio review.  Processes have become more closely aligned with the Mission, Strategic Plan, and Ongoing Improvement model, as evidenced by more consistent evaluation of why, when, and how data are collected and used across the institution.  Faculty and staff communicate more consistently throughout the duration of action projects, from the declaration of a project to the reporting of its findings.  Two key committees, the Assessment Committee and the Process Improvement Committee, monitor and assess academic and non-academic processes with increased regularity and with increased input from faculty and staff.  In addition, the hiring of an Institutional Researcher in April 2012 has begun to centralize the data that are collected and shared.  A total of 10 action projects have been developed since 2008, and their results have contributed to the institution's awareness of areas needing improvement in its processes for measuring effectiveness.

Improvements in measuring effectiveness since 2008 are also more stable and more inclusive.  USM has become better aligned with external benchmarks, such as those of accrediting bodies, and has standardized its use of measures to assess student achievement.  Internally, technological improvements have stabilized, and made more reciprocal, communication among the Board of Trustees, the administration, faculty, and staff.  The first institution-wide Assessment Day, in October 2012, will focus on internal and external measures and will generate discussion to clarify direction for achieving departmental and institutional goals.

Overall, USM has matured over the past four years, but the university continues striving to improve communication across the institution and to maintain consistency of personnel in strategic committee positions, both of which enhance its ability to regularly monitor how effectiveness is measured.


7P1 Selecting, managing, and distributing data and performance information to support instructional and non-instructional programs and services


USM data selection and performance information are in direct alignment with, and driven by, its mission.  USM's Strategic Plan, University Learning Outcomes (ULO), and Ongoing Improvement Model (OGI) provide direction for the collection, analysis, and application of data to adjust and improve programs and services at the instructional and non-instructional levels.  AQIP Action Plans are designed to target specific process areas needing improvement across the institution.  Feedback to faculty, staff, and administration is dynamic and continual, with a concentrated focus on closing the loop as data selection and collection processes remain in concert with the evolving needs of the university.  The Assessment Committee, comprised of faculty and staff, monitors and oversees data management for the ULOs and OGI relating to instructional programs. The Process Improvement Committee (PIC), comprised of key staff directors, meets weekly to identify and assess how departments such as financial aid, admissions, the registrar, and information systems integrate for optimal functioning.

Figure 7P1-1 graphically depicts the centrality of mission and student learning in the OGI process, which utilizes a data-based, collaborative decision-making approach.

Figure 7P1-1: In Pursuit of Excellence… Circle of OnGoing Improvement






7P2 Selecting, managing, and distributing data and performance information to support planning and improvement efforts


Financial, recruitment, enrollment, academic, and student support key indicators described in the 2008 portfolio remain in place in 2012 and are listed in Tables 7P2-1 through 7P2-3.  It is important to note that the OGI model continues to be the standard for selecting, analyzing, and utilizing the data from these sources to make ongoing improvements campus-wide.  Based on the 2008 portfolio feedback, the review schedules were re-assessed to determine whether their frequencies provided systematic information; the analysis indicated that no changes would improve the process at this time.

TABLE 7P2-1: Financial Indicators

Report | Review Frequency
Budget | Monthly and Quarterly
Annual Fund | Monthly
Investments | Quarterly
Cash Flow | Weekly
Fund Raising | Quarterly
Tuition Discount Rate | Annually
Core Expenses per FTE | Annually

TABLE 7P2-2: Recruitment/Enrollment

Report | Review Frequency
Admissions (1. Main Campus; 2. Overland Park Campus/Extended Sites; 3. Online) | Weekly
Enrollment | Quarterly
Retention (1. First-time Full-time Freshmen; 2. Transfer Students; 3. Athletes) | Semester
Graduation Rates (4, 5, and 6 years) | Annually
Growth/Decline Rates & Faculty-Student Ratio | Annually

TABLE 7P2-3: Academic and Student Support

Report | Review Frequency
Number of students in each major by department | Semester
Faculty Teaching Loads | Semester
Faculty Evaluations | Semester
Course Evaluations | Semester
Course Enrollment | Semester
Assessment Reports | Semester
Number of students in practica/internships | Annually
Institutional Academic Support | Annually
Student Services Support | Annually
ACT Average | Annually
Faculty Status (1. Tenured; 2. Untenured, tenure track; 3. Non-tenure track) | Annually
Rank and Tenure (1. Faculty Evaluations/R&T; 2. Assessment Reports) | Annually
Academic Annual Reports/OGIs/ABCs (1. Innovations & Improvements; 2. Accreditation Criteria; 3. Professional Development; 4. Activity Based Costing) | Annually
5-Yr Program Reviews (1. Alumni Surveys; 2. Enrollment Trends; 3. Curricular Changes; 4. General Education) | Annually
Criteria, Indicators Met (1. AQIP Reports, Annual Reports; 2. NCATE, IACBE, CCNE, APTA; 3. CLA, NSSE) | Annually; CLA and NSSE administered pre- and post- to freshmen and seniors

 


The evidence presented in 7P2 and 7P4 demonstrates that we meet Core Component 5D: the institution works systematically to improve its performance. In some areas, as demonstrated in our last portfolio, USM has done well in documenting its performance through the gathering and analysis of pertinent data; this has been especially true in university finances and financial planning.  In areas where we were not as strong we have improved, for example in gathering and analyzing data related to recruitment, and especially in our academic and support areas.  The reader should look at Category One, in addition to this section, for further information on academic assessment, where our improvements have been significant and promising.  Further information on selecting, managing, and distributing data can also be found in 7P1.  In 7P2 we have presented, in table form (which we believe is easier to follow), information on what data we collect, how, and when, including Financial Indicators, Recruitment/Enrollment data, and Academic and Student Support information.  In 7P4, we report that two major committees have undertaken the responsibility to increase the availability of data to administration, faculty, and staff through shared drives, Jenzabar, and eSpire.  The Assessment Committee takes the lead on gathering and analyzing data on academic performance and returns those data to faculty at our annual Assessment Day.  The Administrative Council, consisting of the President and Vice Presidents, with the help of our Institutional Researcher/Data Analyst, is responsible for gathering data in other areas.



7P3 Determining the needs of departments and units related to the collection, storage, and accessibility of data and performance information


Realizing the need to become more transparent in this area, USM has made concerted efforts since 2008 to identify more systematically what data are collected and how they are stored and made accessible.  The Academic Leadership Council, comprised of the Academic Vice President and Department Chairs, meets bi-weekly to discuss the academic issues of the institution.  One issue identified since 2008 is the need for standardized course evaluations across departments, a project that is not yet complete but in process.  A second issue is the accurate tracking of majors within departments; efforts have begun, in collaboration with the Registrar's office, to improve the tracking of these students.  Non-academic issues are monitored by the Process Improvement Committee, which provides direction to departments regarding recruitment, enrollment, advising, financial aid, student accounts, student life, alumni, and development systems.  Data are stored in Jenzabar and retrieved via InfoMaker; supplementary software such as Excel and Access allows departments to work with information in a decentralized way. The university has enhanced the withdrawal process to ensure that it knows which areas students identify as obstacles to retention, giving it a stronger understanding of its performance and service to students.  USM has also identified that this newly captured information needs to be shared across campus so that direct reports from students are visible and the university can become more proactive in serving them.



7P4 Analyzing and sharing data and information regarding overall performance throughout the institution


Since 2008 two major committees have undertaken the responsibility to increase the availability of data to faculty and staff through shared drives, Jenzabar, and eSpire.  The Assessment Committee, a long-standing committee comprised of faculty and staff, adjusted its goals to become more proactive in sharing information with faculty and staff and more conscientious in its efforts to make academic improvements. An Assessment Day, the first of its kind, is planned for October 2012 to specifically address these issues. The second committee, the Process Improvement Committee, was formed as a result of the 2008 accreditation process and meets weekly to monitor and assess administrative processes across the institution.  In addition, an Institutional Researcher was hired in April 2012; this position is responsible for the research and production of IPEDS, 20th-day, and Kansas reports, as well as assisting with the analysis of ULO, CLA, and NSSE data. The Administrative Council, comprised of the President and Vice Presidents, serves as the liaison between departments and the Board of Trustees, regularly receiving input from department chairs and directors to make strategic plan recommendations.  The Administrative Council also shares information regularly through monthly University Assembly meetings with faculty and staff.

Although data from the CLA and NSSE have only just been received and have yet to be analyzed with respect to the university's learning outcomes, we believe that the evidence presented in 7P2 and 7P4 demonstrates that processes are now in place to meet Core Component 5D (addressed in 7P2).



7P5 Determining the needs and priorities for comparative data and information; criteria and methods for selecting sources of comparative data and information within and outside the higher education community


One criterion for selecting comparative data outside the university is the use of benchmark standards set by state and national accrediting bodies for the education, business management, nursing, psychology, and, most recently, physical therapy and health information management programs; these bodies provide quantifiable admission and graduation standards for their respective programs.  In fall 2011 USM began using the CLA and NSSE instruments to obtain academic performance scores on its students (freshmen and seniors) for comparison with national scores.  These data provide the university with percentile rankings of its students on measures such as critical thinking, writing, and reading comprehension.  A second criterion for external comparison is data from the Kansas Independent College Association (KICA) and the Council of Independent Colleges (CIC), which allow USM to know how it compares with similar institutions on salaries, enrollment trends, graduation rates, and financial aid.  It is also important to note that Noel-Levitz, an academic consulting firm, was contracted to provide assistance and recommendations in areas such as enrollment and retention strategies. The third criterion for external comparison is recognition by the National Association of Intercollegiate Athletics (NAIA) Champions of Character and Academic All-Americans programs, as well as recognition from the Princeton Review and the Council for Higher Education Accreditation (CHEA).  Internal comparative criteria include admission demographic data such as ACT scores, high school GPA, transfer GPA, and athletic eligibility, which are tracked longitudinally to determine academic and non-academic programmatic needs.  Equally important are similar data collected on graduates; the collection, analysis, and use of alumni data, including employment and graduate school profiles, need to be improved.



7P6 Ensuring that department and unit analysis of data and information aligns with institutional goals for instructional and non-instructional programs and services and sharing this analysis


As a result of feedback from the 2008 portfolio, a significant change to improve the alignment of institutional goals with instructional programs was the reduction of our eight university learning outcomes to four.  This was accomplished by involving all faculty in determining how the four outcomes would continue to give academic programs direction in the teaching and learning process as the university focused on its mission.  The Assessment Committee then re-designed a matrix to measure student achievement of these learning outcomes (see Link 7P6-1: University Learning Outcomes Assessment Forms Spring 2012).  Data were provided by faculty at the end of the spring 2012 semester, and the results will be presented in October 2012 at our Assessment Day.  A second significant improvement to increase the alignment of goals with programs and services was the hiring of an institutional researcher whose primary focus is to oversee the collection and sharing of data that will provide ongoing improvement for all programs across the university. USM's efforts at "closing the loop" specifically address the concern of the 2008 reviewers, and it is the university's goal to continue to strive for increased efficiency in measuring institutional effectiveness. Non-instructional programs are guided by USM's Strategic Plan, with emphasis on retention and on strengthening communication between, and within, departments.  Departmental annual reports are submitted to the respective vice presidents, then summarized and submitted to the President for her report to the Board of Trustees.



7P7 Ensuring the timeliness, accuracy, reliability, and security of information system(s) and related processes


The timeliness, accuracy, and reliability of the university's systems and processes are monitored by the Process Improvement Committee, which meets weekly to discuss issues of timeliness and security of information.  Information Services staff oversee and monitor systems and processes throughout the university.  Security of data is maintained internally through firewalls and software protection and externally through a daily back-up system that stores data off-site. An imaging system was purchased in fall 2012 to provide paperless storage and retrieval of data.



7R1 Measures of the performance and effectiveness of our system for information and knowledge management that are collected and analyzed regularly


Realizing the need for more continuous and systematic collection and analysis of effectiveness measures, USM increased its use of data provided in annual reports, student learning outcome reports, and rank and tenure and post-tenure reviews.  In addition, Information Technology satisfaction surveys were put in place and an Institutional Researcher was hired in April 2012.  The Process Improvement Committee (PIC) was formed and meets weekly to review transactional processes and identify areas needing improvement.  The PIC consists of key personnel representing a wide variety of university departments (e.g., Registrar, Admissions, Information Services, Finance & Administration, and Student Life) who oversee the effectiveness of process changes across the institution and design adaptations as necessary.  The Director of Information Services maintains agendas for the weekly PIC meetings and determines project priority based on feedback from module managers.



7R2 Evidence that the system for measuring effectiveness meets the institution’s needs in accomplishing its mission and goals


Significant steps have been taken since 2008 to improve USM's system for measuring effectiveness and to demonstrate if, and how, the university is accomplishing its mission and goals.  A total of 10 action plans were developed to strengthen this process, all aimed directly at improving how effectiveness is measured. Progress and completion data from the action plans were regularly reported to faculty, staff, and the Administrative Council via departmental meetings, the Academic Leadership Council, and University Assemblies.  USM has also revised its Strategic Plan to reflect new initiatives, particularly those geared toward improvements, additional programming/majors, and retention issues. Table 7R2-1 summarizes evidence of system effectiveness highlights.

Table 7R2-1: Evidence of System Effectiveness Highlights


Student Learning
• Program and University databases collect, store, and analyze student achievement on outcomes; updated each semester; flexible reports for faculty inquiry
• Jenzabar reports run every semester for Dean's List and probationary students
• Faculty-Student Ratios calculated by semester and over time

Institutional Operations
• Weekly admissions reports run at the department level and provide status; results direct weekly objectives
• Early registration reports run to monitor returning vs. drop-out numbers
• Course enrollment reports run to determine specific course needs (additional sections, technology, etc.)
• Financial report design allows cash flow to be run weekly and full budget reports quarterly; short-term and long-term monitoring needs are built in; report criteria aligned with annual comparative analysis done by KICA/CIC
• Security and safety reports integrated within a national database for comparative analysis

Collaborative Partnerships
• Service Learning survey results compiled in the office and summary results presented to faculty and the VPAA
• Systematic reporting of goals and results to the Kansas Compact
• Volunteers tracked and recorded in the Development office and reported in annual reports
• Level of giving tracked through Jenzabar reports and compared with annual goals

Strategic Initiatives
• Metrics determined during the planning process; data needs identified; reporting system identified and constructed; quarterly reports used by the AC and submitted to the BOT

 


7R3 How results for the performance of processes for measuring effectiveness compare with the results of other higher education institutions and, if appropriate, of organizations outside of higher education


As a result of feedback from the 2008 portfolio, USM made significant improvements in its efforts to measure effectiveness in comparison with other institutions of higher education and with national norms of student performance.  Evidence from two representative programs at USM illustrates comparison data and benchmarking with peer institutions. The Nursing Program, accredited by the Commission on Collegiate Nursing Education (CCNE), assesses program effectiveness annually. Ten factors of program effectiveness are assessed and compared both to six select institutions (local and of similar size) and to all institutions (national and of similar size).  On overall program effectiveness, USM ranks 3rd among the closely selected comparison group of six other institutions; in the expanded comparison, USM ranks 86th out of 234 institutions. Sub-ratings on the ten factors (quality of nursing instruction, work and class size, course lecture and interaction, facilities and administration, classmates, professional values, core competencies, technical skills, core knowledge, and role development) place USM either 2nd or 3rd of the seven schools in all but one factor (facilities and administration, where USM ranks 4th). Expanded ratings on the ten factors range from 18th out of 234 (technical skills) to 118th (facilities and administration), with all the rest being 86th or better.

USM also tracks nursing licensure pass rates within Kansas as well as initial licensure in other states (with several external comparison groups). USM reported a 94% licensing pass rate in 2011, compared to a national rate of 87%, a similar-program rate of 89%, and a jurisdiction rate of 86%. While USM's pass rate dropped slightly from 2010 (97%), it was up significantly from 2009 (82%). Moreover, the lower pass rate reflects a national trend, as 2011 was the first year of testing under a revised passing standard set by the National Council of State Boards of Nursing.

The Education Program compares our candidates' performance on standardized testing with the state average. The Summary Pass Rate indicates the total number of students taking all assessments (elementary and secondary), the pass rate for the institution, and the pass rate for the entire state, which allows for comparison.  In 2010-2011 USM reported a 100% pass rate across elementary and secondary education graduates, compared to a statewide average of 93%. This was up from 92% in 2009-2010 (statewide 95%) and 93% in 2008-2009 (statewide 96%). The secondary content areas (biology, chemistry, English-language-literature-composition, and mathematics) each graduated only one candidate, so individual statistics are not comparable. Additional data for the undergraduate and graduate programs are available on our NCATE webpages: http://www.stmary.edu/ncate.

USM is interested in developing a wider assessment of students with external benchmarks. In fall 2011 USM implemented the Collegiate Learning Assessment (CLA), and in spring 2012 it implemented the National Survey of Student Engagement (NSSE), which is designed to obtain information about student participation in programs and activities, to promote student engagement in activities inside and outside the classroom, and to facilitate national and sector benchmarking. New, first-time USM freshmen were administered the CLA (92% participation). While there are no internal comparison data yet (USM freshmen compared to the same students as seniors), the fall 2011 CLA indicated an overall 11th percentile rank (performance task 12th, analytic writing task 10th, make-an-argument 10th, and critique-an-argument 11th, with an entering academic ability ranking of 32nd). Data collected in spring 2012 on the CLA and NSSE have only just arrived and are being analyzed as measures of effectiveness.



7I1 Recent improvements made in this category; how systematic and comprehensive are processes and performance results for measuring effectiveness


Numerous improvements have been made to close the loop more effectively in measuring effectiveness.  Academically, the reduction of eight learning outcomes to four resulted in more consistent evaluation of student learning by faculty.  USM now has in place a procedure for data collection each spring semester and an institutional researcher who is responsible for analyzing the data and reporting results to faculty on Assessment Day the following fall semester.  The university has also made sincere efforts at closing the loop for non-academic services by forming the Process Improvement Committee, which meets weekly to identify projects and review systems in place, or in need of adjustment, to strengthen how institutional intra-communication and collaboration are measured.  Greater diligence has been given to measuring effectiveness against comparable higher education standards of student learning through the administration of the CLA and NSSE beginning in fall 2011 and spring 2012, respectively.  In addition, departments such as education, nursing, and business that are required to meet national accreditation standards are utilizing those benchmarks as measures of effectiveness in their respective programs.  Improvements in technology have made generating and retrieving weekly, monthly, and annual reports easier and have made these reports more accessible to the people involved in the reporting process.  The university recognizes, however, that continual improvement is needed, and action plans will be developed to address these needs.



7I2 How culture and infrastructure help us to select specific processes to improve and to set targets for improved performance results in measuring effectiveness


With reference to 7I1, the Assessment Committee has already begun re-evaluating the data collection matrix for the four learning outcomes.  For example, few courses have been identified for certain learning outcomes, whereas for other outcomes the identified courses do not appear to provide the best measure of that outcome.  There is still concern about the validity and reliability of the data from faculty, given the subjective nature of the ratings that faculty provide about student learning; this needs serious attention, and an action project developed to address this concern may be the next step.  The university also expects that data obtained from the CLA and NSSE will provide a baseline of student learning in comparison to other higher education institutions, thus decreasing the internal bias of limiting comparisons to USM students only.  Also, as mentioned, the institutional researcher will facilitate greater communication within and between departments as data are collected, analyzed, and reported.  The Process Improvement Committee will continue to meet, and action projects will be developed as needed based on this committee's ability to communicate and identify areas needing improvement across the institution that involve technology and access to information via technology.
