
AQIP Category Seven


MEASURING EFFECTIVENESS

Context for Analysis

7C1 Information collection, storage, and access

The University of Saint Mary (USM) has built a technology infrastructure that supports sharing information and ideas among administrators, faculty, and students. Access and compatibility are strengths of the system. In support of comprehensive assessment and general computing functions, the University converted in 2005 to Jenzabar EX, an integrated campus solution in an easy-to-use Windows environment. This system expands access to administrative information and supports sophisticated data analysis through its report writer, InfoMaker. InfoMaker reports provide specific information to facilitate recruitment, enrollment, retention, development, and financial analysis. Users receive training to construct reports that meet their analysis needs, and through the Report Portal they access data via the web. In addition to Jenzabar EX, USM uses the Jenzabar Internet Campus Solution, a web-based portal known as eSpire that contains the Learning Management System and Constituent Relationship modules. This system provides students and faculty 24/7 access to e-learning and course management. For security, this information is password protected and encrypted, and access is granted based upon user needs.

The Academic Assessment System for tracking achievement of the University Learning Outcomes (ULOs) is built into the Jenzabar data system (see 1P11, 1P13, 1R1). Information Services staff provide regular assistance in collecting data and distributing informative reports to faculty and the Assessment Committee. While the ULO database is centralized, program learning outcome data are not: each program collects and monitors the performance of its majors each year and formally reports on progress every five years through the Program Review process, which includes recommendations based on the data results (see 1R1-2).

Additionally, program directors collect data that are not managed or integrated in the Jenzabar system. These include surveys managed through the web-based Zoomerang software, with results downloaded and shared by email or on eSpire. Offices also collect and compile results pertinent to their own objectives and information needs. For example, Financial Aid, Human Resources, and Information Services maintain databases that use industry software with built-in analysis features unique to these areas. Some data reports produced at the office and program level stay at those levels for monitoring of operations. Summary results that provide trend analysis and information for ongoing improvement are used by supervisors, the Administrative Council (AC), and the University community, and are posted on the USM webpage for interested stakeholders.
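As a hypothetical illustration of this kind of decentralized reporting, the sketch below rolls a downloaded survey export into yearly averages that an office could share with supervisors or post as a summary. The file name and column names are assumptions for illustration, not USM's actual Zoomerang export format.

```python
# Illustrative sketch only: summarize a downloaded survey export into yearly
# trend figures. The file name and the "year"/"satisfaction" columns are
# hypothetical, not the actual survey schema.
import csv
from collections import defaultdict
from statistics import mean

def summarize_survey(path):
    """Return the mean satisfaction score per year from a CSV export."""
    scores_by_year = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            scores_by_year[row["year"]].append(float(row["satisfaction"]))
    return {year: round(mean(scores), 2)
            for year, scores in sorted(scores_by_year.items())}

if __name__ == "__main__":
    trend = summarize_survey("service_learning_survey.csv")  # hypothetical file
    for year, avg in trend.items():
        print(f"{year}: average satisfaction {avg}")
```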

7C2 Key institutional measures of effectiveness

USM's key institutional measures of effectiveness currently reflect the initiatives of the Strategic Plan (see Category 8). The AC is defining additional measures of effectiveness based on other indicators of institutional success. The AQIP Action Projects Data-based Decision Making Phases 1 & 2 have propelled USM's work in this regard. The Assessment Committee analyzed all of the reports generated and the indicators with which each was aligned; a review of the indicators then followed to focus measurement on what is meaningful. This process is currently underway for indicators not directly related to the Strategic Plan. Key measures of strategic initiatives include financial, recruitment, enrollment, and academic data. These indicators are depicted in Tables 7-1A through 7-1C.

TABLE 7-1A: FINANCIAL INDICATORS

Report | Review Frequency
Budget | Monthly and Quarterly
Annual Fund | Monthly
Investments | Quarterly
Cash Flow | Weekly
Fund Raising | Annually
Tuition Discount Rate | Annually
Core Expenses per FTE | Annually
Activity Based Costing | Annually
  a. Cost per program
  b. Cost per traditional student
  c. Cost per online student
  d. Cost per OPC nontraditional student
  e. Cost to recruit (Admissions and Marketing)
  f. Comparison to other IHEs

TABLE 7-1B: RECRUITMENT/ENROLLMENT

Report | Review Frequency
Admissions | Weekly
  a. Main Campus
  b. Overland Park Campus/Extended Sites
  c. Online
Enrollment | Quarterly
Retention | Semester
  a. First-time full-time freshmen
  b. Transfer students
  c. Athletes
Graduation Rates (4, 5, and 6 years) | Annually
Growth/Decline Rates & Faculty-Student Ratio | Annually

TABLE 7-1C: ACADEMIC and STUDENT SUPPORT

Report | Review Frequency
Number of students in each major by department | Semester
Faculty Teaching Loads | Semester
Faculty Evaluations | Semester
Course Evaluations | Semester
Course Enrollment | Semester
Assessment Reports | Semester
Number of students in practica/internships | Annually
Institutional Academic Support | Annually
Student Services Support | Annually
ACT Average | Annually
Faculty Status | Annually
  a. Tenured
  b. Untenured – tenure track
  c. Non-tenure track
Rank and Tenure | Annually
  a. Faculty Evaluations/R&T
  b. Assessment Reports
Academic Annual Reports/OGIs/ABCs | Annually
  a. Innovations & Improvements
  b. Accreditation criteria met
  c. Professional Development
  d. Activity Based Costing
5-Yr Program Reviews | Annually
  a. Alum Surveys
  b. Enrollment Trends
  c. Curricular Changes
  d. General Education
Criteria, Indicators Met | Annually
  a. AQIP Reports, Annual Reports
  b. NCATE, IACBE, CCNE


Processes

7P1 Select, manage, and use information and data

USM selects, manages, and uses data that measure key indicators aligned with the mission, ULOs, teaching and learning, strategic initiatives, and distinctive objectives. The strategic initiatives have key indicators, identified above, that are managed on a weekly, monthly, quarterly, semester, or annual basis. Category 1 describes the indicators used for assessing ULOs and teaching effectiveness, which are reported in faculty evaluations, annual reports, and program reviews. In collaboration with faculty, the Assessment Committee constructed the Learning Framework Matrix, which identifies data points for assessing ULOs (see Table 1-1 in 1C2). Faculty developed and use an evaluation rubric to generate student ratings for the ULO database. Information and data on service learning, USM's distinctive objective, are described in Category 2, which shows how that information relates to each of the AQIP criteria.
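The following sketch is illustrative only, not the Jenzabar implementation: it shows how rubric-based student ratings of the kind described above might be rolled up by ULO for a semester report. The record layout, course names, ULO labels, and the 1-4 rubric scale are assumptions.

```python
# Hypothetical roll-up of rubric-based ratings by University Learning Outcome.
# All records and labels below are invented for illustration.
from collections import defaultdict
from statistics import mean

ratings = [
    # (ULO, course, rubric score on an assumed 1-4 scale)
    ("Critical Thinking", "PH 301", 3.2),
    ("Critical Thinking", "BI 210", 2.8),
    ("Communication", "EN 102", 3.5),
]

def ulo_summary(records):
    """Return the mean rubric score and rating count for each ULO."""
    by_ulo = defaultdict(list)
    for ulo, _course, score in records:
        by_ulo[ulo].append(score)
    return {ulo: (round(mean(scores), 2), len(scores))
            for ulo, scores in by_ulo.items()}

for ulo, (avg, n) in ulo_summary(ratings).items():
    print(f"{ulo}: mean {avg} across {n} ratings")
```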

Using the OnGoing Improvement (OGI) process (see 8P1), faculty, staff, and administrators deliberate upon research-based information and identify relevant professional standards, e.g., teaching and learning methodologies or financial stability indicators. Applying these standards to the USM environment yields criteria for effectiveness measures, from which relevant and meaningful data points are defined. Information Services staff work with faculty and staff to determine appropriate tools for data collection, leading to either centralized or decentralized collection procedures. Data are generated during the OGI Action and Implementation phase, and results are analyzed during the Ongoing Evaluation phase, leading to improvement decisions.

7P2 Determining needs for data collection, storage, and accessibility

The needs for data collection, storage, and accessibility are determined by office staff and program faculty based upon a variety of factors, including strategic initiatives, program reviews (see 1P1), accrediting and regulatory bodies, Integrated Postsecondary Education Data System (IPEDS) and other governmental reporting, and internal policies and procedures. Office and program staff work with Information Services to identify data management systems. Customized data-entry screens populate fields defined by the Information Services staff, which facilitates data retrieval in the form of InfoMaker reports. For centralized information, InfoMaker reports are designed once, and program staff can request these reports at any time. Decentralized data management systems are designed by program or unit staff, often using Microsoft Excel and Access to collect, manage, and access data for decision making.

The AQIP Action Project for Data-based Decision Making spearheaded an analysis of USM's needs for data management. The project examined current information collection practices and the needs for monitoring organizational effectiveness using AQIP criteria. Based upon these findings, USM has focused its work on defining key indicators, streamlining processes for collecting and storing relevant data, and establishing reporting processes that allow access to information.

7P3 Determining needs and priorities for comparative data

There are three primary criteria for setting priorities for comparative data. First, tracking data over time enables performance comparisons and trend analysis: individual and cohort student learning achievements over time allow analysis of growth and of the impact of curriculum and instruction, while enrollment and budget data over time support assessment of progress on strategic initiatives. Second, comparing performance on key competitive indicators (e.g., admissions, financial aid, salaries) with peer institutions selected from Kansas Independent College Association (KICA) and Council of Independent Colleges (CIC) databases allows monitoring of strategic initiatives (e.g., fiscal stability) and performance expectations in relation to industry standards. Third, standing within the higher education industry, demonstrated through recognitions (e.g., Princeton Review, National Association of Intercollegiate Athletics [NAIA]), awards (e.g., Council for Higher Education Accreditation [CHEA], Service Learning), and accreditation reports, provides a benchmark for program quality.
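A minimal sketch of the first two comparison types follows: tracking an indicator over time and setting it against a peer-group median of the sort that could be drawn from KICA/CIC data. All values are invented for illustration.

```python
# Hypothetical trend and peer comparison for a single indicator (retention).
# Rates and peer values are made up; they are not USM or KICA/CIC figures.
from statistics import median

usm_retention = {2005: 0.68, 2006: 0.71, 2007: 0.73}
peer_retention = {
    2005: [0.70, 0.66, 0.72],
    2006: [0.71, 0.69, 0.74],
    2007: [0.72, 0.70, 0.75],
}

for year, rate in sorted(usm_retention.items()):
    prior = usm_retention.get(year - 1)
    change = f"{rate - prior:+.2f} vs prior year" if prior else "baseline"
    gap = rate - median(peer_retention[year])
    print(f"{year}: {rate:.2f} ({change}); {gap:+.2f} vs peer median")
```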

7P4 Institutional analysis of overall performance data and information

The Administrative Council analyzes data and information regarding overall institutional performance on a routine basis. Specific indicators are analyzed on a weekly, monthly, quarterly, semester, or annual basis, as shown in Tables 7-1A through 7-1C. Offices, departments, and committees submit data and information to vice presidents according to this schedule and governance procedures. The AC reviews and interprets results, trends, and conclusions and then shares summaries in Board of Trustees (BOT) reports and through University Assembly updates. The OGI process outlines how this analysis leads to improvement of programs and processes (see Category 8).

7P5 Aligning department and unit analysis with institutional student learning goals and overall objectives

The alignment of assessment criteria with curriculum goals, syllabi, and student learning outcomes is evident in annual reports and program review guidelines (see 1P1, 1P11). Outcomes data collection and analysis for ongoing improvement are directly linked with the mission, areas of investigation, ULOs, and strategic initiatives (see 1R for trend data for Learning Communities [LC] and Idea Seminars [IS]). Administrators revise guidelines for annual and program growth reports as new initiatives are prioritized during strategic planning processes; for example, recent additions included program improvement efforts related to allied health and to attracting majors. Departmental members address these issues as they produce annual reports summarizing results. The VPs then summarize departmental findings in division reports and provide these to the President, who integrates the results into her BOT reports. The VPs and President share overall patterns and subsequent actions with the community at University Assemblies.

7P6 Ensuring effectiveness of information systems and related processes

A life cycle management system ensures the reliability of USM hardware and software. Most of the computer equipment on the USM network has a five-year life cycle. Annually, Information Services staff review the systems that are at the end of their life cycle, identify specifications, and project the system configuration that will provide satisfactory performance over the next five years. Information Services requests a budget to replace 20% of USM servers, switches, network printers, firewalls, routers, etc. every year. Software vendors notify Information Services and the various module experts (end users) of patches and releases as they become available. During the annual budget cycle, Information Services identifies current licenses, projects any new license requirements, and then coordinates software upgrades with the affected departments throughout the University. USM also conducts a user group meeting, most recently known as the Six Term Committee, to review processes. Through discussions with module experts from various departments, Information Services gains an understanding of how the integrated system affects other departments and which changes can be made to improve system performance for end users.
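As a hypothetical sketch of the annual life-cycle review described above, the following flags assets past a five-year life and totals a replacement budget request. Asset names, ages, and costs are invented for illustration.

```python
# Illustrative life-cycle review: flag assets at or past five years of age
# and estimate the annual replacement budget. Inventory data is made up.
LIFE_CYCLE_YEARS = 5

inventory = [
    {"asset": "server-01", "age": 5, "replacement_cost": 4000},
    {"asset": "switch-07", "age": 3, "replacement_cost": 1200},
    {"asset": "printer-12", "age": 6, "replacement_cost": 800},
]

due = [a for a in inventory if a["age"] >= LIFE_CYCLE_YEARS]
budget_request = sum(a["replacement_cost"] for a in due)

print(f"{len(due)} of {len(inventory)} assets due for replacement "
      f"(~{len(due) / len(inventory):.0%} of inventory)")
print(f"Requested replacement budget: ${budget_request:,}")
```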

USM has trained module experts for all of its administrative modules. These experts, from the business office, financial aid, registrar, etc., identify any data integrity problems, and Information Services works with them to restore data integrity as needed. To assist in these matters, Information Services runs a backup process every night and makes an additional shadow copy of critical databases. With the installation of a new server in the fall of 2008, the University will increase its online archived databases (shadow copies) from one day to five or more days of archives.

Information security and system integrity are maintained through the use of firewalls, antivirus, and anti-spam systems at the perimeter of the USM network, antivirus software on end-user computers, encrypted web pages for data collection, and passwords. Users are assigned to groups that permit different levels of network access; for instance, student groups have access only to web-based systems, while some faculty and staff groups have access to administrative systems. The system administrator controls access by granting edit or view-only rights to each user based on their role/position.
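The sketch below is illustrative only, not USM's actual configuration: it represents group-based access in which students reach only the web-based systems and staff groups receive edit or view-only rights to administrative modules. The group and system names are assumptions.

```python
# Hypothetical group-to-access mapping; names are invented for illustration.
ACCESS_BY_GROUP = {
    "students": {"eSpire": "view"},
    "registrar_staff": {"eSpire": "view", "registration_module": "edit"},
    "faculty": {"eSpire": "edit", "registration_module": "view"},
}

def can_edit(group, system):
    """True only if the group's assigned level for the system is 'edit'."""
    return ACCESS_BY_GROUP.get(group, {}).get(system) == "edit"

assert can_edit("registrar_staff", "registration_module")
assert not can_edit("students", "registration_module")
```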

7P7 Measuring system effectiveness

The Information Services work order database, along with feedback from various users, indicates the effectiveness of systems and processes. The work order database records incidents related to the various systems, and all open work orders are reviewed every other week. Working closely with users allows Information Services to understand needs and provide reliable, supportable solutions. Feedback from the Assessment Committee led to the development and implementation of a system to collect data for the University Learning Outcomes. Feedback from the Six Term Committee has helped to identify several processes to improve, including the Admissions-to-Registration data transfer process, the online application process, and the process for reporting Admissions recruiting information.
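A small sketch, with made-up records, of the kind of work-order reporting described here: listing the orders still open for the biweekly review and comparing completed counts against the same quarter a year earlier.

```python
# Hypothetical work-order summary; records and the prior-year count are invented.
from datetime import date

work_orders = [
    {"id": 101, "opened": date(2008, 7, 2), "closed": date(2008, 7, 9)},
    {"id": 102, "opened": date(2008, 7, 15), "closed": None},
    {"id": 103, "opened": date(2008, 8, 1), "closed": date(2008, 8, 20)},
]

open_orders = [w["id"] for w in work_orders if w["closed"] is None]
completed_q3_2008 = sum(1 for w in work_orders
                        if w["closed"] and w["closed"].month in (7, 8, 9))
completed_q3_2007 = 2  # illustrative prior-year count

change = (completed_q3_2008 - completed_q3_2007) / completed_q3_2007
print(f"Open work orders for biweekly review: {open_orders}")
print(f"Q3 completions vs prior year: {change:+.0%}")
```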

Results

7R1 Evidence for system effectiveness in meeting institutional needs

The evidence for the system's effectiveness in meeting institutional needs is whether reports can be configured and accessed in a timely manner to provide information directly related to targeted objectives. This occurs, as demonstrated, for example, through USM's strategic planning process and in the development and evaluation of AQIP Action Projects. To conduct internal scans, staff received and used data reports to identify patterns and target need areas; to evaluate progress, administrators receive and monitor reports according to schedule.

Another example of monitoring the system's effectiveness occurred during the 2007-08 school year, when Information Services had a growing number of work orders. User feedback to the Administrative Council led to the hiring of another system administrator and to moving a part-time employee to full time for 2008-09. As a result of this decision, the technology staff completed 32% more work orders in the third quarter of 2008 than in the third quarter of 2007 (see Figure 7-1). User feedback is critical to the ongoing improvement process.

Figure 7-1: Completed Work Orders by Month



Evidence also indicates effectiveness of the system for measuring student learning, institutional operations, collaborative partnerships, and strategic initiatives. Highlights of the evidence follow in Table 7-2.

Table 7-2: Evidence of System Effectiveness Highlights

Student Learning
•  Program and University databases collect, store, and analyze student achievement on outcomes; updated each semester; flexible reports for faculty inquiry
•  Jenzabar reports run every semester for Dean's List and probationary students; ratio calculated by semester and over time
•  Faculty-student ratios calculated by semester and over time

Institutional Operations
•  ABC report design completed and annual reports tracked; core expenses per FTE
•  Weekly Jenzabar admissions reports run at the department level provide status updates; results direct weekly objectives
•  Financial report design allows cash flow to be run weekly and full budget reports quarterly; short-term and long-term monitoring needs are built in; report criteria aligned with the annual comparative analysis done by KICA/CIC
•  Security and safety reports integrated with a national database for comparative analysis

Collaborative Partnerships
•  Service Learning survey results compiled in the office, with summary results presented to faculty and the VPAA
•  Systematic reporting of goals and results to the Kansas Compact and AmeriCorps
•  Volunteers tracked and recorded in the Development office and reported in annual reports
•  Level of giving tracked through Jenzabar reports and compared with annual goals

Strategic Initiatives
•  Metrics determined during the planning process; data needs identified; reporting system identified and constructed; quarterly reports used by the AC and submitted to the BOT


As part of the institutional OGI process (see 1P2 and 8P1), staff and faculty provide quantitative and qualitative results through database reports, survey results, and annual reports. With the exception of Jenzabar reports, all are generated by the primary user, making accessibility and relevance high. During the OnGoing Evaluation (OGE) step of the process, those responsible for implementing the objectives have access to the information needed to ensure informed decision making. Checks and balances are in place through the governance structure, where supervisors, the AC, and the BOT review and monitor this information.

7R2 Comparing Results with Others

USM's system for measuring effectiveness is compared with higher education standards defined by accreditation bodies. USM results compare favorably, as evidenced by education, nursing, and business accreditation, and in some cases are rated as exceptional. USM was one of five institutions to receive the 2007 CHEA Award for Institutional Progress in Student Learning Outcomes, which evaluated the University's articulation of and evidence for outcomes, success with regard to outcomes, information to the public about outcomes, and use of outcomes for improvement. Also, in 2005 the USM education department's Assessment System was nationally recognized by the National Council for Accreditation of Teacher Education (NCATE) in its Spotlight on Schools of Education as being on the leading edge of practice. Based on a review of 58 programs seeking new or continuing accreditation, USM was one of six programs selected for success in addressing Standard 1: Candidate Knowledge, Skills, and Dispositions and Standard 2: Assessment System and Unit Evaluation. These two standards speak to the program's effectiveness in implementing a systematic approach to assessment in which student and program outcomes are monitored, evaluated, and used for ongoing improvement.

Improvement

7I1 Improving processes and systems for measuring effectiveness

The approach USM uses to improve its processes and systems for measuring effectiveness follows the OGI model. As program, committee, and office staff analyze the results of target objectives, the measurement system is an integrated component of the analysis. When recommendations emerge to make the system more effective and efficient in gathering and reporting relevant information, requests for improvements are developed in conjunction with Information Services staff and then forwarded to the next level of decision-making authority. Issues are discussed at the AC level to determine whether the problems are shared, which may alter the approach to problem solving, and requests are prioritized by the AC.

7I2 Setting and communicating targets for improvement

Targets for improvement are set based upon three criteria for measuring effectiveness. Current initiatives for each are described below.

1. Efficiency in accessing information, running reports, and representing trend data
•  System design has been found to be efficient, as evidenced by reporting on strategic initiatives and the ULO database
•  Using the system design, the AC and Assessment Committee continue work to include all USM priorities (e.g., service learning, valuing people) and program-level outcomes
•  Coordination among centralized and decentralized databases is being refined to include a web-based system so that findings are centrally located
•  Based on feedback from staff and the Assessment Committee, examination of how the traditional responsibilities of an institutional officer can be assumed is underway, e.g., InfoMaker reports, personnel

2. Reliability and validity of data
•  The Assessment Committee is studying results of the 2007-08 ULO database field test of rubric score data; necessary modifications and training will occur in 2008-09 to address reliability and validity issues
•  The AQIP Team and Assessment Committee have identified the need for a coordinating system for ongoing improvement projects; the Faculty Chair is currently leading a committee restructuring project that can address this function

3. Communication of results to make improvements
•  Effective methods used by some programs include communicating student learning outcome results on the USM webpage and highlighting accomplishments in the alumni magazine, Aspire
•  Expansion of the webpage to systematically report and communicate OGI results and improvement goals is planned
•  The Assessment Committee and AQIP Team have recommended a more systematic approach to “closing the loop” within the OGI process so that conclusions are made public and appropriate personnel and committees can use them more efficiently to plan the needed next steps, e.g., OGI goals, AQIP Action Projects, and updated policies and procedures