State Assessment Policy Analysis


Alabama

Contact

William O. Blow, Deputy Executive Director

Alabama Commission on Higher Education
P.O. Box 302000
Montgomery, AL 36130-2000
334-242-1998
FAX: 334-242-0268
achxh01@asnmail.asc.edu

 
State Agency: Alabama Commission on Higher Education
Original Initiative: Statewide Policy on Institutional Effectiveness and Assessment of Student Learning Outcomes
Year: 1988

Policy Analysis

Policy Type

In 1988, the Alabama Commission on Higher Education developed two statewide assessment policies. One policy concerned Institutional Effectiveness and Assessment of Student Learning Outcomes; the other policy addressed Programmatic Accreditation. In 1990, the State Commission adopted additional policies relating to academic program review. Guidelines for this policy were set in 1991, and amended in 1995. It is not clear from the state documents what prompted Alabama to establish or amend these policies. Most recently, in 1996, the Alabama Legislature passed a resolution creating a Higher Education Funding Advisory Commission, charged "to develop a proposed new funding approach for higher education that is performance-based and uses other incentive funding approaches." This move toward performance funding stemmed from a stated desire on the part of the Legislature to support "initiatives by institutions of higher education to continue its efforts to provide quality and educational efficiencies." Also in 1996, the legislature passed Act 96-557, which incorporated the Policy on the Identification of Programs for Review (described below) into law.

The initial policy on Institutional Effectiveness and Assessment of Student Learning Outcomes was designed to enable institutions "to demonstrate accountability to the state for its considerable investment in higher education." Clearly, then, this assessment policy can be characterized as addressing accountability. Under this policy, every public institution (two- and four-year) in Alabama was required to submit a "description of an outcomes assessment program and the results of such assessments" to the Commission. The other policy from 1988, dealing with Programmatic Accreditation, sought "to provide assurance that programs of study in the public institutions of higher education in Alabama meet established standards of quality." Again, this policy's own wording makes it clear that it was a quality assurance type of policy. Under this policy, each public institution was required to "seek accreditation for all programs of study eligible for specialized accreditation."

Policy Stage

Although the policy on Institutional Effectiveness and Assessment of Student Learning Outcomes "is still in effect, we [Alabama Commission] no longer receive annual planning statements from the institutions; thus, we do not know what is being done in the assessment area. This policy was really just the Commission's way of encouraging institutions to join the 'assessment movement.'" Because the state commission is no longer receiving feedback from the institutions, this policy may or may not be currently implemented by the institutions, and there is no mechanism for evaluation in place.

The policy on Programmatic Accreditation, however, was retained. The State Council of Chief Academic Officers was included in discussions concerning this policy. The policy, renamed the Policy on the Identification of Programs for Review, attempted "to identify and review programs in public institutions which do not meet minimum degree completion standards based on a five-year average." Alabama underwent one five-year identification and review cycle. During this cycle, "777 programs in the senior [four-year] institutions and 627 programs in the two-year [institutions] were identified for review." Of the 777 programs reviewed in four-year institutions, 300 were slated either for alteration or termination. Of the 627 programs reviewed in two-year institutions, 250 were slated either for alteration or termination. This policy was codified into law with the passage of Act 96-557 in 1996. It should be noted that this policy relates to degree productivity, not accreditation (in the regional association sense of the word).

State Guidelines

Initially, in July 1988, each institution was required to describe its progress on implementing assessment programs; the results of the programs were then to be reported in each planning cycle as part of the Annual Planning Statement. Plans remain voluntary and are developed by the institutions themselves. Measures of assessment are expected to "reflect the instructional goals of the institution, provide positive incentives for change, and include multiple indicators of educational outcomes. Results of the assessments should become part of the planning process and be reflected in the Institutional Annual Planning Statements submitted to the committee." (February 1988 Policy on Institutional Effectiveness and Assessment of Student Learning Outcomes) This annual statement is no longer required.

Programs/Positions

In April 1996, Senate Joint Resolution 32 created the Higher Education Funding Advisory Commission to develop a new funding approach for higher education. A move toward performance-based/incentive funding might result in other programmatic or positional developments.

Indicators and Outcomes

Specific outcomes have not yet been identified or mandated. From the 1988 Policy Statement: "The policy will promote the attainment of the Goal of Excellence by providing high quality programs of instruction, and the Goal of Responsiveness by providing programs which contribute to the intellectual, ethical, and social development of individuals.... Measures of assessment should reflect the instructional goals of the institution, provide positive incentives for change, and include multiple indicators of educational outcomes."

Instruments

None currently evident, but the state is in the process of establishing common instruments. The CHE is working with the Institutional Advisory Committee on this matter. The state is also exploring the implementation of statewide rising junior examinations and a statewide testing program for college seniors in their last term of study.

Teaching-Learning Elements

Evidence of teaching and learning elements is limited. The policy states only that "measures of assessment should reflect the instructional goals of the institution, provide positive incentives for change, and include multiple indicators of educational outcomes."

Public Reporting

Voluntary

Database

A bill passed by the state legislature in 1996 requires the CHE to establish a comprehensive multi-institutional database for students and faculty. This database is currently under development.

Budget

Per NCHEMS Report: "No new or distinct funding was attached to the assessment policy. There are no state funds for assessment purposes and each institution funds such activities through regular appropriations."

Regional Accreditation Association

Southern Association of Colleges and Schools

Regional Accreditation Association Relationship

Not evident

Disciplinary Accreditation

The Policy on Programmatic Accreditation (1988) was implemented to compile an inventory of instructional programs for which specialized accreditation is available. In general, the policy advocates that institutions seek accreditation for all programs of study eligible for specialized accreditation.

Technology Focus

Not evident


Alaska

Contact

Nanne Myers, Assistant Vice President of Academic Affairs
Systemwide Academic Affairs

University of Alaska System
PO Box 755400
Fairbanks, AK 99775-5400
907-474-6302
FAX: 907-474-7570
sypvst@orca.alaska.edu

 
State Agency: Board of Regents and Commission on Postsecondary Education
Original Initiative: Board of Regents' Educational Effectiveness Policy
Year: 1996

Policy Analysis

Policy Context and Type

Partly in response to the action of the Northwest Association of Schools and Colleges, the chief academic officers of Alaska's three universities proposed, and the Board of Regents enacted, a policy on "Educational Effectiveness." Prior to this, there had been no such document with systemwide or statewide impact. The policy's stated purpose is "[T]o improve the effectiveness of its educational programs and the fulfillment of its mission and objectives..." This wording suggests a focus on quality assurance in the Regents' assessment policy.

Policy Stage

The system has completed its first annual report on its assessment activities. It is not clear what, if any, evaluation will be done as part of the overall process.

State Guidelines

Each Major Academic Unit (MAU) is expected to "regularly undertake studies of the impact of its academic programs on its students and graduates." Universities are expected to describe achievements expected of their students and adopt reliable procedures for assessing those achievements. Assessment practices will be coordinated among MAUs. An annual report on the implementation and results of assessment practices will be provided to the Board of Regents.

Programs/Positions

The system academic office has provided funding for faculty to attend the AAHE Assessment Forum meetings, as well as to bring several speakers to Alaska.

Indicators and Outcomes

The focus is the "impact of academic programs on students and graduates," but no measures, indicators, or desired outcomes are described.

Instruments

None evident

Teaching-Learning Elements

The only mention of teaching and learning elements is the Regents' policy statement that institutions "will describe achievements expected of their students and adopt reliable procedures for assessing those achievements."

Public Reporting

Annual report

Database

Comprehensive statewide database exists at the SHEEO level, containing student records from two-year and four-year institutions.

Budget

Funding for assessment activities is included in the 1996 budget request.

Regional Accreditation Association

Northwest Association of Schools and Colleges

Regional Accreditation Association Relationship

The recent NWASC assessment requirement for self-study is mentioned as an influence on the development of the Educational Effectiveness Policy.

Disciplinary Accreditation

No relationship mentioned

Technology Focus

None evident

Issues of Note

ECS 1987 survey results indicated that Alaska was considering a "rising junior" exam in reading, writing, and mathematics, and that through surveys the state collects information on four populations: college graduates one year and again five years after leaving the institution, students who drop out after attending a state institution full-time for one year, and students in preparatory and vocational institutions. The surveys provide information on job satisfaction, job placement, and salary.

An ECS 1990 report suggests interest in assessment on the part of the Alaska Commission on Postsecondary Education in the spring of 1990, with the process used as a means of seeking accountability and effectiveness. The BOT is also indicated to have incorporated a model of system assessment into its program evaluation process, reviewed in 1990. This report suggests assessment is taking place in some contexts.


Arizona

Contact

Jonathan E. Keller, Policy Analyst

Arizona Board of Regents
2020 N. Central Ave., Suite 230
Phoenix, AZ 85004-4593
602-229-2527
FAX: 602-229-2555
ASJEK@ASUVM.INRE.ASU.EDU

 
State Agency: Arizona Board of Regents
Original Initiative: Regents Mandate
Year: 1986

Policy Analysis

Policy Context and Type

Assessment in Arizona began in 1986, when the Arizona Board of Regents (ABOR) started issuing an "annual report on the academic performance of the preceding year's high school graduating class in mathematics and English courses at an Arizona university." (Arzberger and Cothran, 5/95) This "report card" is distributed to all school district superintendents, high school principals and counselors. This is a policy designed for accountability.

Assessment continued with the findings of a 1992 committee which conducted a Faculty Teaching Load Study and a Faculty Workload Study. In response to these findings, in 1993, the ABOR and the universities agreed on "a set of items and outcomes for which the universities would propose detailed measures and measurable goals." These goals and measures are designed to link faculty teaching effort to the quality of undergraduate education. The state's universities made their first report to the ABOR on these goals and measures in 1994. It is not clear from the state documents whether this reporting has continued, and if so, at what intervals. This policy has a strong focus on quality assurance.

In 1993, the state legislature passed the Budget Reform Act, which introduced a performance funding system for all state programs, including higher education. It is not clear from the state documents exactly how this Act has impacted higher education appropriations.

State Guidelines

The state's Budget Reform Act calls for public universities to follow a four-step process: (1) the purpose of each program must be stated in clear language; (2) goals and objectives must be identified for the next three years; (3) performance measures must be developed that measure the desired results/outcomes to be achieved; (4) outcome or quality measurements should identify the impact each program has on the goal or purpose for which it strives.

Programs/Positions

None evident

Indicators and Outcomes

Institutions were asked to develop measures and goals for the following items: (1) class availability; (2) adequacy of advising; (3) instructional technology; (4) lower-division courses taught by ranked faculty; (5) competitively educated graduates; (6) student contact with ranked faculty; and (7) research-related activities of students. The indicators used by institutions to measure these items varied.
There are two outcomes: (1) student persistence and graduation rates; and (2) length of time and credits taken for degree completion. The indicators used to measure these outcomes are common to all universities. For Outcome 1: percent of (a) full-time freshmen returning for a second year; (b) full-time freshmen graduating in six years; (c) full-time lower-division transfers graduating in five years; and (d) full-time upper-division transfers graduating in four years. For Outcome 2: (a) average number of years taken by all freshmen to complete a baccalaureate degree program; (b) average number of hours earned by baccalaureate recipients who entered as freshmen; (c) average number of hours earned by baccalaureate recipients who entered as transfers; and (d) percentage of seniors with more than 160 hours.

Instruments

Vary by institution.

Teaching-Learning Elements

None evident

Public Reporting

Annual

Database

Limited multi-institutional databases exist.

Budget

Not evident

Regional Accreditation Association

North Central Association of Colleges and Schools

Regional Accreditation Association Relationship

Not evident

Disciplinary Accreditation

Not evident

Technology Focus

The ABOR recognizes the importance of technology not only for assessment but also for access. Arizona's technological initiatives include distance learning.

 


Arkansas

Contact

John Spraggins, Deputy Director, Academic Programs

Department of Higher Education
114 East Fifth Street
Little Rock, AR 72201
501-324-9300
FAX: 501-324-9308

 
Receipt of information pending.

 

 


California

Contact

Charles A. Ratliff, Deputy Director

California Postsecondary Education Commission
1303 J. Street #500
Sacramento, CA 95814-2983
916-445-1000
FAX: 916-327-4417
pedgert@cpec.ca.gov

 
State Agency: California Postsecondary Education Commission
Original Initiative: Higher Education Assessment Act
Year: 1990

 

Policy Analysis

Policy Context

Assessment in California essentially began in 1990, when the state Legislature passed and the Governor signed Assembly Bill 1808, the Higher Education Assessment Act. The bill required the California Postsecondary Education Commission (CPEC) to compile an annual report profiling the progress public institutions were making toward meeting certain performance indicators. These indicators were established by CPEC and the state legislature. (Indicators are described below).

Policy Type

Bill 1808 states clearly the intent of the legislation: "[D]emonstrable improvements in student knowledge, capacities, and skills between entrance and graduation be publicly announced and available, and that these improvements be achieved efficiently through the effective use of student and instructional resources of time, effort, and money." This intent reflects a dual focus on accountability and quality assurance.

It should be noted that the most recent annual report acknowledges that "[T]he breadth and complexity of California public higher education make the development of measures of performance that are comparable across systems very challenging."

Policy Stage

The 1996 annual report is the third report issued. The performance indicators used in the 1996 report are the same as those used in 1995. While the original statute requires CPEC to review each set of reports and make recommendations concerning the format of future reports, the results of these reviews and recommendations are not clear from the state documents.

State Guidelines

All elements of public higher education in the state--the University of California system, the California State University system, and the California Community Colleges--are required to submit an annual performance report to the Legislature, CPEC, and the state Department of Finance. CPEC is then required to review the reports and make recommendations to the Legislature and Governor about "consolidating or eliminating existing reporting requirements..."

Programs/Positions

None evident

Indicators and Outcomes

Broadly speaking, CPEC has identified five general criteria considered "pertinent to the performance of higher education in California." There are numerous, more specific indicators measured under each broad criterion. The five general criteria are (1) population context; (2) fiscal context; (3) student preparation for college; (4) student access to college; and (5) student experiences in college.

Instruments

None evident

Teaching-Learning Elements

The intent of the Assessment Act is demonstrable improvement of student knowledge, capacities, and skills, with recognition that these ends are achieved via instruction.

Public Reporting

By system

Database

Comprehensive statewide database exists at the SHEEO level, containing records from four-year and two-year public institutions.

Budget

"The California Constitution requires the state to reimburse local agencies and school districts for certain costs mandated by the state." Claimed costs up to $1 million are reimbursed by a State Mandates Claims Fund.

Regional Accreditation Association

Western Association of Schools and Colleges

Regional Accreditation Association Relationship

Not evident

Disciplinary Accreditation

Not evident

Technology Focus

Not evident

Issues of Note

An April 1995 report entitled "The Challenge of the Century: Planning for Record Student Enrollment and Improved Outcomes in California Postsecondary Education" addresses the future vis-a-vis four issues: (1) financing collegiate opportunity or limiting student aid; (2) creating equitable state policies for student fees and financial aid; (3) increasing educational productivity and efficiency; and (4) improving regional and statewide cooperation and collaboration.

 


Colorado

Contact

Jim Sulton, Senior Academic Officer

Colorado Commission on Higher Education
1300 Broadway, 2nd Floor
Denver, CO 80203
303-866-2723
FAX: 303-860-9750

 
State Agency: Commission on Higher Education
Original Initiative: HB 1187
Year: 1985

Policy Analysis

Policy Context

Colorado's assessment activities began with the passage of House Bill 1187 in 1985. This law, called the Higher Education Accountability Program Act, required that institutions "be held accountable for demonstrable improvements in student knowledge, capacities, and skills between entrance and graduation" and that "these demonstrable improvements be publicly announced and available." (HB 1187) Responsibility for enforcing this law was given to the Colorado Commission on Higher Education (CCHE). This law also featured an appropriations component: the CCHE was authorized by HB 1187 to withhold up to two (2) percent of an institution's annual appropriation if that institution failed "to implement any part of the higher education accountability program or fails to comply with the policies and standards of the commission in regard to this program." (HB 1187)

Policy Type

In operationalizing HB 1187, the CCHE developed a policy and general procedures for the establishment of accountability programs. In this policy, the Commission clearly states: "The overall purpose of the accountability program is to ensure the public that Colorado's state-supported institutions are accountable for providing quality education efficiently through the effective use of institutional resources of time, effort, and money." (CCHE policy, 2/11/88) This wording reflects a dual focus on accountability and quality assurance.

Policy Stage

Colorado's policies changed, however, in 1996. During that year, the Legislature replaced the accountability statute with the "Higher Education Quality Assurance Act." This new act instructed the CCHE and the system's governing boards "to develop a quality indicator system to obtain information for measuring, on systemwide and institutional levels, institutional performance, student satisfaction and success, employer satisfaction, and systemwide performance." (HB 1219) Based on how well institutions meet the levels of performance set by the indicators, HB 1219 authorizes the CCHE "to consider the governing boards' and institutions' performance on the statewide expectations and goals in making its funding recommendations and allocating funds to the governing boards." (HB 1219) Colorado is presently in the process of developing a list of performance indicators for use in operationalizing this legislation.

State Guidelines

CCHE and the governing boards will gather "the necessary information from the institutions and from students, graduates, and employers either by request or through development or implementation of surveys." CCHE is required to submit "an annual report of the information obtained through the quality indicator system." Governing boards and institutions are expected to use this information to improve the quality of education they offer.

Programs/Positions

The Colorado Commission for Achievement in Education, consisting of the CCHE, the Governor, and various members of the legislature, has been created as the group to which the Commission will report under the Higher Education Quality Assurance Act. By June, 1997, a list of performance indicators will be chosen. During the 1997 legislative session, a bill was introduced to eliminate the Colorado Commission for Achievement in Education.

Indicators and Outcomes

The recommended list of performance indicators consists of (1) percentage of graduates obtaining employment and/or engaging in further study and the pass rates of graduates on relevant professional examinations; (2) graduation, persistence, and transfer rates; (3) percentage of students who believe their instructional program met their goals; (4) existence and operation of a formal, comprehensive, and effective academic advising and career advising system; (5) employer satisfaction with the preparation of graduates; (6) general fund and tuition per FTE student in Colorado compared to other states, and instruction and academic support expenses as a percentage of educational and general expenses in each institution; (7) existence and implementation of a formal, comprehensive, and effective plan for appropriately integrating educational technology into the curriculum; (8) existence and operation of a formal, comprehensive, and effective institutional assessment and accountability plan; (9) provision of assistance to elementary and secondary education in achieving systemic reform and creation of appropriate linkages between elementary and secondary education and higher education.

Instruments

Vary by institution

Teaching-Learning Elements

The most recent funding bill for higher education allowed for the appropriation of additional funds to governing boards for specific policy objectives. Among these were objectives that addressed the level of student-faculty contact and the amount of time faculty spend teaching students. However, "[T]he legislature is probably not going to fund the policy areas because last year's funding was allocated inappropriately at some governing boards (in the legislature's opinion)."

Public Reporting

HB 1219 requires CCHE to "publish an annual consumer guide to state institutions of higher education for students and their families." The first such guide was published last year.

Database

Comprehensive statewide database exists at SHEEO level, containing student records from four-year and two-year public institutions and some independent nonprofit colleges.

Budget

The state has not determined what percentage of appropriations will be linked to performance indicators. Nor is it clear from HB 96-1088 how much additional money the General Assembly has appropriated for special policy areas.

Regional Accreditation Association

North Central Association of Colleges and Schools

Regional Accreditation Association Relationship

Not evident

Disciplinary Accreditation

Not evident

Technology Focus

One of the special policy areas for which the General Assembly may appropriate additional funds is the use of technology to lower costs and improve the quality and delivery of education. This includes distance learning initiatives and the integration of technology into the curriculum.

 


Connecticut

Contact

Donald H. Winnandy, Senior Associate/Chief Academic Officer

Department of Higher Education
61 Woodland Street
Hartford, CT 06105-2391
203-566-2325
FAX: 203-566-7865
derocco@apollo.commnet.edu

 
State Agency: Board of Governors for Higher Education
Original Initiative: Strategic Plan
Year: 1988

Policy Analysis

Policy Context

Statewide assessment in Connecticut began in 1988, when the Board of Governors (BOG), in their Strategic Plan for Higher Education, asked public colleges and universities to assess institutional effectiveness and report their progress to the Board. Interestingly, independent colleges and universities were invited to participate in the process. In 1993, the state legislature mandated institutional assessment, and further required that the results of assessment activities be reported biennially to the Commissioner of Higher Education and to the appropriate committees in the legislature. In 1996, the Board of Governors issued guidelines for the submission of these biennial assessment reports.

Policy Type

The 1988 BOG policy declared clearly that "the overall objective [of assessment] is to enhance the quality of instruction and student performance." This reflects a focus on quality assurance. The 1993 legislation was designed to ensure that "each public institution of higher education implements a process of institutional assessment and continuous improvement based on goals, objectives, and measurable outcomes..." This law seems to be the state's attempt to provide some legislative force behind the existing BOG policy.

State Guidelines

Each institution is required to submit an assessment report biennially. These reports are read and reviewed by the Peer Review Committee and the State Department of Higher Education. The Commissioner then forwards the reports to the Education Committee of the General Assembly. It is not clear what action is taken by the Department of Higher Education and/or the state.

Programs/Positions

A peer review committee, appointed by the Commissioner of Higher Education, assists in the review of institutional assessment plans.

Indicators and Outcomes

The 1996 guidelines recommend that each of these areas be addressed in the biennial reports: overview; general education; academic programs/major; basic skills testing, placement, and remediation; admission rates, retention rates, minority enrollment, enrollment of persons with disabilities, student financial aid, student transfer and articulation; student performance, attainment, and development; follow-up on graduates; faculty and administrative productivity; adequacy of core academic, student, and library services and facilities.

Instruments

Vary by institution.

Teaching-Learning Elements

As part of their biennial reports, institutions are asked to provide goals and objectives for student performance, attainment, and development; the methods they will use to assess these objectives; and information on how this assessment is used to promote improvement. The intent of the original policy (1989) was the improvement of student performance and instruction.

Public Reporting

Biennial

Database

Comprehensive statewide database exists at the SHEEO level, containing student records from four-year and two-year public institutions.

Budget

Not evident

Regional Accreditation Association

New England Association of Schools and Colleges

Regional Accreditation Association Relationship

Not evident

Disciplinary Accreditation

Not evident

Technology Focus

Not evident

 


Delaware

Contact

Marilyn B. Quinn, Executive Director

Delaware Higher Education Commission
820 French Street, 4th Floor
Wilmington, DE 19801
302-577-3240
FAX: 302-577-3862
mquinn@ois.state.de.us

 

No initiatives at the state or system level.

 

 


Florida

Contact

Patrick H. Dallet, Assistant Executive Director

Postsecondary Education Planning Commission
Florida Education Center
Tallahassee, FL 32399
904-488-7894
FAX: 904-922-5388
dalletp@mail.firn.edu

 
State Agency: Postsecondary Education Planning Commission
Original Initiative: CLAST (College-level Academic Skills Test)
Year: 1982

Policy Analysis

Policy Context

In terms of testing, assessment in Florida began in 1982, when the state initiated the requirement that students take the CLAST (College-level Academic Skills Test). The CLAST was implemented "to ensure that college students graduating with an associate of arts degree or entering the upper division at a state university possess basic or essential skills deemed necessary for success...CLAST was further intended to serve as both a summative evaluation instrument prior to student enrollment in upper-division programs and as a source of information for student advisors." (1989 Report on Assessment of the General Education Curriculum) In 1983, the state initiated the use of common entry level placement tests for incoming freshmen in both two-year and four-year institutions. The assessment movement continued with a 1987 report entitled "Enhancing Undergraduate Education in the State University System of Florida," compiled by the System itself. This report contained numerous recommendations in a number of areas, including assessment. In 1988, the State Board of Community Colleges conducted program reviews of associate of arts degree programs at those institutions, attempting to assess institutions' effectiveness in meeting the general education needs of students. Assessment of the curriculum became a mandated activity in 1988, with the General Appropriations Act, in which the state legislature charged the Postsecondary Education Planning Commission (PEPC) to "undertake an assessment of the general education curriculum in Florida's public community colleges and state universities..."

In 1991, the state legislature created accountability reporting requirements for Florida's public institutions. These requirements were established, at least in part, to respond to "a perceived concern that the public did not have adequate and appropriate information about how colleges and universities function." (Accountability Review: Progress Report, 1994) More recently, the 1994 General Appropriations Act directed the PEPC to "review and evaluate the accountability plans in public postsecondary institutions as they relate to the mission and goals of each system and its respective institutions, as well as the goals articulated by the Legislature." (Ibid.) These goals fall primarily into three areas: access/diversity, quality of undergraduate education, and productivity. Interest in linking accountability plans with state goals stems from the belief that "existing legislation and institutional responses did not sufficiently embody the kinds of characteristics that would lead to improved management at the local level and provide for systematic, ongoing assessment." (Ibid.) Interestingly, independent institutions in Florida are also covered by this policy.

Policy Type

In its 1994 report, Accountability in Florida's Postsecondary Education System, PEPC stated that assessment has two purposes: to foster improvement at the institutional level (the primary purpose), and to provide information to state-level policymakers (the secondary purpose). Thus, Florida's policy is one of both quality assurance and accountability.

Policy Stage

In 1995, PEPC conducted a review of its process for reviewing institutions' accountability plans. Recommendations were made on how to improve the process. Following this evaluation, implementation of the policy has continued. In addition, the CLAST has been revisited and evaluated on numerous occasions since its implementation, resulting in an evolution of the exam since 1984.

State Guidelines

"The annual accountability report shall include goals and measurable objectives related to the system-wide strategic plan...the report must include, at a minimum, system-wide performance targets, measures, and data related to the following issues: (1) undergraduate teaching productivity and class size; (2) access and diversity; (3) baccalaureate degree retention and graduation; (4) progression to the baccalaureate degree; (5) research; (6) public service; (7) institutional quality." This wording reflects the changes recommended in the 1995 Postsecondary Accountability Review. For state universities, indicators include total student credit hours; contact hours of instruction provided by faculty; pass rates on professional licensure exams; surveys of alumni, parents, clients, and employers; time-to-degree rates; enrollment, retention, and graduation rates by race and gender; and student course demand.

Programs/Positions

None evident

Indicators and Outcomes

There are a wide variety of performance indicators, some broadly designated by the state legislature and other, more specific measures established by PEPC. There are different sets of indicators for the State University System (SUS) and the Division of Community Colleges (DCC). Each of these indicators serves one or more of the following purposes: accountability; performance-based program budgeting; and incentive funding. These indicators are under discussion in the 1997 legislative session.

Instruments

CLAST for general and major field of education

Common college placement tests

Professional licensure examinations

Surveys of alumni, parents, clients, and employers

Teaching-Learning Elements

Quality of undergraduate education is one of the three primary goals of the Florida Legislature vis-à-vis assessment.

Public Reporting

Annual

Database

Several separate institutional databases exist and are linked.

Budget

Not evident

Regional Accreditation Association

Southern Association of Colleges and Schools

Regional Accreditation Association Relationship

Not evident

Disciplinary Accreditation

Not evident

Technology Focus

Not evident

Issues of Note

The Community College System was required to submit a performance-based budget for 1996-97; the State University System is required to do so for 1997-98. This marks the beginning of a performance funding policy in Florida.

 


Georgia

Contact

Joe Szutz

University System of Georgia
244 Washington Street, SW
Atlanta, GA 30334
404-657-6674/651-2213
FAX: 404-651-5190

 
State Agency: Board of Regents
Original Initiative: Planning Policy
Year: 1989

Policy Analysis

Policy Context and Type

The Board of Regents of the University System of Georgia (BORUSG) first adopted an assessment policy in 1989. This policy called for each institution within the University System of Georgia to develop an assessment process and to report progress toward the implementation of this process to the Chancellor's Office. According to the BORUSG policy, "[E]ach institution plan will describe the structure and process by which...the results of assessment are used to achieve institutional improvement." This would seem to indicate a policy focus on quality assurance, as well as reform.

State Guidelines

"Each institution shall have a plan...which will contain the institution's current goals and priorities, a summary of significant assessment results and associated improvement objectives, and action plans by which institutional priorities, including improvement in effectiveness, will be achieved." The policy acknowledges that "assessment procedures may differ from institution to institution," but the Regents outlined four areas on which all institutions must report assessment results: basic academic skills at entry; general education; specific degree program areas; and all academic and administrative support programs."

Programs/Positions

At the state level, the Task Force on the Assessment of General Education was established in 12/88, and the Committee on the Assessment of Institutional Effectiveness was made part of the formal Administrative Committee Structure in 12/89. At the institutional level, committees have been created on some campuses to facilitate their assessment activities.

Indicators and Outcomes

These vary by institution, but institutional types tend to have similar indicators. In Georgia, for assessment purposes, institutions are categorized as universities, regional universities, state universities, and associate-level colleges.

Instruments

Yes

Teaching-Learning Elements

Georgia places a strong emphasis on student learning outcomes in its policy, and has developed a model for the process of assessing these outcomes, calling for each program to define student learning outcomes, establish measurable expected results for each outcome and assess those results, analyze assessment results to identify strengths and weaknesses, and take the necessary steps to improve the program. (BORUSG, 1989-90)

Public Reporting

Annual

Database

Comprehensive statewide database exists at SHEEO level, containing student records from all four sectors of public institutions.

Budget

According to the Regents' policy, "[E]ach institution shall link its major budget allocations and other major academic administrative decisions to its planning and assessment process.”

Regional Accreditation Association

Southern Association of Colleges and Schools

Regional Accreditation Association Relationship

Not evident

Disciplinary Accreditation

Not evident

Technology Focus

Not evident

 


Hawaii

Contact

Colleen O. Sathre, Vice President for Planning and Policy

University of Hawaii
2444 Dole Street, BH 209
Honolulu, HI 96822
808-956-7075
FAX: 808-956-9119
opp_csathre@mvax.mso.hawaii.edu

 
State Agency: Board of Regents
Original Initiative: Executive Policy E5.210
Year: 1989

Policy Analysis

Policy Context

In 1989, the University of Hawaii Board of Regents approved a statement declaring that the university system was "committed to a process of educational assessment that provides evidence about the institution's effectiveness in meeting its goals and objectives." To this end, the Regents directed the president of the University of Hawaii to establish a policy guiding the assessment process for member institutions. Later that year, the University's Office of Planning and Policy issued a new executive policy on educational assessment. Additionally, beginning in 1996, Hawaii has employed performance indicator "benchmarks" to provide direction and guidance in the development of budgets and tuition schedules. Act 161 requires the university to report on its progress in meeting these benchmarks.

Policy Type

According to the executive policy, assessment was the primary means by which information could be gathered about the university system's success in meeting its goals and missions. Further, this information would be used to promote program improvement. This reflects attention to quality assurance. (The policy drew a sharp distinction between the evaluation of faculty, staff, and student performance, which was addressed in other procedures, and the assessment of program effectiveness, which was addressed by this policy.) The policy also discussed the usefulness of assessment in determining "the degree to which the University meets state objectives and satisfies state needs." This wording indicates some focus on accountability to the state level.

Policy Stage

The policy states that the Regents "will be informed of University assessment activities by means of special reports and as part of ongoing program review, accreditation, academic planning, and budgeting processes." These “special reports” have been replaced by the annual Benchmark/Performance Indicator Reports.

State Guidelines

The executive policy requires that assessment programs include the following dimensions: (1) each unit or program must have a clear mission statement; (2) special priority given to undergraduate instructional programs (see Teaching-Learning Elements below); (3) recognition of the effect of graduate and professional education on overall scholarly reputation (for the UH-Manoa campus); (4) the role(s) and effectiveness of research in institutional goals; (5) data on students transferring within the system and the effectiveness of student service programs; (6) evidence that shows the University is meeting the needs of the state.

Programs/Positions

None evident

Indicators and Outcomes

The broad "benchmarks," or goals, articulated in the executive policy are (1) expanding access to educational opportunity; (2) striving for excellence in undergraduate education; (3) continuing to gain prominence in research and distance learning; (4) revitalizing service to the state; (5) enhancing the international role of the university; (6) maintaining diversity by clarifying campus missions and coordinating campus plans; and (7) improving the organization, financing, and image of the university. The following indicators are used to measure progress toward these goals.

For Goal 1: attendance rates at the University of Hawaii for recent state high school graduates; admission rates of state residents; status of off-campus access to UH credit programs; status of remedial education; demographic trends in the composition of the UH student body; status of enrollment by geographic origin within Hawaii.

For Goal 2: persistence and graduation rates; status of post-baccalaureate enrollment at UH-Manoa; success rates for transfer students; linkage with K-12; status of articulation within the UH system; percentage of eligible students who pass external exams; student satisfaction with educational experience; student satisfaction with employment preparation; student satisfaction with academic preparation; overall state of faculty satisfaction and morale; class size.

For Goal 3: federal grants and contracts; library resources; access to technology.

For Goal 4: number of degrees awarded annually; employer satisfaction; employment rates (for community college vocational students); economic impact of UH; opportunities for continuing education and non-credit instruction.

For Goal 5: access to international programming/faculty.

For Goal 6: avoidance of duplication through specialization; registration in Hawaiian language and culture courses.

For Goal 7: relationship between state appropriations and enrollment; share of state support in comparison to the rest of the state; comparison of UH tuition with peer institutions; level of investment in the physical plant; faculty salaries; faculty workload; rate of private giving; public opinion.

Instruments

Vary by institution.

Teaching-Learning Elements

Particular emphasis is given to "the interaction between undergraduate students and the campus' curricula and services." Student educational expectations, achievement in general education, accomplishment in major field of study, level of satisfaction, and long-term tracking of satisfaction, demographics, and employment are spotlighted in the executive policy.

Public Reporting

Annual

Database

Comprehensive statewide database exists at the SHEEO level, containing student records from four-year and two-year public institutions.

Budget

Assessment activities are not centrally funded. Act 161, however, returns a portion of tuition revenue to the university.

Regional Accreditation Association

Western Association of Schools and Colleges

Regional Accreditation Association Relationship

Not evident

Disciplinary Accreditation

Not evident

Technology Focus

None evident

 


Idaho

Contact

Robin Dodson, Chief Academic Officer

State Board of Education
650 W. State Street
P.O. Box 83720
Boise, ID 83720-0037
208-334-2270
FAX: 208-334-2632
rdodson@osbe.state.id.us

 
State Agency: State Board of Education
Original Initiative: Governing Policies and Procedures - Outcomes Assessment
Year: 1988

Policy Analysis

Policy Context

Outcomes Assessment in Idaho began in 1988, when the Idaho State Board of Education (BOE) required all four of the public colleges and universities to "form campus assessment committees...compile inventory of current assessment practices...develop working knowledge of assessment as a national phenomenon..." In June, 1989, campuses reported their assessment inventories and plans to the BOE. These campus initiatives were followed by department-level assessment reports in June, 1990, and General Education Assessment in June, 1991. Since 1993, each of the four campuses has been asked to report to the BOE annually on assessment procedures, and implement changes, if necessary, to these procedures.

Policy Type

The BOE states the purpose of its Outcomes Assessment policy clearly: "The primary purpose of assessment is to enhance the quality and excellence of programs, learning, and teaching." This identifies Idaho's commitment to quality assurance. In addition, the BOE sees assessment as a means of increasing communications both within and between departments, and also as a means of giving the general public a better sense of the various roles and missions of higher education institutions. Significantly, the BOE assessment policy also states clearly how assessment should not be used: "...to compare institutions, to evaluate teachers, or to eliminate positions, programs, or departments."

Policy Stage

Idaho has broken down its assessment initiative into multiple component parts: Assessment Inventory, Assessment Plan, subject-area assessment, and general education assessment. The effectiveness of assessment procedures in subject areas and general education has been evaluated. Because of the way in which Idaho has subdivided assessment, it has gone through at least two complete policy cycles. Since then, evaluation of assessment procedures appears to be ongoing each year, and any additional policy formulation or re-formulation appears to be based on the results of this continuing evaluation process.

State Guidelines

The State Board of Education identifies three guiding principles for the assessment process: student assessment of programs should be included in the current program review process; assessment of student learning should occur in major fields of study as defined at the departmental level by each institution; and student learning should be assessed in general education areas as defined by each institution. Each institution is expected to develop its own individual assessment plan using a broad range of recommended, but not mandated, processes and tests. Each campus is expected to inform its student body of the assessment process and its benefits.

Programs/Positions

Each public institution formed a campus assessment committee, which included student representation.

Indicators and Outcomes

None specifically outlined

Instruments

Different kinds (surveys, standardized tests, exit examinations) are suggested but not mandated.

Teaching-Learning Elements

Assessment of student learning in both general education and major fields of study are listed as guiding principles of Outcomes Assessment Policy and Procedures. The stated primary purpose of assessment is the enhancement of teaching and learning.

Public Reporting

Annual Reports to Board

Database

Comprehensive statewide database exists at the SHEEO level, containing student records from four-year and two-year public institutions.

Budget

First-year planning costs were borne by individual institutions, but the policy notes that a long-term financial commitment from the Board and Legislature will be required. The BOE allowed each institution to place planning costs into its base budget beginning in fiscal year 1990.

Regional Accreditation Association

Northwest Association of Schools and Colleges

Regional Accreditation Association Relationship

None evident

Disciplinary Accreditation

None evident

Technology Focus

None evident, but will be a future consideration.

 


Illinois

Contact

Kathleen F. Kelly, Deputy Director

State Board of Higher Education
4 West Old Capitol Plaza, Room 500
Springfield, IL 62701-1287
217-782-3442
FAX: 217-782-8548
katkelly@uis.edu

 
State Agency: State Board of Higher Education
Original Initiative: Recommendations of the Committee on the Study of Undergraduate Education
Year: 1986

Policy Analysis

Assessment began in Illinois in 1986, with the recommendations of the Committee on the Study of Undergraduate Education. The State of Illinois Board of Higher Education (BHE) adopted these recommendations, and later adopted the committee's slightly revised recommendations in 1990. The 1990 recommendations call for each public institution to do the following: (1) set expectations for students' development of baccalaureate-level skills of communications, mathematics, and critical thinking, and establish objectives for general education and major field education; (2) communicate these expectations and objectives clearly to students; (3) assess individual student achievement of these expectations and objectives at appropriate intervals; (4) use assessment results to reinforce academic standards and promote student progress; and (5) report the findings and conclusions of reviews of undergraduate education to the BHE. These recommendations form the foundation of the Illinois policy. Beyond this, however, the state's "approach to assessment of students is to call upon institutions to develop appropriate assessment programs rather than develop some sort of statewide assessment test or common set of indicators." (Kelly letter, 1/28/97)

Another element of Illinois' system is the state's Priorities, Quality, and Productivity (PQP) Initiative. This initiative was designed to engage "governing boards and campus communities in priority-setting and decision-making." (PQP 1995-96 Summary) The heart of PQP is “setting priorities and making tough decisions.” PQP called on public institutions of higher education to "reinvest six to eight percent of their state-appropriated operating funds from low priority programs and activities to higher priority needs." (Ibid.) This reinvestment was to take place over a three-year period from 1992 to 1995. After four years of PQP, an estimated $153.6 million has been reinvested. Of this amount, $27.5 million went to improve the quality of undergraduate education. This policy is an ongoing process of implementation, evaluation, and redesign.

State Guidelines

See Policy Analysis above.

Programs/Positions

None evident

Instruments

Vary by institution. For the assessment of baccalaureate-level skills, among the instruments used are the ACT-CAAP and a variety of writing and math proficiency exams developed by the institutions. For the assessment of general education, at least one institution uses the ACT-COMP. Some institutions also draw data from surveys administered to students who are either withdrawing or graduating; most institutions use surveys sent to alumni.

Indicators and Outcomes

For assessment of baccalaureate-level skills, general education, and major field of education, indicators vary by institution.

Teaching-Learning Elements

Improvement of the quality of undergraduate education is one of the areas of reinvestment in the PQP policy.

Public Reporting

Made available to the public through the BHE's agenda materials, and to the Governor and the legislature as appropriate.

Database

Comprehensive statewide database exists at the SHEEO level, containing student records from four-year and two-year public institutions.

Budget

Total reinvestment under PQP = $153.6 million
Reinvestment in improvement of undergraduate education = $27.5 million

Additional funding has been included each year for the last four years.

Regional Accreditation Association

North Central Association of Colleges and Schools

Regional Accreditation Association Relationship

The Board of Higher Education noted North Central's 1989 request that all member institutions develop an institution-wide student assessment plan. The types of evidence suggested by North Central closely parallel the Board of Higher Education's assessment components.

Disciplinary Accreditation

Not evident

Technology Focus

None evident

 


Indiana

Contact

Kenneth Sauer, Associate Commissioner, Research and Academic Affairs

Commission for Higher Education
101 West Ohio Street #550
Indianapolis, IN 46204-1971
317-464-4400
FAX 317-464-4410
ken@che.state.in.us

 
State Agency: Commission for Higher Education
Original Initiative: State-level Performance Objectives
Year: 1984

Policy Analysis

Policy Context

Beginning with the 1985-87 biennium, public institutions in Indiana have been reporting on their progress toward meeting state-level performance objectives set by the Indiana Commission for Higher Education (ICHE). During that same biennium, budget appropriations recommendations were linked to institutional performance reports. The 1985-87 set of performance objectives has been updated in an attempt to focus on the future of public higher education in the state.

Policy Type

In the Commission's 1995 report on institutional progress toward meeting the state-level performance objectives, the rationale for this policy was stated explicitly: "One way to demonstrate the value of such investment is to call attention to higher education's accomplishments in those areas deemed important by the state...Performance objectives help to focus attention on what the state's system of postsecondary education must accomplish in the 1990s. They should also motivate discussion about alternative strategies for meeting postsecondary education's needs, which in many respects are also Indiana's needs." This wording would seem to point to a shared emphasis on accountability and reform.

Policy Stage

Since the first performance objectives were issued in 1984, ICHE has amended and revised the guidelines on multiple occasions. To the extent this has been done, Indiana has undergone numerous cycles of implementation, evaluation, and revision. This process looks to continue indefinitely.

State Guidelines

Institutions are required to submit biennial reports providing data on progress made toward meeting statewide performance objectives.

Programs/Positions

None evident

Indicators and Outcomes

Specific indicators are divided into six major categories: (1) postsecondary participation--of all residents, of state minority residents, and of residents in underserved counties; (2) affordability--family affordability index and cost of attendance index; (3) degree completion rates--of baccalaureate, associate, and minority students; (4) medical education--students in family practice and primary care, and minority enrollment; (5) credit transfer--expanded credit transfer opportunities; and (6) productivity--as yet unspecified.

Instruments

None evident

Teaching-Learning Element

None evident

Public Reporting

Biennially

Database

Comprehensive statewide database exists at the SHEEO level, containing student records from four and two-year public institutions, and some independent non-profit colleges.

Budget

Not evident

Regional Accreditation Association

North Central Association of Colleges and Schools

Regional Accreditation Association Relationship

Not evident

Disciplinary Accreditation

Not evident

Technology Focus

Not evident

 

Top

Iowa

Contact

Robert J. Barak, Deputy Executive Director

State Board of Regents
Old Historical Building
East 12th and Grand
Des Moines, IA 50319
515-281-3934
FAX: 515-281-6420
rbarak@iastate.edu

 
State Agency Original Initiative Year
Board of Regents Board of Regents Policy on Student Outcomes Assessment 1991

Policy Analysis

Beginning in 1991, Iowa's Board of Regents (BOR) required all institutions under its control to perform outcomes assessment for all of their academic programs. The BOR adopted the NASULGC Statement of Principles; in addition, each institution/program is asked to develop an assessment plan that meets its own needs. The Regents are quite clear that their assessment policy is designed, first and foremost, to improve student learning. The Deputy Executive Director acknowledges that "the collection of information documenting the institutional activities does provide institutional accountability," but the state recognizes the danger in the collection of standardized data and its use in educational policymaking. Iowa's policy, then, is focused on quality assurance.

State Guidelines

"Each institution was asked to require every academic unit to develop student outcomes reporting that will serve as a guide to them in the improvement of student learning. This meant that from a wide variety of approaches to student outcomes assessment, the units would utilize the assessment methodology of student learning that best meets the needs of the discipline and its students." (Barak letter, 8/6/96)

Programs/Positions

Varies by institution.

Indicators and Outcomes

Vary by institution and by program area.

Instruments

None evident

Teaching-Learning Element

The policy is largely driven by the goal of improving student learning.

Public Reporting

Reported to the BOR annually.

Database

No multi-institutional databases exist.

Budget

Not evident

Regional Accreditation Association

North Central Association of Colleges and Schools

Regional Accreditation Association Relationship

“The institutions, particularly Iowa State University, which was recently re-accredited, and the University of Iowa, which is in the process of being re-accredited, saw that the accreditation, or institutional assessment, were compatible; i.e., served both purposes.” (Barak letter)

Disciplinary Accreditation

“…In those disciplines in which assessment is required, the assessment undertaken for the BOE meets this purpose.” (Barak letter)

Technology Focus

Not evident

 

Top

Kansas

Contact

John F. Welsh III, Director of Academic Affairs

Kansas Board of Regents
700 SW Harrison #1410
Topeka, KS 66603-3760
913-296-3422
FAX: 913-296-0983
John@KBOR.State.KS.US

 
State Agency Original Initiative Year
Kansas Board of Regents Assessment Policy 1988

Policy Analysis

Policy Context

1988 marked the beginning of statewide assessment efforts. In that year, the Kansas Board of Regents (BOR) developed a plan focusing on the assessment of the undergraduate experience. The BOR adopted and implemented the plan in 1989.

Policy Type

According to the Regents, the "[F]undamental goal of the Board's assessment strategy is to evaluate the impact of undergraduate programs in the areas of student basic skills, general education, and the undergraduate major." This strategy has been manifested in a "variety of activities intended to improve the quality...of academic programs...and to demonstrate the efficiency and effectiveness of the use of state and student resources to support them." (Emphasis is in the original.) Clearly, then, the Kansas policy focuses on quality assurance and accountability. Each of the state institutions has taken its own direction with assessment, while keeping with the Regents' overall strategic guidelines.

Policy Stage

Each institution reports annually on its assessment activities related to student basic skills and general education. Reports on student learning in the undergraduate major are submitted to the BOR every three years. As of June, 1996, all six Regents’ universities have submitted plans to assess student learning in the undergraduate major. The BOR staff have reviewed these reports--on basic skills, general education, and student learning--and have offered observations, commentary, and recommendations for the next step in the assessment process.

State Guidelines

See policy type and stage.

Programs/Positions

None evident

Indicators and Outcomes

Four common indicators: (1) retention and graduation rates; (2) student perceptions of the quality of the experience; (3) post-baccalaureate enrollment and employment surveys; and (4) indicators specific to program areas.

Instruments

None evident

Teaching-Learning Element

The degree to which these are emphasized varies by institution, but the Regents' commitment to student learning outcomes is reflected in the general policy.

Public Reporting

Not evident

Database

No multi-institutional databases exist.

Budget

Not evident

Regional Accreditation Association

North Central Association of Colleges and Schools

Regional Accreditation Association Relationship

In its most recent NCA report, the University of Kansas's system of assessment was described as "extremely sophisticated" and "not inexpensive."

Disciplinary Accreditation

Not evident

Technology Focus

Not evident

 

Top

Kentucky

Contact

Sue Hodges Moore, Deputy Executive Director
Academic Programs, Planning and Accountability

Council on Higher Education
1024 Capital Center Dr. #320
Frankfort, KY 40601-8204
502-573-1555
FAX: 502-573-1535

 
State Agency Original Initiative Year

Kentucky Council on Higher Education Senate Bill 109 1992

Policy Analysis

Policy Context

In 1992, the Kentucky Legislature passed Senate Bill 109, requiring public institutions of higher education to implement an accountability process that would provide "for a systematic ongoing evaluation of quality and effectiveness..." In 1994, for the first time, the Kentucky Council on Higher Education (KCHE) linked its funding recommendations to reports on institutional performance. This was the precursor to a system of performance funding. Two years later, in 1996, the "Strategic Plan for Kentucky Higher Education, 1996-2000: Seize the Future" was developed, seeking to "establish system priorities for 1996-2000 and provide direction for institutional planning efforts." Among these priorities were an educated citizenry, equal opportunities, and economic development. Most recently, in July 1996, the KCHE followed up on its initial experiment with performance funding and adopted a long-term policy of performance funding. The performance measures for this policy were based in large part on the priorities set forth in the strategic plan.

Policy Type

The 1992 Accountability Legislation, as its name would suggest, was an attempt by the Kentucky General Assembly to increase the accountability of higher education. This accountability would better enable state legislators and the KCHE to "monitor performance at the institutions in each of the major areas of instruction, research, and public service, while recognizing the individual missions of each of the institutions." The strategic plan seemed to be designed to prioritize these "major areas," thus providing public colleges and universities with a more informed idea of what Kentucky as a state would need and expect from them in the coming years. In this respect, the Strategic Plan is precisely that--a plan, not a policy. The plan was followed, however, by a policy calling for the implementation of performance funding. The KCHE anticipates that a performance funding system "will help demonstrate that Kentucky higher education serves the long-term needs of the Commonwealth and that excellence in performance and outcomes is the ultimate goal of the entire higher education system." Given this stated rationale for performance funding, the policy seems to have both accountability and quality assurance as its goals.

Policy Stage

Following the passage of the original Accountability Legislation in 1992, Kentucky has “produced four annual editions of the Higher Education Accountability Report Series (1993-1996). The series includes a system-wide report, eight university reports, fourteen community college reports, and a system-wide community college report. In 1996, the reports were redesigned based on comments from three external reviewers. The reports are now easier to read and more useful for policymakers and consumers of higher education.” (Moore)

Since the 1996-1998 biennium will mark the first full-scale implementation of performance funding, this policy is currently in its initial implementation stage. It appears Kentucky will continue requiring the Accountability Reports while it moves toward a system of performance funding. It seems likely that most, if not all, of the performance measures used in the performance funding policy will be communicated in the annual Accountability Reports.

State Guidelines

The state requires the system to submit an annual accountability report, addressing performance on a variety of indicators (listed below). Three years' worth of these reports helped to inform the Strategic Plan for 1996-2000, which in turn provided the basis for the performance funding policy. All public institutions in Kentucky are measured on four common indicators, the total value of which must be at least 50 points on a 100-point scale. There are, in addition, seven institution-specific and three mission-specific indicators. Each institution selected any number from among these ten indicators on which to be measured. The values for these indicators, when totaled, can be no more than 50 points on the 100-point scale. The first institutional reports on success in meeting these indicators will be submitted in February, 1997. Based on a review of those reports, the KCHE will recommend distribution of performance-linked funds.
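
Expressed mechanically, the weighting scheme is a pair of constraints on a 100-point scale: the common indicators must carry at least 50 points, and the selected institution- and mission-specific indicators may carry no more than 50. The following is a minimal illustrative sketch in Python; the indicator names are drawn from the lists below, but the point values (and the function validate_weights itself) are hypothetical, since actual weights were set institution by institution.

# Illustrative check of Kentucky's 100-point indicator weighting scheme.
# Point values below are hypothetical; actual weights varied by institution.

def validate_weights(common: dict, specific: dict) -> None:
    """common: the four common indicators (must total at least 50 points);
    specific: any chosen institution- or mission-specific indicators
    (must total no more than 50 points); the grand total must be 100."""
    common_total = sum(common.values())
    specific_total = sum(specific.values())
    assert common_total >= 50, "common indicators must carry at least 50 points"
    assert specific_total <= 50, "specific indicators may carry at most 50 points"
    assert common_total + specific_total == 100, "scale must total 100 points"

# A hypothetical university's allocation:
common = {
    "quality of educational outcomes": 15,
    "student advancement": 15,
    "use of technology in student learning": 10,
    "preparation of K-12 teachers": 10,
}
specific = {
    "global perspective in academic programs": 20,
    "cooperative academic degree programs": 15,
    "EEO plan implementation": 15,
}
validate_weights(common, specific)  # passes: 50 common + 50 specific = 100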

Programs/Positions

None evident.

Indicators and Outcomes

Common indicators: 1. quality of educational outcomes; 2. student advancement; 3. use of technology in student learning; 4. preparation of K-12 teachers (common to universities); 5. educated workforce development (common to community colleges)

Institution-specific indicators: 6. effective use of resources; 7. global perspective in academic programs; 8. review of gender issues; 9. cooperative academic degree programs; 10. alternative educational delivery; 11. level of gifts, grants, and contracts funding; 12. EEO plan implementation

Mission-specific indicators: 13. institutional scholarships and grants; 14. educated workforce development; 15. educational reform--professional development

Instruments

(The instruments correspond by number to the common and institution-specific indicators listed above.)
1. number of degree programs and general education programs using student outcomes for program assessment
2. rate of student progress as measured by retention, graduation, or both
3. number of uses of technology in student learning by faculty
4. scores of Kentucky teachers on the multiple-choice component of all Praxis II subject area assessments, in comparison to national averages
5. qualitative report summarizing annual performance in this area
6. increase in effectiveness through the use of innovative management practices
7. degree to which global/international perspective is included in academic programs
8. qualitative report summarizing annual performance in this area
9. number of cooperative academic degree programs and/or agreements
10. number of courses or programs using alternative delivery systems (e.g., interactive TV, non-traditional time blocks, practice-based/service-learning component)
11. amount of funding received from grants, contracts, and gifts
12. degree to which EEO goals, established by KCHE, have been met

Teaching-Learning Element

These performance indicators address student outcomes, student persistence, and teacher education/preparation.

Public Reporting

Reports, available to the public, are made yearly to the Governor and the General Assembly.

Database

Comprehensive statewide database exists at the SHEEO level, containing student records from four-year and two-year public institutions.

Budget

Not clear from the state legislation or reports

Regional Accreditation Association

Southern Association of Colleges and Schools.

Regional Accreditation Association Relationship

Assessment activities are complementary to both institutional and programmatic accreditations.

Disciplinary Accreditation

Assessment activities are complementary to both institutional and programmatic accreditations.

Technology Focus

One performance indicator addresses "the ultimate use of technology in the learning process..."

Issue of Note

The first performance funding recommendations, based on the institutions' reports on how well they are meeting the performance indicators, will be made this year.

 

Top

Louisiana

Contact

Kerry Davidson, Senior Deputy Commissioner
Academic Affairs and Sponsored Programs

Louisiana Board of Regents
150 Third Street #129
Baton Rouge, LA 70801-1389
504-342-4253
FAX: 504-342-9318
labor@lsuvm.bitnet

 
State Agency Original Initiative Year
Board of Regents of the State of Louisiana Act 237 1993

Policy Analysis

Assessment in Louisiana is strongly linked to accountability. In its 1997 report on Accountability in Louisiana’s Colleges and Universities, the Board of Regents acknowledged that “Louisiana joined a growing number of other states in an effort to provide the business community, policymakers, the media, and the general public with an annual report on the state of higher education as a public investment.”

Louisiana’s first accountability legislation was Act 237, passed during the 1993 Regular Legislative Session. This act outlined a “number of specific indicators to guide the work of Louisiana’s higher education efforts.” Two years later, the Legislature passed Act 459, which required implementation of the accountability process, and also required the submission of a formal report to the Legislature in 1997. Since the passage of Act 459, committees working under the direction of the Regents have been examining the accountability indicators and the related performance measures.

According to the Regents, there are four primary purposes to the accountability effort in Louisiana: (1) to strengthen the quality of higher education; (2) to enhance the cycle of continuous improvement within and among the state’s colleges and universities; (3) to inform the governor, legislators, and citizens of higher education activities; and (4) to identify further efforts to better serve Louisiana. The Regents state explicitly that assessment of institutional effectiveness and the accountability effort should not be used for broadly comparative purposes. “Because the roles, scopes, and missions of the institutions are many and varied, it is important that the data and information generated by this effort not be used to compare unlike institutions. Only peer institutions should be compared to other peer institutions.” (Emphasis in the original.)

State Guidelines

Institutions report data on the indicators mandated in the accountability legislation.

Programs and Positions

None evident

Indicators and Outcomes

Not clear from state documents

Instruments

Licensure, certification, and professional examinations

Teaching-Learning Element

None evident

Public Reporting

The first formal report to the State Legislature was due in January, 1997.

Database

Data are not yet available.

Budget

None evident

Regional Accreditation Association

Southern Association of Colleges and Schools

Regional Accreditation Association Relationship

The Regents note that SACS requires, as a condition of accreditation, that each member institution have a strategic plan and an internally developed assessment program to measure progress toward the performance goals in its institutional plan.

Disciplinary Accreditation

None evident

Technology Focus

None evident

 

Top

Maine

Contact

Nancy M. MacKnight, Vice Chancellor, Academic Affairs

University of Maine System
107 Maine Avenue
Bangor, ME 04401-1805
207-973-3232
FAX: 207-973-3296
nancym@maine.maine.edu

 
State Agency Original Initiative Year
University of Maine System Planning Goals 1986

Policy Analysis

At present, there is no state-level legislative or executive policy regarding assessment of public institutions of higher education. All assessment is conducted at the institutional level, using measures that are institution-specific.

According to the Vice Chancellor for Academic Affairs, assessment has at least four functions: "accountability, student outcomes...to enhance the understanding of faculty members and academic administrators about what is taking place in the classroom in order to improve instruction and, ultimately, student learning." Each of the seven members of the Maine state system has taken its own direction with its assessment activities since assessment became a priority with the arrival of Robert Woodbury as Chancellor in 1986. Annual reporting by institutions on their assessment activities ended in 1994.

State Guidelines

None evident; all guidelines are set at the institutional level

Programs/Positions

None evident.

Indicators and Outcomes

All indicators are determined by the institution [we have some of this information from the partial summaries of institutional reports on assessment activities].

Instruments

None evident.

Teaching-Learning Element

From the background statement on assessment from the Vice Chancellor for Academic Affairs: "[The] common purpose [of effective assessment techniques] is often to facilitate the improvement of teaching and learning by providing clear insight into what works best in various teaching and learning contexts."

Public Reporting

Periodic

Database

Comprehensive statewide database exists at the SHEEO level.

Budget

Some funds were appropriated by the Vice Chancellor for Academic Affairs in 1988, 1989, and 1990. It is not clear from the state documents how much money was appropriated, or whether funds are still being appropriated.

Regional Accreditation Association

New England Association of Schools and Colleges

Regional Accreditation Association Relationship

Not evident

Disciplinary Accreditation

Not evident

Technology Focus

Varies from institution to institution

 

Top

Maryland

Contact

John Sabatini

Maryland Higher Education Commission
16 Francis Street
Annapolis, MD 21401
410-974-2971
FAX: 410-974-5376

 
State Agency Original Initiative Year
Maryland Higher Education Commission Reorganization of Maryland Higher Education Act 1988

Policy Analysis

The Reorganization Act of 1988 gave responsibility for assessment efforts to the Maryland Higher Education Commission (MHEC). The Commission, in 1991, required each of Maryland's public colleges and universities to submit an annual report. These reports, called the Student Learning Outcomes Assessment Reports, give institutions a chance to demonstrate the progress they have made toward designated performance indicators that measure student learning outcomes. The state acknowledges that these performance indicators are part of an accountability policy, and that this accountability actually has two components: educational and financial. Educational accountability can be used "to assess an institution's effectiveness in terms of student learning outcomes." Financial accountability can be used "to measure how productively and efficiently an institution uses state resources." (Florestano memo, 3/5/96)

State Guidelines

Each institution is required to submit its Student Learning Outcomes Assessment Report annually. These reports must address the common performance indicators listed below. (See Indicators/Outcomes section.) Institutions may also choose to address additional, campus-specific indicators in their reports. In addition to providing data on the common and institution-specific indicators, the reports should "analyze the significance of the data to student learning outcomes" and "discuss the implications of the assessment process for innovations and changes at the campus."

Programs/Positions

None evident

Indicators and Outcomes

Common indicators of student learning: effectiveness of general education; student retention and graduation rates; student evaluations of teaching; post-baccalaureate admissions rates; academic performance of transfer students; student performance on licensing, certification, and graduate admissions exams; and post-graduate surveys. The MHEC also allows institutions to supplement their reports with additional indicators. Examples include: basic skills tests; capstone courses; portfolios; and employers' surveys.

Instruments

None evident.

Teaching-Learning Element

There is an emphasis on student learning; student evaluations of teaching are among the common indicators.

Public Reporting

Annual

Database

Comprehensive statewide database exists at the SHEEO level, containing student records from four-year and two-year public institutions.

Budget

None evident

Regional Accreditation Association

Middle States Association of Colleges and Schools

Regional Accreditation Association Relationship

Not evident

Disciplinary Accreditation

Not evident

Technology Focus

None evident

 

Top

Massachusetts

Contact

Jack Warner

Higher Education Coordinating Council
McCormack Building, #1401
One Ashburton Place
Boston, MA 02108-1530
617-727-7785
FAX: 617-727-6397

 
Receipt of information pending.

 

 

Top

Michigan

Contact

C. Danford Austin, Deputy Superintendent, Postsecondary Education

State Department of Education
P.O. Box 30008
Lansing, MI 48909
517-373-3345
FAX: 517-335-4817
mdeohe@pilot.msu.edu

 
No Initiatives at the state or system level.

 

 

 

Top

Minnesota

Contact

Leslie K. Mercer, Director, Data and Program Division

Minnesota Higher Education Services Office
550 Cedar Street
400 Capitol Square
St. Paul, MN 55101
612-296-9665
FAX: 612-297-8880
mercer@hecb.state.mn.us

 
State Agency Original Initiative Year
Minnesota Higher Education Services Office Task Force on Postsecondary Quality Assessment 1987

Policy Analysis

At present, the only state-level activity concerning assessment is legislation, passed in 1995, establishing five performance measures for the University of Minnesota and five additional and separate measures for the Minnesota State Colleges and Universities System (MnSCU). “At this time, documentation of achievements of the performance measures in 1995-96 is being collected by the University of Minnesota and the MnSCU. Funds have not been released pending reports from the systems.” (Mercer)

State Guidelines

The 1995 legislation established a performance incentive account for each system. Each time a system fulfilled one of its five performance indicators, it would receive $1,000,000 from that account; a system could receive a maximum of $5,000,000 by fulfilling all five of its indicators.
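
Mechanically, the account amounts to a flat per-indicator payout with a cap. The sketch below is a minimal illustration in Python, assuming (per the description above) a $1,000,000 release per fulfilled indicator and exactly five indicators per system; the function incentive_payout is hypothetical, not part of any state document.

# Illustrative model of Minnesota's 1995 performance incentive account.
# Assumes a flat $1,000,000 release per fulfilled indicator (per the text)
# and exactly five indicators per system.

AWARD_PER_INDICATOR = 1_000_000
INDICATORS_PER_SYSTEM = 5

def incentive_payout(indicators_fulfilled: int) -> int:
    """Dollars released to a system for the indicators it has documented."""
    if not 0 <= indicators_fulfilled <= INDICATORS_PER_SYSTEM:
        raise ValueError("each system is measured on exactly five indicators")
    return indicators_fulfilled * AWARD_PER_INDICATOR

# A system documenting three of its five indicators would receive $3,000,000;
# fulfilling all five earns the $5,000,000 maximum.
assert incentive_payout(3) == 3_000_000
assert incentive_payout(5) == 5_000_000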

Programs/Positions

The 1987 Task Force on Postsecondary Quality Assessment was disbanded in 1991.

Indicators and Outcomes

The five performance indicators for the University of Minnesota are: (1) increases at the Twin Cities campus in the percent of freshmen ranking in the top 25 percent of their graduating high school class; (2) increase in freshmen retention rate; (3) increase in the minority freshmen retention rate and hiring rate of minority faculty; (4) increase in the five-year graduation rate; and (5) increase in the number of academic credits issued through courses offered by telecommunications.

The five performance indicators for the Minnesota State Colleges and Universities System are: (1) increase in the budget percentage directed to instruction and academic resources; (2) increase in the number of academic credits issued through courses offered by telecommunications; (3) at least a 2 percent increase in the freshmen retention rate; (4) increase in the percentage of students in two-year programs who graduate within two years of admission, and at least a 2 percent increase in the percentage of students in four-year programs who graduate within four years; and (5) increase in placement rates for occupational programs and transfer rates for community and technical colleges.

Instruments

None evident

Teaching-Learning Element

As indicated by a HESO official, MnSCU indicators #1 and #2 and University of Minnesota indicator #5 are designed to improve teaching and learning.

Public Reporting

Not required, but the report to the legislature is a public document.

Database

Comprehensive statewide database exists at the SHEEO level, containing student records from four-year and two-year public institutions (the database also contains student records from all independent non-profit and some proprietary schools).

Budget

The legislature placed $5,000,000 in the performance incentive account of each system--the University of Minnesota and the Minnesota State Colleges and Universities System--for a total of $10,000,000.

Regional Accreditation Association

North Central Association of Colleges and Schools

Regional Accreditation Association Relationship

None evident

Disciplinary Accreditation

None evident

Technology Focus

All public institutions have been asked to increase the number of credits earned through courses offered by telecommunications, demonstrating a commitment to distance-learning.

 

Top

Mississippi

Contact

Kelley Pearson

Board of Trustees of State Institutions of Higher Learning
3825 Ridgewood Road
Jackson, MS 39211
601-982-6611
FAX: 601-982-6129

 
State Agency Original Initiative Year
Mississippi Board of Trustees of Institutions of Higher Learning Board of Trustees' Policies and Bylaws 1990

Policy Analysis

According to the Board of Trustees, "[A]ll institutions under the governance of the Board shall maintain regional accreditation with the Southern Association of Colleges and Schools. Institutions shall endeavor to acquire accreditation for all programs for which professional accreditation is available." Based on this policy, each public institution is expected to "establish and implement appropriate assessment standards and practices related to student outcomes and achievement, and/or institutional effectiveness."

State Guidelines

See policy analysis

Programs/Positions

None evident.

Indicators and Outcomes

None evident.

Instruments

None evident.

Teaching-Learning Element

None evident.

Public Reporting

None evident.

Database

Statewide database exists at the SHEEO level, but it is not comprehensive.

Budget

None evident

Regional Accreditation Association

Southern Association of Colleges and Schools.

Regional Accreditation Association Relationship

The Trustees have clearly linked assessment with accreditation requirements of the SACS.

Disciplinary Accreditation

The Trustees also expect all programs "for which professional accreditation is available" to pursue and obtain that accreditation.

Technology Focus

Not evident.

 

Top

Missouri

Contact

Robert Stein, Senior Associate
Planning and Academic Programs
Coordinating Board for Higher Education
3515 Amazonas Drive
Jefferson City, MO 65109-5717
314-751-2361
FAX: 314-751-6635
robert?cbhe400%admin@admin.mocbhe.gov

 
State Agency Original Initiative Year
Missouri Coordinating Board for Higher Education Value-added assessment at Northeast Missouri State University early 1980s

Policy Analysis

Policy Context

Assessment in Missouri began with the efforts of a single institution, Northeast Missouri State University (now Truman State University), which implemented a value-added assessment program in the early 1980s. The rest of the state followed soon thereafter, as "state educational leaders, with strong backing from the governor, challenged all public institutions to establish assessment programs which would improve student academic performance." (Stein letter, 2/25/97) It is important to note that the impetus for assessment did not come from the state legislature, but rather from the Missouri system of public higher education itself. However, the Missouri Business and Education Partnership Commission (MBEPC), created by an act of the state General Assembly, did recommend an emphasis on "measuring and reporting institutional performance toward mission achievement and goal realization. In addition, the commission recommended that performance mechanisms...be utilized to the maximum extent feasible." (Stein, 1996 AAHE Report)

Responding to the challenge from state educational leaders, institutions began in 1987 to submit annual reports to the Missouri Coordinating Board for Higher Education (CBHE) on their assessment activities. These assessment activities were designed by the individual institutions to serve their own missions and needs. At this same time (1986-87), Missouri established a Student Achievement Study (SAS), which was set up "to track the performance of students from high school graduation through college graduation." (Stein letter, 2/25/97)

The decentralized and autonomous nature of Missouri's system, allowing each institution to determine its own approach to assessment, eventually led some to call for a better and more consistent way to assess the state system as a whole. In reply, the state expanded its SAS data collection efforts to include data on performance indicators. Reports documenting how well institutions are meeting these indicators have replaced the annual assessment reports. These performance indicators are the foundation of a performance funding policy, called Funding for Results (FFR), which has been used in Missouri since 1991. The FFR policy works at both the state and the institutional levels, and distinguishes between two- and four-year institutions. (See indicators below.) Missouri’s four-year institutions began receiving FFR funds with the FY 1994 budget, while for two-year institutions, FFR began with the FY 1995 budget. Also in 1991, the Missouri Assessment Consortium (MAC) was formed by the assessment coordinators at the public four-year institutions. The primary purpose of the MAC is to foster inter-institutional communication in a decentralized system. In recent years, assessment coordinators from public two-year institutions have also been involved in discussions about statewide policy development and implementation. In 1994, the CBHE received a grant from FIPSE to refine and expand the FFR criteria and to support new efforts to improve teaching and learning practices through the sponsorship of on-campus projects via a block grant mechanism.

Policy Type

According to the MAC, assessment in Missouri serves two purposes: "[F]irst, the improvement of instruction and student learning; second, accountability. Assessment should focus on student learning and instruction and should be approached as a multi-dimensional exploration of curricular and co-curricular issues and the learning process associated with them. In addition, the public institutions...recognize a variety of statewide constituencies to which they are appropriately accountable for the effectiveness of their educational programs, including but not limited to students and parents, employers, taxpayers, the respective governing boards, the CBHE, and the state legislature." The CBHE's position is very similar: "Missouri is trying to use assessment both for improvement and accountability." (Ibid.)

Policy Stage

At present, Missouri "is at a particular junction as it addresses ways to strengthen its approach to assessment and to performance funding." (Ibid.) Because performance funding is one of the CBHE's major initiatives, and the amount of money allocated to institutions through FFR continues to grow, the state looks to be in an ongoing cycle of implementation and evaluation. The policy goals defined through a strategic planning process in 1992 serve as the standards by which new performance is evaluated and other priorities are established. A key emphasis is on setting priorities based on planning, establishing agreed-upon measures, and designating a portion of the state’s appropriations to public institutions based on performance.

State Guidelines

See Policy Context above.

Programs/Positions

The FFR initiative has generated a number of new groups that support the program among different constituencies:

1. The FFR Teaching and Learning Committee—faculty members from a cross-section of Missouri institutions.

2. The FFR Advisory Committee—academic officers, assessment coordinators or faculty from each public institution in Missouri

3. A Steering Committee composed of state legislators, institutional representatives, and a coordinating board member

4. An informal group of community college personnel who focus on FFR issues in the context of the mission of community colleges

Indicators and Outcomes

Included among the indicators are: the percentage of first-time, full-time, degree-seeking freshmen (hereafter referred to simply as freshmen) who took the CBHE-recommended high school core curriculum; the percentage of minority students who are freshmen; the percentage of teacher education students who met CBHE standards on the ACT, C-BASE, and NTE; the percentage of freshmen who completed 24 or more credit hours by the end of the first academic year and achieved at least a 2.0 GPA; graduation, completion, and transfer rates; the percentage of bachelor's degree recipients who scored above the 50th percentile on nationally-normed exams in general education and the major field of study; and pass rates of graduates taking licensure, certification, or registration examinations.

Instruments

1. By statewide administrative rule, the C-BASE and NTE examinations are used for admittance to and exit from teacher education programs.

2. ACT examinations are used as one of the general admission criteria for all four-year students.

3. Nationally-recognized and/or normed examinations are encouraged for assessment of general education as well as the major; locally developed instruments are also recognized.

Teaching-Learning Element

The whole policy is infused with a focus on teaching and learning.

Public Reporting

Annual

Database

Comprehensive statewide database exists at the SHEEO level, containing student records from four-year and two-year public institutions.

Budget

For FY 1997, FFR provided $2 million to two-year institutions (2 percent of core budget) and $10.6 million to four-year institutions (1.7 percent of core budget). Between 1994 and 1997, $3.5 million has been added to the core budgets of two-year institutions and $27 million to the core budgets of four-year institutions through the FFR program. In FY 97, 17.1 percent of new funding for four-year institutions and 11.1 percent of new funding for two-year institutions was generated through the FFR program. Institutions have flexibility in the way they choose to utilize FFR funds, except for those funds designated for campus-level teaching/learning initiatives; these resources are for a specific program of innovation, and their use is reported annually.

Regional Accreditation Association

North Central Association of Colleges and Schools

Regional Accreditation Association Relationship

Commitment to assessment programs which support the standards promulgated by the NCA.

Disciplinary Accreditation

FFR does not specifically address this area, but a quinquennial review of each academic program is required at all public institutions. This review stresses the assessment of program outcomes, including the use of instruments to measure the impact of the major on all graduates. In addition, all new program proposals must include performance goals associated with student preparation and student outcomes.

Technology Focus

As part of the Blueprint for Missouri Higher Education, issued in 1996, a commitment has been made to improving access to higher education for all potential students, and promoting the development of an effective telecommunications-based delivery system.

 

Top

Montana

Contact

Richard A. Crofts, Deputy Commissioner, Academic Affairs

Office of the Commissioner of Higher Education
Montana University System
2500 Broadway
Helena, MT 59620-3101
406-444-6570
FAX: 406-444-1469
rcrofts@oche.oche.montana.edu

 
State Agency Original Initiative Year
Montana University System Proficiency Admission Requirements and Developmental Education in the Montana University System 1995

Policy Analysis

Policy Context

In 1995, the Montana University System (MUS) approved a policy on “Proficiency Admission Requirements and Developmental Education in the Montana University System.” This policy called for the MUS to “adopt a uniform assessment tool to be used in determining if students or prospective students have the basic proficiencies in math and English to provide them a reasonable chance of success in postsecondary education.” This basic skills assessment instrument would be “made available for administration to high school juniors so that they can receive an early indication of their preparedness for postsecondary education.” This would give students the opportunity to take additional, college-preparatory courses during their senior year if necessary.

In 1996, faculty and administrators from all of the MUS institutions developed Quality, Access, and Productivity Enhancement Plans for “improving quality, access, and productivity at their institutions. Each campus plan included baseline information, numerical goals for improvements and timelines through FY 1999 for reaching the agreed-upon goals.” The MUS requires that institutions submit annual reports on their progress. Each institution has developed its own plan, based on its needs and mission. There are seven very broad rubrics for enhancement, which are listed in the Indicators section below.

One of the ways in which Montana encouraged faculty productivity, and thus promoted the quality of undergraduate education, was to use a new approach called “collaborative bargaining.” This approach called for the state to provide 2.5 percent for faculty salary increases, and the remainder to come from “budget reallocation and other sources of revenue, mainly tuition.” The continued use of this approach is contingent upon “success in meeting the agreed-upon goals of increased productivity, quality, and accountability” at all public institutions in the state.

Policy Type

The Proficiency Admission Requirements policy, by establishing standards for basic skills performance, is a regulatory policy. The Quality, Access, and Productivity Enhancement Plan policy combines quality assurance with accountability.

Policy Stage

These policies are all in the implementation stage.

State Guidelines

See Policy Context section above.

Programs/Positions

None evident.

Indicators and Outcomes

As part of the Quality, Access, and Productivity Enhancement Plans, seven categories of enhancement have emerged: (1) academic policies and instructional program quality; (2) faculty productivity and accountability; (3) educational technology; (4) student learning productivity; (5) academic advising; (6) access and course scheduling; and (7) fiscal policies, and library and other enhancements

Instruments

Vary by institution.

Teaching-Learning Element

Not directly evident in the seven categories of enhancement.

Public Reporting

Annually

Database

No multi-institutional databases exist.

Budget

None evident

Regional Accreditation Association

Northwest Association of Schools and Colleges

Regional Accreditation Association Relationship

“The postsecondary institutions in Montana are developing assessment programs in response to the standards established by the NWACS.”

Disciplinary Accreditation

None evident.

Technology Focus

One of the seven categories of enhancement on which institutions report is “educational technology.” Enhancing faculty technology competency, establishing multi-media and interactive telecommunications classrooms, and upgrading of existing computer resources for students are some of the steps taken by different institutions.

 

Top

Nebraska

Contact

Odus V. Elliott, Academic Officer

Coordinating Commission for Postsecondary Education
140 N. 8th Street #300
P.O. Box 95005
Lincoln, NE 68509-5005
402-471-2847
FAX: 402-471-2886
klukesh@nde4.nde.state.ne.us

 
State Agency Original Initiative Year
Coordinating Commission for Postsecondary Education Program Review 1992

Policy Analysis

Policy Context

In 1992, the Nebraska Legislature passed a law requiring the State Coordinating Commission for Postsecondary Education (CCPE) to establish "an ongoing process to review, monitor, and approve or disapprove the new and existing programs of public institutions..." This law is the only statewide policy related to assessment issues. The responsibility for this process, according to the CCPE, is "shared by three partners: the public colleges and universities, their governing boards, and the CCPE."

Policy Type

Again, as defined by the CCPE, the goals of this program review policy are "to improve the instructional programs and to assure that the state resources invested in public postsecondary education are used as efficiently as possible." If program review is successful and these goals are met, it will result in "stronger institutions and demonstrated accountability to state government and the general public." Clearly, then, the Nebraska program review policy has as its goals both accountability and quality assurance.

While the responsibility for this program review is considered to be shared among three entities, each entity is considered to have different constituencies and thus somewhat different emphases. At the institutional level, public colleges and universities in Nebraska "focus on building exemplary institutions and on direct delivery of instruction, scholarship, and service." The focus of the institutional governing boards is on "the needs and performance of the institution(s) under their responsibility and on the constituents to whom they are accountable." It is not clear from the state documents how these constituencies of the governing boards are defined. Finally, the CCPE "focuses on its constitutional and statutory responsibilities and on the needs of the people of Nebraska for strong institutions that have distinctive missions, that work together in the interests of students and taxpayers, and that avoid unnecessary duplication of programs and facilities."

Policy Stage

The first round of program reviews began in 1993. During the first three years, the CCPE reviewed a total of 657 programs distributed across all of Nebraska's public colleges, universities, and community colleges. In its report describing the program review process, the CCPE concluded that "the first three years of program review have had a significant impact on postsecondary education in Nebraska...[T]he changes that are made as a result of program review or as a result of other institutional action often result in more efficient use of institutional resources." A complete cycle is seven years, with each program reviewed once during that period.

State Guidelines

The state has mandated four questions to guide the review of existing programs: (1) what is the centrality of the program to the role and mission of the larger institution?; (2) to what extent is the program consistent with the Comprehensive Statewide Plan (CSP)?; (3) is there evidence of need and demand for the program?; and (4) are the resources available to the program adequate? For new programs, these questions are very similar. Regarding consistency with the CSP, the state also asks for additional descriptions and information about the new program of study: how the program proposes to assess student learning, how/if the new program will seek special accreditation, how/if the new program will meet the needs of diverse students, how/if the program will collaborate with other postsecondary institutions, how/if the program will offer distance education, and how/if the new program will establish partnerships with businesses, organizations, and public agencies.

Programs/Positions

None evident.

Indicators and Outcomes

See Instruments

Instruments

For "centrality to role and mission," the institution is asked to prepare either a checklist or a narrative explaining how the existing program fits with the role and mission of the institution as defined by state law. The same process is to be followed for new programs.

For "consistency with the CSP," the institution is asked to prepare narrative descriptions explaining how existing programs meet CSP priorities. These priorities will change regularly. For new programs, the institution is asked to prepare narrative descriptions explaining how the new programs will meet areas of "special emphasis." (See State Guidelines above.)

For "evidence of need and demand," the institution is asked to provide the existing program's graduation figures for the five years prior to review, and the average student credit hour production of program faculty. Also for existing programs, data on job placement rates and/or studies of workforce needs are requested. For new programs, the institution is asked for the same data and for the potential the new program has to contribute to these rates and needs. Institutions must also address the issue of unnecessary duplication for new programs.

For "adequacy of resources," the institution is asked to provide number of faculty, quantity of library holdings, and descriptions of physical facilities and equipment for existing and new programs. For new programs, the institution is asked to show a budget for the first five years.

Teaching-Learning Element

For new programs, one of the areas of special emphasis in program review is the assessment of student learning. For existing programs, being consistent with the CSP includes providing "measurable educational results..." The goal of the program review policy is the improvement of the instructional program.

Public Reporting

The results of program review prepared for the CCPE are also available to the general public.

Database

No multi-institutional or statewide higher education databases exist.

Budget

None.

Regional Accreditation Association

North Central Association of Colleges and Schools

Regional Accreditation Association Relationship

The state sees the assessment requirements of NCA as complementary to its own goals of consistency with the CSP.

Disciplinary Accreditation

Not evident.

Technology Focus

Requested in program review under “Adequacy of Resources” criterion.

 

Top

Nevada

Contact

John Richardson, Vice Chancellor for Academic and Student Affairs

University and Community College Systems
2601 Enterprise Road
Reno, NV 89512
702-784-4901
FAX: 702-784-1127
minedew@nevada.edu

 
State Agency Original Initiative Year
Board of Regents Assessment Policy 1989

Policy Analysis

Policy Context

There have not been any statewide executive or legislative assessment initiatives in Nevada. This is in part because the UCCSN Board of Regents has constitutional status separate from and equal to the other branches of state government.

Policy Type

In 1989, the UCCSN Board of Regents adopted a policy requiring each campus within the state system to compile and submit "an appropriate plan of regular student educational assessment." The Regents, in creating this policy, acknowledged the diversity of institutional types and missions, and so allowed each campus to tailor its assessment activities to its own mission and needs. In addition to student assessment, the Regents require each campus to submit an annual academic program review; each year, a different set of programs comes up for review. Given the multiplicity of approaches taken by campuses in relation to student assessment, and the policy of academic program review, Nevada's policy can best be characterized as a combination of quality assurance and reform.

Policy Stage

Each campus submitted preliminary student assessment plans to the Regents in 1990. This was followed by an update report in 1992. Since this update, "a great deal of progress has been made by the community colleges and universities in their student assessment efforts...as assessment is becoming an integral part of their planning activities." With both student assessment and program review, evaluation and redesign appear to be an ongoing part of the implementation process.

State Guidelines

The UCCSN Board of Regents "requires that an appropriate plan of regular student educational assessment be developed by each campus, with each campus assuming responsibility for developing the processes and procedures to be used. Plans should be based upon campus mission and should be developed with multiple assessment approaches which may include but not be limited to testing. Among other activities, regular regional accreditation review will provide an overall assessment of the campus. Plans should reflect the mix of programs and types of students." (Regents' Policy on Student Assessment, 1/89).

Programs/Positions

New positions, if any, vary from campus to campus.

Indicators and Outcomes

Vary from campus to campus.

Instruments

Vary from campus to campus. Examples include alumni and employer surveys, standardized tests, institutional self-studies.

Teaching-Learning Element

Varies from campus to campus.

Public Reporting

Not evident.

Database

Comprehensive statewide database exists at the SHEEO level, containing student records from four- and two-year public institutions.

Budget

Not evident.

Regional Accreditation Association

Northwest Association of Schools and Colleges

Regional Accreditation Association Relationship

The Regents recognize that the "[N]orthwest Association of Schools and Colleges is now placing a greater emphasis on assessment. The Commission on Colleges expects each institution and program to adopt an assessment scheme responsive to its mission and needs," and the UCCSN campuses are responding.

Disciplinary Accreditation

Not evident.

Technology Focus

Not evident.

 

Top

New Hampshire

Contact

James A. Busselle, Executive Director

New Hampshire Postsecondary Education Commission
Two Industrial Park Drive
Concord, NH 03301-8512
603-271-2555
FAX: 603-271-2696

 
No Initiatives at the state level. Receipt of information about the system level pending.

 

 

Top

New Jersey

Contact

Phil Beardsley, Director of Research

Commission of Higher Education
20 West State Street, CN542
Trenton, NJ 08625
609-984-2847

 
State Agency Original Initiative Year
New Jersey Commission on Higher Education BASP 1977

Policy Analysis

Policy Context

New Jersey's original assessment initiative began with the Basic Skills Assessment Program (BASP) in 1977. This program was designed to test the basic skills proficiencies of entering freshmen and to evaluate institutions' efforts in remedial education. State-level assessment activity continued with the creation of the College Outcomes Evaluation Program (COEP) in 1985. A program advisory committee was appointed and given the responsibility of developing a plan of action. "Student learning and development, faculty research and scholarship, and the impact of institutions on society comprised the focus" of this plan. (COEP Advisory Committee Report, 10/23/87) In 1987, this advisory committee reported its recommendations. These recommendations called for, among other things, outcomes assessment of general education, outcomes assessment in major fields of study, assessment of the personal development and satisfaction of students, and assessment of success in providing access and meeting the human resource needs of an institution's population. This initiative gave considerable latitude to individual institutions in terms of instruments and indicators. The COEP had as its stated purpose "the improvement of undergraduate education" and attempted to balance the use of standardized measures with the need for institutional autonomy.

Policy Stage and Type

COEP ended in 1991, and the BASP was eliminated during the restructuring of higher education in New Jersey. This restructuring of the higher education system has produced a new focus on accountability. Beginning in the fall of 1995, public institutions in New Jersey began making annual reports to policymakers and the public as part of this new accountability. The Committee on Advancement, Excellence, and Accountability (CAEA) issues guidelines for the submission of annual reports by the institutions. These guidelines must be approved by the state commission. In developing these guidelines, or performance indicators, New Jersey has drawn on Ewell's definition: "Indicators can best be described as policy-relevant statistics produced regularly to support overall policy planning and monitoring at the national, state or system level." (Ewell as quoted in NJ System-wide Accountability Report, 4/96) At present, the institutional annual reports address four broad issues: faculty, affordability, access, and the return on the public investment in higher education.

State Guidelines

Institutions submit data annually to the CAEA, reflecting progress toward meeting performance indicators in a variety of areas: affordability; retention, transfer, and graduation rates; access; and return on investment.

Programs/Positions

At the state level, the Board and Department of Higher Education have been replaced by the Commission on Higher Education and the Presidents' Council. The CAEA was formed by the Presidents' Council.

Indicators and Outcomes

Graduation rates; SAT and other standardized test scores; percentage of students who are New Jersey residents; number of scholarship recipients

Instruments

No

Teaching-Learning Element

Not at present

Public Reporting

Annual

Database

Comprehensive statewide database exists at the SHEEO level, containing student records from four-year and two-year public institutions.

Budget

No budget

Regional Accreditation Association

Middle States Association of Colleges and Schools

Regional Accreditation Association Relationship

Not evident.

Disciplinary Accreditation

Not evident.

Technology Focus

Not evident.

 

Top

New Mexico

Contact

Bill Simpson, Deputy Director for Educational Programs

Commission on Higher Education
1068 Cerrillos Road
Santa Fe, NM 87501-4295
505-827-7383
FAX: 505-827-7392
bsimpson@che.state.nm.us

 
State Agency Original Initiative Year
Commission on Higher Education Report Card 1990

Policy Analysis

Policy Context

In 1989, the New Mexico Commission on Higher Education (CHE) requested that its member institutions submit "comprehensive, five-year strategic plans." One section of the larger plan was to address "System Development," defined by the CHE as a description of "institutional efforts to evaluate student outcomes and other indicators of institutional effectiveness." The plans were submitted by New Mexico's 23 public universities and community colleges in 1990 and revised by those institutions in 1991. After reviewing these comprehensive plans, the CHE decided that "further statewide planning would best be served by asking that institutions focus upon specific, statewide issues...rather than continuing to submit comprehensive plans. One of the issues to be addressed is outcomes assessment."

Also in 1990, the state legislature passed a law requiring the CHE to submit an "annual report card," featuring a variety of measures. Due to a number of problems, the "annual report card" is no longer compiled. (Some of these problems are discussed below.) Most recently, in 1993, the CHE began asking member institutions for a copy of the plan institutions sent to the North Central Association (NCA) to satisfy NCA's requirement that schools seeking accreditation have an institution-wide outcomes assessment plan. The NCA also requires a summary report on the progress each institution is making toward meeting these outcomes assessment goals. The report to the NCA, then, doubles as an assessment report to the state commission.

Policy Type

The defunct "report card" policy was designed primarily for comparative, and thus accountability, purposes. "The indicators [of performance] are to be published annually in order to draw comparisons among school districts and among institutions of higher learning." This comparative approach was sharply criticized by institutions, which argued that because of the "diversity of New Mexico's institutions, missions, and students," each institution should establish its own assessment measures.

The latest policy, which allows institutions to submit their own unique NCA outcomes assessment plans in fulfillment of state-level assessment requests, serves two purposes: (1) it does not hold numerous and varied institutions to a single, arbitrary standard; and (2) it streamlines the assessment process and "reinforces the NCA requirement rather than adds to a second requirement" from the state. Therefore, a characterization of New Mexico's assessment policy is essentially a characterization of the NCA requirement. Because most institutions feel obligated to seek NCA accreditation, the requirement for an outcomes assessment plan is at least in part regulatory. Given the understanding that accreditation generally, and the NCA requirement specifically, are means of institutional improvement, the policy requiring an outcomes assessment plan is also an effort at quality assurance.

Policy Stage

New Mexico has experienced one complete policy cycle with the "annual report card" policy. After approximately two years of implementation, that policy was evaluated as "unreliable" or "only minimally indicative of institutional performance." As a result of these criticisms, the "report card" policy was terminated. Presently, the CHE requests only that institutions submit a copy of their outcomes assessment plan prepared for NCA.

State Guidelines

The CHE collects and publishes some data on statewide outcomes measures in its annual report on the "Condition of Higher Education in New Mexico." The Commission also encourages assessment by supporting the accreditation requirements of the NCA. Otherwise, there are no state-mandated assessment policies.

Programs/Positions

None evident.

Indicators and Outcomes

In the Commission's annual report, the following outcomes measures are provided:
1. bachelor's and graduate degrees awarded, by field
2. certificates and associate's degrees awarded, by field
3. degree completion rates
4. program completion and transfer rates

Instruments

All of New Mexico's indicators are measured by unit counting. In the case of the rates of degree and program completion and transfer, unit counts are used to calculate the percentages of students who have completed degrees or programs, or have transferred.
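
For illustration, a minimal sketch in Python of how such unit counts become the published rate indicators (all cohort figures here are hypothetical, not New Mexico data):

    # Rate indicators computed from unit counts; all figures hypothetical.
    entering_cohort = 1200   # students entering degree programs (assumed)
    completers = 540         # unit count: students who completed a degree
    transfers = 180          # unit count: students who transferred

    completion_rate = completers / entering_cohort * 100
    transfer_rate = transfers / entering_cohort * 100

    print(f"Degree completion rate: {completion_rate:.1f}%")  # 45.0%
    print(f"Transfer rate: {transfer_rate:.1f}%")             # 15.0%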

Teaching-Learning Element

The indicators reflect an interest in completion rates.

Public Reporting

Annual reports are available to the public.

Database

Comprehensive statewide database exists at the SHEEO level, containing student records from four-year and two-year public institutions.

Budget

None.

Regional Accreditation Association

North Central Association of Colleges and Schools

Regional Accreditation Association Relationship

The current unwritten policy is to encourage progress at the institutional level in assessment of student learning and institutional performance, supporting North Central's accreditation requirements. The institutional accreditation report can be used as the state assessment report.

Disciplinary Accreditation

Not evident.

Technology Focus

Expansion of educational technology, including distance education, is a major priority of the CHE, although it is not linked in any way to assessment.

 

Top

New York

Contact

Jeanine Grinage, Deputy Commissioner
The University of the State of New York

The State Education Department
Albany, NY 12230

 
State Agency Original Initiative Year
The University of the State of New York Board of Regents Policy N/A

Policy Analysis

Policy Context and Type

In New York, there are really two concurrent assessment policies in effect. The first stems from the Commissioner of Higher Education's Regulations, which require two things: (1) the registration of college curricula; and (2) the publication of student outcomes data. The second stems from the strategic plan of the Board of Regents (BOR) of the University of the State of New York System. Contained within this strategic plan are six goals, as well as performance indicators by which the BOR can measure institutional and systemic progress toward achieving them. The first two goals directly address student outcomes assessment: (1) "All students will meet high standards for academic performance and demonstrate the knowledge and skills required by a dynamic world"; and (2) "All educational institutions will meet Regents high performance standards." Both policies have quality assurance as their focus.

Policy Stage

It is not clear from the state documents when the Commissioner's Regulations, parts 52 and 53, went into effect or how often registration of curricula and publication of outcomes data are required. The BOR strategic plan was approved in 1995; it is not clear how far the state has gone in implementing it.

Recent, additional assessment activities in New York include self-study guides for the 90 public and independent colleges focusing on baccalaureate teacher education programs; statewide review of two-year and community colleges, which has student learning and development as an overarching theme; the Doctoral Evaluation Project, committed to assuring doctoral education of high quality through peer review; evaluation of programs leading to licensure; and outcomes assessment in vocational education.

At the system level, the State University of New York (SUNY) has been involved with assessment since 1989, when SUNY institutions each submitted campus assessment reports to the SUNY Central Administration. These reports addressed four dimensions of outcomes assessment: basic skills, general education, academic majors, and personal/social growth. SUNY institutions submit annual updates about their assessment activities and results. In the City University of New York (CUNY), all institutions require students to take a comprehensive basic skills test in reading, writing, and mathematics. The State Education Department also keeps track of assessment programs at independent and proprietary institutions.

State Guidelines

On registration of college curricula: "To be registered, each curriculum shall show evidence of careful planning. Institutional goals and the objectives of each curriculum and of all course[s] shall be clearly defined in writing, and a reviewing system shall be devised to estimate the success of students and faculty in achieving such goals and objectives."
On publication of student outcomes data: "Part 53 of the Commissioner's Regulations requires institutions to publish student outcomes data, namely data on student retention and, where available, placement of graduates."

Programs/Positions

"For each curriculum the institution shall designate a body of faculty who, with the academic officers of the institution, shall be responsible for setting curricular objectives, for determining the means by which achievement of objectives is measured, for evaluating the achievement of curricular objectives, and for providing academic advice to students. The faculty shall be sufficient in number to assure breadth and depth of instruction and the proper discharge of all other faculty responsibilities. The ratio of faculty to students shall be sufficient to assure effective instruction." (Commissioner's Regulations 52.2 (b) (3)).

Indicators and Outcomes

Among the performance indicators used in measuring institutional progress toward meeting the six goals in the BOR strategic plan are completion and graduation rates; rates of performance on licensure and certification exams; and the rate of employer satisfaction with the knowledge and skills of graduates.

Instruments

Vary by institution

Teaching-Learning Element

These elements are a large part of all NY state assessment policies. Students are expected to achieve and demonstrate knowledge and skills demanded by a dynamic world.

Public Reporting

Cyclical

Database

Significant multi-institutional databases exist, but not at SHEEO level.

Budget

Not evident.

Regional Accreditation Association

Middle States Association of Colleges and Schools

Regional Accreditation Association Relationship

"The Department is also moving toward a closer working relationship with the regional accrediting group...as a means of assuring consistency in standards as well as efficiencies in staff time and cost." (McHugh letter, 2/13/97)

Disciplinary Accreditation

"The Department is also moving toward a closer working relationship with accrediting bodies in particular disciplines as a means of assuring consistency in standards, as well as efficiencies in staff time and cost." (Ibid.)

Technology Focus

None evident.

 

Top

 

North Carolina

Contact

Gary T. Barnes, Vice President of Program Assessment and Public Service

UNC-General Administration
P.O. Box 2688
Chapel Hill, NC 27515-2688
919-962-4591
FAX: 919-962-2751
barnes@ga.unc.edu

 
State Agency Original Initiative Year
The University of North Carolina System Senate Bill 44, Chapter 752 1989

Policy Analysis

Policy Context

In 1989, the North Carolina General Assembly passed legislation requiring all institutions comprising the University of North Carolina system to submit assessment reports. The institutions submitted assessment plans to the General Assembly in 1991. Contained within these plans were outlines for "comprehensive, integrated assessment to be developed and implemented over a five-year period, 1990-91 through 1994-95." Assessment reports have been submitted each year since 1992.

Policy Type

The annual reports all contain three sections. The first section consists of "activities and outcomes related to improving undergraduate student learning and development"; the second section consists of "activities and outcomes related to improving graduation rates and shortening time-to-degree"; and the third section consists of "the impact of budget flexibility on undergraduate education, including specific reallocation or redirection of resources to support and enhance undergraduate education." Sections one and two reflect a quality assurance policy; section three has a distributive focus.

Policy Stage

The first annual reports were submitted in 1992. Those submitted in 1994-95 were the fifth and final group of reports under the first five-year plan. Work on the new five-year Institutional Assessment Plan was concluded in late 1996. It is anticipated that this plan will "refine, recast and augment" the first five-year plan and its measures "to ensure that they address the requirements of the original 1989 legislation."

State Guidelines

See policy type above.

Programs/Positions

None evident.

Indicators and Outcomes

As part of a newly instituted Performance/Program Budgeting System (1996-97), there are four system-wide expected outcomes and indicators. (Each expected outcome is given below, followed immediately by the indicator used to measure it.)

1. The University of North Carolina (UNC) will expand access for eligible NC high school graduates: percent of NC high school graduates who attend a UNC institution in the year following graduation.

2. The UNC will expand access for NC community college transfers: percent of community college cohorts who transfer to a UNC institution within two years of completing coursework at community college.

3. The UNC will expand access for nontraditional undergraduates: fall headcount enrollment of undergraduates 25 and older.

4. The UNC will continue to serve growing numbers of students seeking higher education and workforce preparation: annual undergraduate and graduate student credit hours, and number of degrees awarded at all levels.

Instruments

(1) System-wide survey of second-semester sophomores.

(2) System-wide survey of graduating seniors.

(3) System-wide survey of baccalaureate graduates one year after graduation.

Teaching-Learning Element

These elements are more directly addressed in the expected outcomes and measures of differing institutional types.

Public Reporting

Annual or biennial schedule for all performance indicators

Database

Comprehensive statewide database exists at the SHEEO level, containing student records from four-year and two-year public institutions.

Budget

$200,000 per year to support biennial surveys

Regional Accreditation Association

Southern Association of Colleges and Schools

Regional Accreditation Association Relationship

Not evident.

Disciplinary Accreditation

Not evident.

Technology Focus

One of the expected outcomes for Comprehensive, Liberal Arts, and Baccalaureate institutions is the following: "Students will experience ready and convenient access to modern information technology resources: computers, local and wide-area networks with Internet access; and other electronic resources that support the instructional process."

Current Issues of Note

In April 1997, having studied various aspects of incentive funding for two years, the Board of Governors voted not to recommend an incentive funding plan.

 

Top

North Dakota

Contact

Dr. Michael Hillman, Vice Chancellor of Academic Affairs

North Dakota University System
State Capitol Building
600 East Boulevard
Bismarck, ND 58505
701-328-2965
FAX: 701-328-2961
NDUS_office@prairie.nodak.edu

 
State Agency Original Initiative Year
North Dakota University System 1996

Policy Analysis

The State Board of Education requires each public institution to assess student achievement and learning in light of each institution's mission statement. There have not been any other "major initiatives" taken at the state level in terms of assessment policy and/or student outcomes.

State Guidelines

Not evident

Programs/Positions

None evident

Indicators and Outcomes

None evident

Instruments

No

Teaching-Learning Element

None evident

Public Reporting

Cyclical

Database

Comprehensive statewide database exists at the SHEEO level, containing student records from four-year and two-year public institutions.

Budget

Not evident

Regional Accreditation Association

North Central Association of Colleges and Schools

Regional Accreditation Association Relationship

The State Board of Education policy requiring institutions to assess student achievement in light of institutional mission is interpreted, at a minimum, as the assessment process required by the regional accrediting association (NCHEMS, 1996).

Disciplinary Accreditation

Not evident

Technology Focus

Not evident

 

Top

Ohio

Contact

E. Garrison Walters, Vice Chancellor, Academic Affairs and Access
Linda Ogden, Communications Director

Ohio Board of Regents
30 East Broad St. 36th Floor
Columbus, OH 43266-0417
614-486-0885
FAX: 614-486-5866
gwalters@summit.bor.ohio.gov

 
State Agency Original Initiative Year
Ohio Board of Regents State Bill 140 1989

Policy Analysis

Policy Context, Type and Stage

In April 1996, for the first time, the Ohio Board of Regents (BOR) included a performance funding component in the annual appropriations for public two-year community and technical colleges and university regional campuses. This performance funding policy was designed to ensure that these institutions provide "educational programs and services identified as statewide priorities and expectations." (Regents Review, Spring 1996) These priorities and expectations are discussed in the State Guidelines section below. According to Regents Chancellor Elaine Hairston, "[P]erformance funding is intended to reward campuses for providing needed services and for the quality of those services...Every campus gains a better sense of its existing strengths and areas for improvement. Students and communities gain improved access to a range of high-quality educational programs and services. The public gains greater accountability in the use of state revenues." (Ibid.) This policy, then, has both a quality assurance and an accountability focus.

In its November 1996 report, The Challenge is Change: Ohio's Master Plan for Higher Education, the Ohio BOR addressed both assessment of student learning outcomes and performance funding. As part of Objective 2, "Improve the Quality of the Learning Experiences," assessing student learning outcomes is listed as one strategy to meet that objective. The BOR acknowledges the growing national awareness of the value of assessment. The Regents also recognize that "[M]any of Ohio's campuses already are engaged in student learning outcomes assessment and many more will be in the next few years as they strive to meet new review procedures for re-accreditation." (BOR Master Plan, 1996) In speculating about the specifics of an assessment approach, the BOR states that "[W]hile no single student outcomes assessment tool is suitable for all programs, the selection of a combination of appropriate evaluations by individual colleges and universities offers a rich opportunity to determine the quality of students' learning experiences..." (Ibid.)

As part of Objective 8, "Implement a Funding Model that Reflects Ohio's Goals for Higher Education," the creation and implementation of performance measures/benchmarks is given as a strategy to meet this objective. In discussing this strategy, the BOR summarizes its fiscal year 1996 performance funding policy for two-year colleges and university regional campuses. In fiscal year 1997, the BOR plans to implement four additional expectations for two-year institutions. Beginning in 1998, the Regents will turn their attention to a performance funding policy for four-year institutions. At present, performance indicators for universities are under consideration.

State Guidelines

Statewide priorities and expectations for two-year institutions include developmental education services, partnerships with industry and government for work force education and training, non-credit continuing education opportunities, and career/technical programming. For each of these performance areas, an institution "exceeds, meets, partially meets, or does not meet" the BOR expectations. It is not clear from the state documents exactly how the BOR makes this evaluation. The performance areas now under consideration for universities include the following: student access and diversity, effectiveness of the learning process, cost-effectiveness, contributions to knowledge, and service mission and its integration with teaching and research.

Programs/Positions

None evident.

Indicators and Outcomes

See State Guidelines above for priorities and expectations.

Instruments

No.

Teaching-Learning Element

Assessment of student learning outcomes is clearly an emphasis of this policy.

Public Reporting

Annual.

Database

Comprehensive statewide database exists at the SHEEO level, containing student records from four-year and two-year public institutions.

Budget

In 1996, the BOR allocated $1.5 million to two-year institutions as part of its performance funding policy.

Regional Accreditation Association

North Central Association of Colleges and Schools

Regional Accreditation Association Relationship

A connection between assessment of student learning outcomes and the assessment requirements of accreditation has been drawn by the BOR.

Disciplinary Accreditation

Not evident.

Technology Focus

Not evident.

 

Top


Oklahoma

Contact

Cynthia S. Ross, Executive Vice Chancellor, Academic Affairs

State Regents of Higher Education
500 Education Building, State Capitol Complex
Oklahoma City, OK 73105-4503
405-524-9151
FAX: 405-524-9235
cross@osrhe.edu

 
State Agency Original Initiative Year
Oklahoma State Regents for Higher Education Regents Policy 1991

Policy Analysis

In Oklahoma, the "responsibility for prescribing standards for admission, retention, and graduation applicable to each institution" in the state system is constitutionally vested in the Oklahoma State Regents for Higher Education (OSRHE). The Regents, responding to national trends in assessment, as well as the North Central Association's expectation that all institutions engage in assessment of student achievement, developed an official assessment policy in 1991.

The State Regents' policy clearly states its purpose: (1) to improve instruction through the systematic gathering, interpretation, and use of information about student learning/achievement; and (2) to provide public accountability. The policy has been revised twice since its implementation, in 1994 and again in 1996.

State Guidelines

All institutions are required to assess students at four levels: (1) entry-level, to determine academic preparation and course placement; (2) mid-level, to determine general education competencies; (3) exit-level (program outcomes), to evaluate the outcomes in a student's major; and (4) student satisfaction, to determine students' perceptions of their educational experiences. Assessment of graduate student achievement is optional.

Programs/Positions

In recognition of varying institutional missions and the clientele served, each campus develops its assessment program under the leadership of local faculty and administrators, provided that the procedures meet State Regents' requirements. Furthermore, each component of the assessment program should be coordinated to complement the whole.

Indicators and Outcomes

Vary by institution. Each institution is, however, required to report the following:
For entry-level assessment: number of students participating in entry-level assessment and number of students requiring additional basic skills development, explanatory summary of assessment results, and the methods (courses, tutoring, etc.) by which students were required to participate in the improvement of basic skills.

For mid-level assessment: number of students assessed, explanatory summary of assessment results, and plans for change.

For exit-level assessment: number of students participating, explanatory summary of assessment results, and plans for instructional changes.

For student satisfaction assessment: number of students assessed, explanatory summary of assessment results, and plans for change.

Instruments

For primary entry-level assessment, the ACT is used by all institutions. Secondary entry-level tests include ASSET, CPT, COMPASS, and the Nelson-Denny Reading Test. Institutionally developed tests are also used.

For mid-level assessment, the ACT-CAAP, BASE, CAT, and TABE are used most commonly.

For exit-level assessment, the MFAT, GRE, MCAT, GMAT, LSAT, Area Concentration Achievement Tests (ACAT), and ACT-COMP are used. Additional measures include exit interviews, senior projects, student portfolios, certification and licensing exams, capstone courses, and job placement.

For assessment of student satisfaction, nationally standardized surveys such as the ACT-SOS, the SSI, the ACT Alumni Survey, and the CSEQ are used. Institutionally developed instruments are also used.

Teaching-Learning Element

The intent of the policy is to improve instruction with knowledge about what, and how well, students are learning. See the Instruments section above.

Public Reporting

Annual.

Database

Comprehensive statewide database exists at the SHEEO level, containing student records from four-year and two-year public institutions. Also contains student records from some independent, non-profit and some proprietary schools.

Budget

Each institution is permitted to charge a fee for the purposes of conducting institutional and programmatic assessment. This fee can be no more than one dollar per credit hour.

Regional Accreditation Association

North Central Association of Colleges and Schools

Regional Accreditation Association Relationship

The Regents acknowledge the NCA's expectation that "all institutions are expected to assess the achievements of their students..."

Disciplinary Accreditation

Not evident.

Technology Focus

None evident.

 

Top

Oregon

Contact

Nancy P. Goldschmidt, Senior Policy Associate for Assessment and Planning

PO Box 3175
Eugene, OR 97403-1075
503-346-5791
FAX: 503-346-5764

 
State Agency Original Initiative Year
Oregon State System of Higher Education (OSSHE) Board Policy 1991

Policy Analysis

Policy Context and Type

Assessment in Oregon has been, and continues to be, done at the system level. There have been no initiatives related to assessment from the executive or legislative branches of government. The state university system's Academic Council adopted the Oregon Assessment Model (OAM) in 1993. This model called for assessment of student performance at "three critical transitions: admissions, midpoint, and graduation." According to a summary on assessment and accountability prepared by the Office of Academic Affairs for OSSHE, "A goal of the assessment model is quality assurance. Those who participate and invest in higher education should expect high quality." In 1994, each campus within the Oregon system was given incentive funds to help defray the costs of implementing the OAM. In 1995, an accountability report, based on institutional assessment activities, was made to the State Legislature. OSSHE acknowledges that its assessment policy is a "two-pronged approach. The emphasis at the system level is with accountability. Campuses assume responsibility for program level assessment and looking at individual student outcomes and implications for curricular revision." (Goldschmidt letter, 1/27/97)

Policy Stage

"The campuses are at various stages of implementing the Oregon Assessment Model." (Goldschmidt letter, 1/27/97).

State Guidelines

Each institution is required to submit a report biennially, providing "evidence about student performance at three critical transitions: admissions, midpoint, and graduation."

Programs/Positions

None evident; the Academic Council, which adopted the OAM, is a permanent policy advisory group consisting of the Vice Chancellor for Academic Affairs and provosts from all seven of the state's public colleges and universities.

Indicators and Outcomes

There are seven broad indicators of student success and achievement in the OAM: (1) general knowledge and abilities; (2) learning environment; (3) major field knowledge; (4) degree completion; (5) professional licensure in selected programs; (6) employment; and (7) customer satisfaction.

Instruments

The policy seeks measures of student performance; the means to this end vary by institution.

Teaching-Learning Element

Vary by institution.

Public Reporting

Biennial.

Database

A statewide database exists at the SHEEO level, but it is not comprehensive (i.e., it contains four-year institutional data only).

Budget

OSSHE has used small amounts of incentive funds ($200,000 a biennium) to encourage campuses to participate in collaborative assessment projects and to support implementation of the OAM.

Regional Accreditation Association

Northwest Association of Schools and Colleges

Regional Accreditation Association Relationship

Not evident.

Disciplinary Accreditation

Not evident.

Technology Focus

Not evident.

Current Issues of Note

"There is a growing interest on the part of the Board of Higher Education in performance indicators, largely in the areas of employment and national examinations (such as professional licensure). " (Goldschmidt letter, 1/27/97).

 

Top

Pennsylvania

No Initiatives at the state or system level.

 

Top

Rhode Island

Contact

Cynthia V.L. Ward, Associate Commissioner

Office of Higher Education
301 Promenade Street
Providence RI 02908
401-277-6560
FAX: 401-277-6111
cvlw@uriacc.uri.edu

 
State Agency Original Initiative Year
Rhode Island Office of Higher Education Board of Governors' Policy on Quality in Higher Education, Program and Institutional Review Processes 1986

Policy Analysis

Policy Context

In 1986, the Rhode Island Board of Governors (BoG) adopted guidelines for both program reviews and institutional quality reviews. Although additional guidelines were added in 1988 and the whole process was streamlined in 1990, the current policy remains very similar to the original.

Policy Type

The first half of the BoG policy calls for program review. Institutions may, if they choose, submit national accreditation assessment reports in lieu of BoG program review reports. It is not clear from the state documents what action, if any, is taken in response to the findings of the review reports.

The second half of the BoG policy calls for institutional quality reviews. Institutions are required to submit the findings of each ten-year regional accreditation report and any other special reports or reviews as part of these institutional quality reviews. In addition, each of the state's three public institutions must submit the following information about each of the state's quality indicators (see below for list): (1) summary data on the indicator; (2) significant changes in the indicator since the previous report; (3) any changes related to the indicator under consideration; (4) any major impediments to change related to the indicator; and (5) a description of the review process. The state uses this information as part of its ongoing monitoring of programs.

Policy Stage

The reports were intended to run on a seven-year cycle. However, the state has not kept to this cycle; instead, it reviews programs on an as-needed basis. Currently, small enrollments have prompted reviews, particularly at the institutional level.

State Guidelines

See Policy Type above.

Programs/Positions

None evident.

Indicators and Outcomes

The following are recommended quality indicators for the Institutional Quality Reviews: (1) background information on students--pre-matriculation measures such as placement tests and admissions measures such as yearly admissions profiles; (2) resources--support for libraries, financial aid analysis, and financial incentive programs to promote quality; (3) faculty--number of part-time faculty, support for professional development, research and other scholarly and creative activities, and efforts to promote faculty service; (4) special programs--descriptions of remedial programs, general education, academic advising; (5) outcomes--retention and graduation rates, outcomes assessment, follow-up on both graduates and non-graduated former students, student and alumni satisfaction; (6) other changes related to the improvement of academic policy.

Instruments

Not thought to be appropriate for three institutions, each with a unique identity: a community college, a liberal arts college, and a research university.

Teaching-Learning Element

One of the recommended quality indicators relates to how institutions conduct outcome assessments and evaluations, but the degree and nature of attention to teaching and learning are not clear. Special attention is paid to teacher education programs in this indicator.

Public Reporting

Voluntary.

Database

No multi-institutional databases exist.

Budget

Not evident.

Regional Accreditation Association

New England Association of Schools and Colleges

Regional Accreditation Association Relationship

The BoG policy allows institutions to substitute accrediting reports for program reviews, and requires institutions to submit accrediting reports as part of their larger institutional quality reports.

Disciplinary Accreditation

Done on a program-by-program basis.

Technology Focus

A recently added element requires all programs to describe their uses of technology.

 

Top

South Carolina

Contact

Alan Krech, Director of Planning, Assessment, and Communications

Commission on Higher Education
1333 Main Street, Suite 200
Columbia, SC 29201
803-737-2291
FAX: 803-737-2297
akrech@che400.state.sc.us

 
State Agency Original Initiative Year
South Carolina Commission on Higher Education Cutting Edge 1988

Policy Analysis

Policy Context

Assessment in South Carolina is a "play in three acts." The first Act, passed by the state legislature in 1988, is Act 629, which declared that "each institution of higher learning is responsible for maintaining a system to measure institutional effectiveness in accord with provisions, procedures, and requirements developed by the Commission on Higher Education (CHE)." (Section 59-104-650 (b) (1) of Act 629). The second Act, approved by the legislature in 1992, is Act 255, which required institutions to "submit an annual report to the Governor and to the General Assembly...presented in a readable format so as to easily compare [the state's public postsecondary institutions] with peer institutions in South Carolina and other Southern Regional Education Board (SREB) states." (Section 59-101-350 (A) of Act 255). The third and most recent assessment policy, Act 359, was adopted by the legislature in 1996. Act 359 "requires that state appropriations for public higher education be based on institutions' success in meeting thirty-seven performance indicators..." ("Dynamics," Summer 1996). These performance indicators are included in the legislation. Act 359 further mandates that this policy be implemented fully by 1999. It is important to note that Act 359 is in addition to, not in place of, Acts 629 and 255.

Policy Type

Act 629 was intended to "strengthen the quality of higher education and to produce a continuous cycle of improvement in public colleges and universities." (Guidelines for Institutional Effectiveness, 1989 and 1995.) This act requires institutions to provide information on 18 indicators, reduced to 17 in updated guidelines issued in 1995. This policy has a focus on quality assurance. Act 255 required institutions to provide data on a set of indicators established by the legislature: 11 indicators for four-year institutions and seven for two-year institutions. The purpose of this reporting, however, differed from Act 629 in that data gathered in compliance with Act 255 were used for comparative purposes, both with institutions in South Carolina and with institutions in the member states of the SREB. This comparative dimension makes this policy one of accountability. Act 359, calling as it does for a 100 percent performance funding system by July 1999, is a distributive policy.

Policy Stage

Acts 629 and 255 remain in effect, and institutions submit reports in compliance with these laws annually. Act 629 was updated in 1995. Act 359 mandated a total of 37 performance indicators. These indicators will be used, in combination with the current enrollment-driven formula, to determine a portion of new money for the 1997-98 appropriations. The state plans to move to a 100 percent performance-funding appropriations formula for fiscal year 1999-2000.

State Guidelines

See Policy context section above.

Programs/Positions

At the state level, a Coordinator of Planning and Assessment position was added in 1988. In July 1996, a director position was added to oversee the performance funding initiative.

Indicators and Outcomes

For Act 359, the indicators are:
1. expenditure of funds to achieve institutional mission
2. curricula offered to achieve mission
3. approval of a mission statement
4. adoption of a strategic plan to support the mission statement
5. attainment of goals of the strategic plan
6. academic and other credentials of faculty
7. performance review system for faculty, to include student and peer evaluations
8. post-tenure review for tenured faculty
9. compensation of faculty
10. availability of faculty to students outside the classroom
11. community and public service activities of faculty for which no extra compensation is paid
12. class sizes and student/teacher ratios
13. number of credit hours taught by faculty
14. ratio of full-time faculty to other full-time employees
15. accreditation of degree-granting programs
16. institutional emphasis on quality teacher education and reform
17. sharing and use of technology, programs, equipment, supplies, and subject matter experts within the institution, with other institutions, and with the business community
18. cooperation with private industry
19. percentage of administrative costs compared to academic costs
20. use of best management practices
21. elimination of unjustified duplication
22. amount of general overhead costs
23. SAT and ACT scores of students
24. high school class standing, GPAs, and activities of students
25. postsecondary non-academic achievements of students
26. priority on enrolling in-state residents
27. graduation rates
28. employment rates
29. employer feedback on graduates
30. scores of graduates on employment-related exams
31. number of graduates who continue their education
32. credit hours earned by graduates
33. transferability of credits to and from institutions
34. continuing education
35. accessibility for all citizens of the state
36. financial support for reform in teacher education
37. amount of public and private sector grants

Instruments

Professional licensure and certification examinations (such as the NTE and NBME)

Statewide surveys using a common set of survey questions

Some vary by institution

Teaching-Learning Element

Act 629 contains substantial consideration of these elements. Two of the categories of performance indicators in Act 359 address issues of instructional quality and graduates' achievements.

Public Reporting

Annual

Database

Comprehensive statewide database exists at the SHEEO level, containing student records from four-year and two-year public institutions.

Budget

It is not clear from state documents what percentage of the appropriations formula is determined by performance indicators for 1997 and 1998. The formula will be 100 percent indicator-driven by 1999.

Regional Accreditation Association

Southern Association of Colleges and Schools.

Regional Accreditation Association Relationship

Act 629 and SACS criteria reinforce one another.

Disciplinary Accreditation

Accreditation of degree-granting programs is one of the performance indicators.

Technology Focus

One performance indicator (Act 359) is the sharing and use of technology, programs, equipment, and expertise within institution, with other institutions, and with business community.

 

Top

South Dakota

Contact

Lesta V. Turchen, Senior Administrator

Board of Regents
207 East Capitol Avenue
Pierre, SD 57501-3159
605-773-3455
FAX: 605-773-5320
lestat@bor.state.sd.us

 
State Agency Original Initiative Year
South Dakota Board of Regents Assessment Policy 1984

Policy Analysis

Policy Context

Assessment in South Dakota began in 1984, when the Board of Regents (BOR) implemented a statewide program designed "to provide institutions, departments, and students with information necessary to adequately identify strengths and weaknesses in the general education curriculum as well as specific academic programs." (Assessment Committee report, 6/87) The primary source of this information was students' performance on standardized tests. In this same report, the Committee articulated some of the difficulties with the 1984 policy. These difficulties included: "high test costs; inadequate test score data; inappropriate tests; lack of student motivation; lack of faculty support." For these and other, similar reasons, the BOR decided to redesign the assessment policy. The new policy, adopted in 1987, encouraged individual institutions to use a three-tiered approach to assessment: assessment of content knowledge, assessment of the ability to process knowledge, and assessment of student attitudes and beliefs. In essence, each campus was permitted to create its own assessment program to implement this approach. The most recent BOR policy, adopted in 1992, gave campuses even more autonomy in terms of assessment practices. A 1996 planning document, Access to Quality, calls for assessment to receive a higher priority.

Policy Type

The 1992 BOR policy (2:11) has as its primary purpose "to enhance the quality and excellence of programs, learning, and teaching by providing important information on the effectiveness of academic programs. Campus assessment programs should also increase communication within and between departments related to departmental and institutional goals and objectives. It is also important that campus assessment programs enhance the public understanding of higher education and the diversity of institutional roles and missions." Thus, South Dakota's policy has quality assurance, as well as accountability, components.

Policy Stage

Institutions are to report to the BOR on their assessment programs at five-year intervals, beginning no later than 1995. As of 2/97, four of the six state institutions had submitted their assessment reports. Ongoing assessment activities are alluded to in the 1996 planning document.

State Guidelines

"Each university shall have in place a functioning assessment program which conforms to the accreditation requirements of the North Central Association and any specialty accreditations held by the university. At a minimum, each assessment program shall: (a) assess the general education component of the baccalaureate curriculum; (b) assess each of the specialty areas for which a baccalaureate degree is offered; and (c) consider the findings of the assessment program in the regular review of curriculum and related policies and procedures." (BOR policy 2:11).

Programs/Positions

A core curriculum committee was established on each campus to define institutional goals, determine how progress toward those goals can be measured, and make the necessary curricular changes based on those measures.

Indicators and Outcomes

Vary by institution.

Instruments

For process knowledge: ACT-COMP, Watson-Glaser Critical Thinking Appraisal, ETS Academic Profile.

For students' attitudes and beliefs: locally developed surveys, ACT survey series, NCHEMS series, CIRP, CSEQ.

Teaching-Learning Element

The purpose of the policy is the enhancement of learning, teaching, and programs, achieved by providing information on the effectiveness of academic programs. The means to this end vary by institution.

Public Reporting

Annual.

Database

A statewide database exists at the SHEEO level, but it is not comprehensive.

Budget

"Each campus is authorized to include in its university support fee, a fee ($.25/credit hour) to be used for the administration of the assessment program." (BOR policy 2:11; 6/92).

Regional Accreditation Association

North Central Association of Colleges and Schools

Regional Accreditation Association Relationship

Policy 2:11 links the state requirement closely to the NCA's accreditation requirements for outcomes assessment. The policy explicitly states, "Each university shall have in place a functioning assessment program which conforms to the accreditation requirements of the North Central Association and any specialty accreditations held by the university."

Disciplinary Accreditation

"Each university shall have in place a functioning assessment program which conforms to the accreditation requirements of the North Central Association and any specialty accreditations held by the university

Technology Focus

Varies by institution.

 

Top

Tennessee

Contact

Donald R. Goss, Director of Academic Programs

Linda Bradley-Long, Associate Executive Director of Academic Affairs

Tennessee Higher Education Commission
404 James Robertson Parkway
Parkway Towers #1900
Nashville, TN 37219
615-741-7565
FAX: 615-741-6230
lbradley@mail.state.tn.us

 
State Agency Original Initiative Year
Tennessee Higher Education Commission (THEC) Performance Funding 1979

Policy Analysis

Policy Context and Stage

Tennessee has one of the longest histories of assessment activity; the state adopted its first Performance Funding Program in 1979. According to Peter Ewell, this program "remains one of the most distinctive and most often cited approaches to state-based assessment." (Ewell, 1990.) The Performance Funding Program (PFP) was appealing because "it supported necessary appropriations and because it linked new dollars with a tangible return on investment." (Ibid.) In 1982, the program was further defined: a total of 2 percent of available funds would be "set-aside" and awarded to institutions as a reward for fulfilling the state's five performance indicators. In recent years, the amount of "set-aside" funds has totaled between $25 million and $30 million. Institutions may earn up to 5.45 percent of their operating budget each year.
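
As a rough sketch of the arithmetic described above (in Python, with hypothetical dollar figures; only the 2 percent set-aside and the 5.45 percent cap come from the state documents):

    # Two ceilings in the PFP: a 2 percent statewide set-aside, and a
    # per-institution award capped at 5.45 percent of operating budget.
    # All dollar figures below are hypothetical, for illustration only.
    statewide_funds = 1_400_000_000           # assumed total available funds
    set_aside_pool = 0.02 * statewide_funds   # the 2 percent "set-aside"

    operating_budget = 80_000_000             # one campus's budget (assumed)
    max_award = 0.0545 * operating_budget     # 5.45 percent cap

    print(f"Statewide set-aside pool: ${set_aside_pool:,.0f}")  # $28,000,000
    print(f"Maximum award for this campus: ${max_award:,.0f}")  # $4,360,000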

In 1984, the state assembly passed the Comprehensive Education Reform Act (CERA), which announced a new set of "Legislative Benchmarks." This law required the THEC to submit an annual report on all of these "benchmarks" for a five-year period. Among these "benchmarks" were students' scores on standardized tests, graduation and job placement rates, and results from licensing and certification examinations. As this five-year policy drew to a close in 1989, the state assembly issued the Tennessee Challenge 2000, which included some of the elements of the CERA. The Tennessee Challenge 2000 policy requires annual reports from all institutions on their progress toward meeting the established standards. This policy is currently in effect and has been amended frequently since its inception. The new set of standards (listed below) is in effect from 1992-93 through 1996-97. Thus, the policy is an ongoing cycle of implementation and evaluation.

Policy Type

As one of the national models of state-level assessment policies, the Tennessee Performance Funding system has a focus on quality assurance and reform, insofar as the reform will produce quality assurance. Each of the performance funding standards begins with a statement of purpose. Many of these statements explain that the standards are "designed to provide incentives to an institution for improvement in the quality of its undergraduate general education programs/master's degree programs..."

State Guidelines

1. Each of the ten standards established by the state (see below for list) shall apply to "all public universities, community colleges, and technical institutes in Tennessee."

2. "Each institution shall annually conduct the assessment activities required by the standards and report the results to its governing board and, through it, to the THEC."

3. "Reports are due to the governing boards by July 1 of each year and to the Commission by August 1."

4. "Data and other information will be submitted in formats provided by the Commission."

5. "Mid-year reports and requests are due to governing boards by December 1 of each year and to the Commission by January 1. Requests and petitions after that date may be considered, but only by exception."

6. "The Executive Director of the Commission may authorize modification of these standards...Final responsibility for implementation of these standards reside with THEC." (General Provisions for Performance Funding Standards).

Programs/Positions

Not evident from the state documents

Indicators and Outcomes

1. Performance of graduates on an approved standardized test of general education

2. Performance of graduates on approved examinations in major fields of study

3. Satisfaction of alumni and enrolled students

4. Program accreditation

5. Quality of non-accreditable undergraduate programs by external review

6. Quality of master's degree programs by external review

7. Level of minority enrollment and enrollment vis-à-vis mission-related goals

8. Graduation and retention rates

9. Institutional success in the strategic planning process

10. Improvement actions (correction of weaknesses identified through the PFP).

Instruments

For general education: ACT-COMP or College BASE

For major field education: tests and measures to be approved in cooperation through governing boards and institutions

Student and alumni surveys

Programmatic accreditation

External reviews of non-accreditable undergraduate programs and graduate programs

Teaching-Learning Element

Four of the PFP standards address teaching-learning elements: general education, major field education, non-accreditable undergraduate program education, and master's degree program education.

Public Reporting

Annual

Database

Comprehensive statewide database exists at the SHEEO level, containing student records from four-year and two-year public institutions.

Budget

$25 to $30 million each year is awarded through the PFP.

Regional Accreditation Association

Southern Association of Colleges and Schools

Regional Accreditation Association Relationship

Standard Four in the PFP calls for institutions "to achieve and maintain program accreditation."

Disciplinary Accreditation

See above.

Technology Focus

Nonexistent.

 

Top

Texas

Contact

William H. Sanford, Assistant Commissioner, Universities

Texas Higher Education Coordinating Board
PO Box 12788, Capitol Station
Austin, TX 78711
512-483-6200
FAX: 512-483-6127
sanfordbl@thecb.texas.gov

 
State Agency Original Initiative Year
Texas Higher Education Coordinating Board TASP 1989

Policy Analysis

Texas state policy features both testing and institutional effectiveness components. In terms of testing, in 1985, the Texas Higher Education Coordinating Board (HECB) named a Committee on Testing. This committee was charged with finding out “how many Texas students were entering college inadequately prepared for college-level work.” The Committee’s 1986 report, entitled A Generation of Failure: The Case for Testing and Remediation in Texas Higher Education, found that “an estimated 30 percent of the students entering Texas public higher education each year could not read, write, or compute at levels needed to perform effectively in higher education…” Based on this finding, the Committee made four primary recommendations: (1) that Texas adopt a diagnostic test of the reading, writing, and mathematics skills needed to perform effectively in college [this test is called the Texas Academic Skills Program (TASP) test]; (2) that the test be administered after admission decisions had been made, thereby avoiding a possible move to admit students according to their performance on the skills test; (3) that all institutions develop student advising programs and remedial programs to meet the needs of under-prepared students; and (4) that all institutions report annually to the Texas HECB on the effectiveness of remedial and advising programs. These recommendations became law during the 1987 Texas Legislative Session. Program changes in 1993 allowed students who had scored at a certain level on the SAT, ACT, or TAAS (Texas Assessment of Academic Skills) to be exempted from the TASP. Another change required students to take the TASP test by their 9th college hour instead of their 15th.

In terms of institutional effectiveness, the HECB appointed a Task Force on Institutional Effectiveness for community and technical colleges in 1993. Based on the recommendations of this task force, the HECB developed a review system that: “identifies institutional and programmatic strengths and areas of concern; verifies institutional outcomes and improvement efforts; identifies exemplary programs and innovative ideas; and reviews progress toward goals established by colleges in Annual Data Profiles, Carl D. Perkins annual and discretionary grant applications, and Office of Civil Rights compliance.”

Policy Type

The TASP testing policy is designed to ensure that “if skills deficiencies are identified, the student is required to participate in continuous remediation until he or she masters all sections of the examination.” As such, this is a regulatory policy. The Institutional Effectiveness policy has a threefold purpose: “continuous improvement of community and technical colleges in response to state and federal goals and mandates, including workforce education and training; accountability to the Texas Legislature, Governor, and U.S. Department of Education for public expenditures; demonstration of the quality and responsiveness of community and technical colleges in developing a well-educated citizenry and a highly trained workforce.” This policy is clearly one of quality assurance and accountability.

Policy Stage

The TASP testing policy has undergone evaluation and redesign. It is not clear from the state documents whether the Institutional Effectiveness policy for community and technical colleges has been implemented.

State Guidelines

For TASP, see Policy Analysis above. In addition to the TASP test, institutions may continue to administer “local” diagnostic examinations already in place to entering freshmen.

For the Institutional Effectiveness Policy, institutions go through three steps: (1) the Annual Data Profile, which “summarizes current performance data and annual progress toward meeting state-level goals and federal reporting requirements”; (2) the On-site Review, a three-day site visit by a team of community and technical college personnel and HECB staff during which the college is evaluated on mission, effective use of resources, access, achievement, and quality; and (3) Follow-up Reviews after three, six, and twelve months, during which HECB staff follow up on the implementation of recommendations made by on-site reviewers.

Programs/Positions

None Evident.

Indicators and Outcomes

As part of the Institutional Effectiveness policy, standards have been established to measure institutions’ success. Most of these standards are general and broad in scope. Those standards relating specifically to student outcomes are course completion; technical program completion; placement of students who complete the technical program; follow-up of technical student non-returners; and licensure pass rate.

Instruments

The TASP test; others vary by institution.

Teaching-Learning Element

One of the areas in which community and technical colleges are evaluated is achievement. Within this area, student outcomes are listed as one "success factor." Indicators of student outcomes are listed above.

Public Reporting

Annual for the TASP results.

Database

Comprehensive statewide database exists at the SHEEO level, containing student records from four-year and two-year public institutions.

Budget

Not evident.

Regional Accreditation Association

Southern Association of Colleges and Schools

Regional Accreditation Association Relationship

Not evident.

Disciplinary Accreditation

Not evident.

Technology Focus

One of the measures of success for "Quality of Programs" in the Institutional Effectiveness policy is "Integrating Academic and Technical Education."

 


Utah

Contact

Dr. Phyllis Safman, Assistant Commissioner for Academic Affairs

Utah System of Higher Education
State Board of Regents
355 West North Temple, 3 Triad Center, Suite 550
Salt Lake City, UT 84180-1205
psafman@utahsbr.edu

 
State Agency Original Initiative Year
State Board of Regents H.B. 37/Assessment Policy 1992

Policy Analysis

Policy Context

Following a national trend that "[L]egislatures around the country are asking that public institutions of higher education be accountable in much the same way other state agencies are accountable," the Utah Legislature passed House Bill 37 in 1992. This law required that the Utah System of Higher Education (USHE) "report biennially its own effectiveness in the areas of student assessment, faculty productivity, and program and facility measures." This report is called the "Assessment and Accountability Report." This law also mandated the creation of an Assessment and Accountability Committee, charged with the initial review of the report.

Policy Type

The mission of USHE is "to educate the students who attend its campuses and prepare them to become productive members of their communities." Given this mission, the Assessment and Accountability Report offers a review of "how well the institutions are fulfilling their mission." All nine of the colleges and universities that comprise USHE are required to submit data for this report. Since the Assessment and Accountability Reports are used primarily to track each institution's success in fulfilling its mission, Utah's assessment policy can best be described as a combination of accountability and quality assurance. Broadly speaking, "[T]he findings from the 1995 Assessment and Accountability Report will aid USHE institutions in improving the efficiency of their data collection and effectiveness in delivering programs."

Utah's assessment policy hinges on the biennial Assessment and Accountability Report. The report is reviewed first by the Assessment and Accountability Committee. The 18 members are drawn from all nine member institutions and the Utah business community. After its review, this committee recommends whether or not "additional data elements and outcome measures should be collected in order to strengthen future reports." In this respect, the committee seems to have as its primary aim the refinement of the assessment instruments, not the analysis and interpretation of the data provided by those instruments. For this reason, Utah's assessment policy deals more with accountability and quality assurance than regulation and reform. Following the Committee's review, the Assessment and Accountability Report is then made more widely available (State Board of Regents, member institutions, legislators, general public).

Policy Stage

In response to the 1992 legislation, the first Assessment and Accountability Report was issued in 1993, the second in 1995, and the third is due this year (1997). The Assessment and Accountability Committee had the opportunity to review the 1995 report and make numerous recommendations concerning improvement of the 1997 report. Thus, Utah has completed two cycles of policy implementation (1993 and 1995) and one cycle of evaluation (1995). The results of this evaluation presumably will be reflected in the 1997 report.

The primary intention of Utah's assessment legislation is to require USHE to track and report on the effectiveness and success of its member institutions. After two cycles of policy implementation and one cycle of evaluation, it seems that the primary functional result of this assessment policy has been the ongoing improvement of the assessment instruments and procedures. As another way to improve upon the assessments and procedures, a subcommittee of the Assessment and Accountability Committee has been created to review "the efforts of other states in this arena and to review relevant literature in an effort to determine other measures and means that could prove useful in evaluating higher education." Each of the nine member institutions of the USHE is required to submit the stipulated data for the biennial report. These institutions also provide the majority of the members of the Assessment and Accountability Committee.

State Guidelines

H.B. 37 mandated biennial Assessment and Accountability reports on student assessment, faculty productivity, and program and facility measures which are reviewed by the Assessment and Accountability Committee. The reports are produced at the state level. “The updated draft [of the report] includes specific tables which the institutions will use in order to provide standardized data to the Regents and the legislators. Some data will come from anecdotal information provided by each institution.” (Safman).

Programs/Positions

The Utah State Higher Education Assessment and Accountability Committee was formed to review the biennial reports. Participants on the Committee include faculty, academic and student affairs administrators, and institutional researchers representing all nine institutions, as well as two business persons from the state.

Indicators and Outcomes

"Student assessment includes institutional reassessment measures that identify students' academic strengths and deficiencies, student progress measures which tell how institutions follow each student's progress in meeting individual goals, and student outcomes which measure graduation rates, student satisfaction with their education, and the fit between student education and employment" (1995 A&A Report, p.2).

Teaching time, research/publication productivity, professional development, and institutional and public service are measured for faculty. Faculty and course evaluations and program review systems are in place at all nine institutions.

Instruments

Students

Student Reassessment Measures: all nine institutions require the ACT/SAT for admission; the community colleges use ASSET, COMPASS, and subject placement exams in writing and mathematics; and all students at the nine institutions take assessment tests to determine academic strengths and weaknesses in mathematics, reading, and writing skills.

Student Progress Measures: grade point averages and terms to completion.

Student Outcomes Measures: licensure examinations; graduation rates of cohorts; capstone courses (mentioned); student opinion surveys and exit interviews (all institutions); placement information for both employment and graduate/professional school; and alumni satisfaction surveys.

Faculty

Faculty and course evaluations and program review processes are used. “Faculty hours continue to be of interest. Rank of faculty associated with level of classes will also be provided.” (Safman)

Teaching-Learning Element

It appears as though emphasis is more on measuring progress as indicated by retention, graduation rates, and employment information than on teaching and learning.

Public Reporting

Biennial.

Database

Comprehensive statewide database exists at the SHEEO level, containing student records from four-year and two-year public institutions. The 1995 A&A Report includes in its list of recommendations "the need for uniformity and consistency in the collection of data from the institutions" and the more systematic use of data resources collected by the Commissioner's office. "The (Assessment and Accountability) Committee will also make recommendations on designing relevant data-gathering instruments with the goal of maximizing consistency in reported results" (1995 A&A Report, p. 25).

Budget

None.

Regional Accreditation Association

Northwest Association of Schools and Colleges.

Regional Accreditation Association Relationship

Sees the regional and professional/disciplinary accreditation process "as essential to maintaining quality" (1995 A&A Report, p. 22).

Disciplinary Accreditation

Acknowledges importance of national and professional accreditation boards evaluating programs to assure that quality measures are being met.

Technology Focus

In 1994, the Utah legislature appropriated $7.85 million in one-time funding for the Technology Initiative. Funding was targeted toward faculty development and assistance, classroom enhancement, course/curriculum development, library resource expansion, and equipment and infrastructure. Base funding for technology initiatives was added in 1995-1996.

 


Vermont

Contact

Jeanie W. Crosby, Director, Academic Services

Vermont State Colleges
PO Box 359
Waterbury, VT 05676
802-241-2533
FAX 802-241-3369
haigh@maze.vsc.edu

 
No initiatives at the state or system level.

 

 


Virginia

Contact

Donna Bradd, Acting Associate Director of Academic Affairs

State Council of Higher Education
101 North 14th Street, 9th Floor
Richmond, VA 23219
804-225-2137
FAX: 804-225-2604
bradd@schev.edu

 
State Agency Original Initiative Year
State Council of Higher Education for Virginia (SCHEV) Assessment Program 1986

Policy Analysis

Policy Context and Stage

In 1985, the State Senate directed the SCHEV to "investigate means by which student achievement may be measured to assure the citizens of Virginia the continuing high quality of higher education in the Commonwealth." In 1987, the SCHEV issued the results of its study, which articulated guidelines for student assessment in Virginia. In that same year, the state government appropriated funds for assessment activities for the 1988-1990 biennium. Each institution in the system developed its own assessment plan, and these plans were reviewed externally, approved for implementation, and summarized in the Virginia Plan. The first progress reports on assessment plans at the institutional level were submitted in 1988. In 1989, state law made assessment a permanent responsibility of the SCHEV, and assessment funding was made part of the institutions' base budgets. 1990 marked the first year of implementation for the assessment plans summarized in the Virginia Plan. Progress reports have been made biennially since then; 1996 begins the eighth year of assessment in the state. The state's policy is in a continual cycle of implementation and evaluation, as evidenced by the updating and revision of reporting guidelines after each biennium.

Policy Type

Guideline #10 states that "the purpose of assessment is not to compare institutions but to improve student learning and performance." This is a quality assurance policy. It was hoped that the state's approach to assessment "could meet the dual goals of program improvement and accountability. Of the two, however, improvement took priority, which influenced a number of decisions about the shape of the program" (Miller, 1995).

State Guidelines

1. Assessment of undergraduate student outcomes should be appropriate to the mission and goals of each institution and program.

2. Data collected for other reasons may be suitable for assessment purposes.

3. The effect of assessment procedures on students should be taken into account.

4. Students should be assessed at appropriate intervals during college, and data should be collected on alumni.

5. Each institution should identify minimal verbal and quantitative skills.

6. Each institution should describe its plans and means for measuring success of remediation programs.

7. Institutions should provide annual reports on all full-time, first-year students with diplomas from Virginia for distribution to school districts.

8. Similar reports should be compiled on community-college transfer students.

9. Each institution has the responsibility to evaluate its own assessment procedures.

10. The purpose of assessment is not to compare institutions but to improve student learning and performance.

Guidelines on the specific content and form of reports are issued each year.

Programs/Positions

The Virginia Assessment Group (VAG) was started in 1987 to discuss issues relating to assessment. The group holds annual conferences.

Indicators and Outcomes

Vary by institution and by program area.

Instruments

Vary by institution and by program area.

Teaching-Learning Element

State guidelines address the need for assessment of student outcomes in general education, as well as major field. The stated purpose of assessment is the improvement of student learning and performance.

Public Reporting

Biennial.

Database

Comprehensive statewide database exists at the SHEEO level, containing student records from four-year and two-year public institutions. Also contains student records from some independent nonprofit colleges and some proprietary schools.

Budget

"Funds averaging $12 per full-time student were granted to the institutions to implement assessment procedures." (NCHEMS, 2/96).

Regional Accreditation Association

Southern Association of Colleges and Schools

Regional Accreditation Association Relationship

Not evident.

Disciplinary Accreditation

Not evident.

Technology Focus

Technology is one of four areas of state focus. The degree of focus will vary by institution.

Issues of Note

The SCHEV has issued a report entitled Assessment in Virginia: Guidelines for the Second Decade. These guidelines provide a framework for future directions in assessment policies and practices.

 

 


Washington

Contact

Kathe Taylor, Senior Policy Associate

Higher Education Coordinating Board
917 Lakeridge Way
PO Box 43430
Olympia, WA 98504-3430
360-753-7800
FAX: 360-753-7808

 
State Agency Original Initiative Year
Higher Education Coordinating Board Assessment Policy 1989

Policy Analysis

Policy Context

In an effort to encourage assessment and evaluation, the Washington State Higher Education Coordinating Board (HECB) conceived a Master Plan in 1987 "to develop a multi-dimensional program of performance evaluation" for each of its member colleges and universities. This plan "envisioned assessment as a link between two separate but complementary goals: to improve the quality of undergraduate education and to provide needed information about student outcomes to the HECB." The plan was refined and operationalized in 1989 by the HECB in a resolution directing each member four-year institution and community college to follow an evaluation program. The resolution also called for the creation of a subcommittee of the HECB to "continue development of an effective performance evaluation program..."

Policy Type

The 1989 resolution stated the goals of this performance evaluation program differently than the 1987 master plan. According to the resolution, the evaluation program had two complementary goals: "(1) to provide a means for institutional self-evaluation and improvement, and (2) to meet the state's need for institutional accountability in order to assure quality in the state's higher education system." This wording reflects the focus in Washington's assessment policy on accountability and quality assurance. In its annual report for 1995, the HECB acknowledged that assessment policies can, and often do, have multiple functions. "This struggle--between assessment as improvement and assessment as accountability--is occurring nationwide and has been termed the 'contraries' or 'contradictions' of assessment (Angelo, 1995; Ewell, 1991)." Washington has made clear its preference for the former function: "[A]ssessment is at present more valuable as an aid to helping institutions continuously improve than as a tool for evaluating them as they existed one or two years ago."

Policy Stage

The first series of assessment reports from the member institutions covered the 1991-1993 biennium; the second series covered the 1993-1995 biennium. These second biennial assessment reports have been used to focus attention on "how students learn, how faculty/curricula/institutions help them learn, and what contributes to student learning." The state has documented specific examples of how assessment has already been, and can continue to be, an "aid to policy." Thus, Washington has completed two cycles of implementation and evaluation, and presumably will conclude its third biennium in 1997.

State Guidelines

Each four-year institution's and the community college system's performance evaluation program is expected to incorporate the following components: collection of entry-level baseline data; intermediate assessment of quantitative and writing skills, and other appropriate intermediate assessment as determined by the institution; end-of-program assessment; post-graduate assessment of the satisfaction of alumni and employers; and periodic program review.

Programs/Positions

In its 1989 resolution refining the performance evaluation program, the Board agreed to appoint a subcommittee to work with staff and institutional representatives to continue development of the program.

Indicators and Outcomes

Writing and quantitative skills and alumni and employer satisfaction are the mandated outcome arenas. The manner in which these are measured is left to the discretion of the community college system and the individual four-year public institutions.

Instruments

Upon the recommendation of the 1987 Master Plan, "Building a System", the Higher Education Coordinating Board piloted three nationally-normed tests (the College Outcome Measures Program, the Academic Profile, and the Collegiate Assessment of Academic Proficiency) and decided they were not the best tools to assess the quality of undergraduate education, in particular the targeted academic skills -- communication, computation, and critical thinking. Following this pilot effort, individual institutions were encouraged to develop their own assessment tools and tests.

Teaching-Learning Element

There is a clear commitment at the Board and institutional levels to assessment of the attainment of writing and quantitative skills. The 1995 Assessment Report notes means by which institutions have been using assessment to understand and improve the teaching/learning process.

Public Reporting

An annual report is expected from the six public four-year institutions and the State Board for Community and Technical Colleges. The State Higher Education Coordinating Board publishes its own annual assessment report, a compilation of findings from across the institutions, including an overall evaluation of the state of assessment.

Database

Statewide database exists but is not comprehensive.

Budget

State funding for assessment has been available since the 1989-91 biennium, when $400,000 was provided for assessment activities at each of the six four-year institutions and to the State Board for Community Colleges. In 1990, supplemental funds of $60,000 per institution were provided to the 27 community colleges. Total funding levels for public four-year institutions, community colleges, and technical institutions have remained relatively constant in each successive biennial budget. The Community Colleges and Technical System Governing Board has funding to coordinate assessment activities, while the Higher Education Coordinating Board does not.

Regional Accreditation Association

Northwest Association of Schools and Colleges

Regional Accreditation Association Relationship

Not evident.

Disciplinary Accreditation

Not evident.

Technology Focus

Not evident.

 


West Virginia

Contact

Bruce C. Flack, Director, Academic Affairs

State College and University Systems of West Virginia
1018 Kanawha Blvd. East, #700
Charleston, WV 25301-2827
304-558-0261
FAX 304-558-1646
flack@wvnscus.wvnet.edu

 
State Agency Original Initiative Year
State College and University Systems of West Virginia Assessment Policy 1989

Policy Analysis

Policy Context and Type

Systemic assessment efforts began in West Virginia in 1987, when the West Virginia Board of Regents created a Task Force on Assessment. This task force established general guidelines for institutional assessment activities, and West Virginia's institutions developed assessment plans in response. In 1989, West Virginia's Board of Regents split into two governing boards--one for the State University System and one for the State College System. Both of these boards passed resolutions in 1989 recommending that "[E]ach public college and university is urged to develop a five-year comprehensive assessment program which is compatible with its mission and educational objectives." Thus, assessment of student learning is done essentially at the institutional level. In 1992, legislation was passed requiring the system offices to issue institutional "report cards," which suggest an emphasis on accountability. Legislation passed in 1995 reaffirmed the report card requirement. Most recently, in 1996, the State University and State College Systems adopted plans based on eight principles consistent with the purpose of public higher education in the state; these principles inform the assessment of general education and academic programs. (The eight principles are listed below.) These plans, consisting of numerous initiatives, are currently being implemented, and the initiatives focus on quality assurance and reform.

State Guidelines

The most recent legislation, Senate Bill 547, which was passed in 1995, requires both of the governing boards to prepare institutional, as well as system-wide, report cards. The 1996 System Plans of the State University and State College Systems call for a variety of assessment activities designed to fulfill the eight principles derived from the purpose of higher education in the state. These eight principles are (1) preparing for life's work; (2) increasing educational opportunities and standards; (3) partnering for quality and efficiency; (4) measuring by results; (5) transforming education through technology; (6) rewarding strategic change; (7) supporting faculty and staff to drive strategic change; and (8) seeking additional resources through partnerships.

Programs/Positions

The 1987 Regents' Task Force on Assessment was reconfigured and renamed the West Virginia Higher Education Council on Assessment in 1989. In response to the recommendations of this Assessment Council, assessment committees were created on most campuses.

Indicators and Outcomes

Indicators vary across all of the initiatives for each of the principles. Examples include alumni surveys, basic skills testing, and nationally-standardized achievement tests.

Instruments

None evident.

Teaching-Learning Element

Not evident.

Public Reporting

Periodic.

Database

Comprehensive statewide database exists at the SHEEO level, containing student records from four-year and two-year public institutions.

Budget

According to NCHEMS (2/96), "West Virginia governing boards have allocated approximately $15,000 annually for statewide assessment programs and materials. However, the primary responsibility for funding assessment activity has been borne by the campuses."

Regional Accreditation Association

North Central Association of Colleges and Schools.

Regional Accreditation Association Relationship

Not evident.

Disciplinary Accreditation

Not evident.

Technology Focus

Principle Five in the System Plan directly addresses "transforming education through technology." The goal is to "[B]ecome a national leader in using technology to enhance access to learning and to improve the quality and cost-effectiveness of education."

 



Wisconsin

Contact

Kevin Boatright, Special Assistant to the Vice President for University Relations

University of Wisconsin-Madison
1846 Van Hise Hall
1220 Linden Dr.
Madison, WI 53706
608-263-2227
608-262-5739
kevin.boatrightunivrelvh@topnet.uwsa.edu

 
State Agency Original Initiative Year
The University of Wisconsin System Accountability Policy 1993

Policy Analysis

Policy Context and Type

Assessment became a major issue in Wisconsin in the summer of 1992, when the Governor's Commission on University of Wisconsin (UW) Compensation recommended that a task force be created to look into establishing accountability measures for the public university system. This Assessment Task Force, in 1993, announced its recommendation that the UW System adopt accountability measures, and that these measures fall into seven broad areas: (1) access; (2) quality; (3) effectiveness; (4) efficiency; (5) diversity; (6) stewardship of assets; and (7) contribution to compelling state needs. Further, the commission recommended that specific performance indicators be used to measure accountability. In the report of the Accountability Task Force, the objectives of this policy were clearly stated: (1) "To enable our stakeholders to know what we are achieving with our resources;" and (2) "To encourage continuous improvement in serving our many different kinds of clients, using appropriate feedback mechanisms." The Task Force also recommended that "there be consequences for failing to act to meet the accountability goals and rewards for special efforts which lead to success in meeting the goals." This gives the policy a distributive component.

In addition to the accountability policy, the UW System has an ongoing program of quantitative measurements, called the Academic Quality Program (AQP). The AQP includes "annual publications of the Statistical Profile and regular surveys of students and/or alumni, business and/or industry, the general Wisconsin public, and UW System faculty members." The AQP calls on the UW System to "continue the assessment of students' verbal and quantitative skills, refine the techniques and report annually on the use of assessment results in the improvement of teaching and learning" (Resolution 6215, adopted 9/11/92).

Policy Stage

It is not clear from the state documents how far along Wisconsin is in implementing its systemwide accountability measures.

State Guidelines

"Once a set of core indicators is established and baseline data are available for each of the indicators, the UW System Board of Regents should evaluate the data and set performance goals related to each indicator." (1993 Task Force Report) Results of accountability measures should be provided to the Governor and Legislature in "report card" form. Finally, the Regents "should periodically reconvene a public/private sector task force to review the progress made and recommend changes as appropriate." (1993 Task Force Report).

Programs/Positions

The Governor's Task Force on UW Accountability Measures was impaneled to make recommendations related to assessment, and issued its final report in 1992.

Indicators and Outcomes

Student, alumni, and employer surveys; faculty share of undergraduate instruction; research funding at doctoral institutions; sophomore competency tests; graduation rate; post-graduation experience; credits-to-degree; state funding for instruction-related activities; rates of admission and access for state high school graduates; hiring, retention, and tenure rates of women and minority faculty and staff; minority student enrollment and graduation rates; reporting and resolution of sexual harassment complaints; faculty retention and development; facilities maintenance; workplace safety; and continuing education/extension enrollment.

Instruments

See indicators/outcomes.

Teaching-Learning Element

Some of the indicators address teaching and learning elements. The Academic Quality Program (AQP) deals more directly with teaching and learning issues.

Public Reporting

Three-year cycle.

Database

Comprehensive statewide database exists at the SHEEO level, containing student records from four-year and two-year public institutions.

Budget

None evident.

Regional Accreditation Association

North Central Association of Colleges and Schools

Regional Accreditation Association Relationship

The AQP, in particular, was designed "with special emphasis on meeting the North Central Association's accreditation guidelines for assessment."

Disciplinary Accreditation

None evident.

Technology Focus

None evident.

 


Wyoming

Contact

Dr. Thomas Henry, Executive Director

Wyoming Community College Commission
2020 Carey Avenue, 8th Floor
Cheyenne, WY 82002
307-777-7763
FAX 307-777-6567

 

Receipt of information is pending.

© 2003, National Center for Postsecondary Improvement, headquartered at the Stanford Institute for Higher Education Research