Student Assessment Instruments

 

Inventory of Institutional Support for Student Assessment (ISSA)

Survey of Institutional Climate for Student Assessment (ICSA)





Inventory of Institutional Support for Student Assessment (ISSA)

This instrument, designed for the Phase II institutional survey, was developed using the framework and literature review from Phase I. The initial Phase I framework identified seven of the eight domains represented in this tool kit. Five were used in developing this survey: 1) external influences on, 2) institutional approaches to, 3) institution-wide support for, 4) academic management policies and practices for, and 5) uses and impacts of student assessment. The sixth domain, institutional context, was incorporated into the analysis of the Phase II institutional survey results using data from the 1995 Integrated Postsecondary Education Data System (IPEDS). The remaining two domains, 7) institutional culture and 8) integration with academic management and educational improvement, were developed in Phase III of the national study and are not part of this instrument.

The instrument contains 244 items organized into five major sections that parallel the conceptual domains. Table 1 gives an overview of these five sections (conceptual domains) and the subsections of the questionnaire and identifies the questionnaire items related to each. The instrument was sent to all 2,524 two-year and four-year non-profit institutions offering undergraduate education in the United States. We received completed surveys from 1,394 institutions, a 55 percent response rate.

The results were compiled into a database, and descriptive analyses and frequency distributions were run on all items. Data were then reduced through either factor or cluster analysis. In the factor analysis, items within sections were factored using an oblique rotation method. Items were included in a factor if they loaded most strongly on that factor, their loadings exceeded .40, and they made sense conceptually (see Institutional Support for Student Assessment: Methodology and Results of a National Survey for further details on how the items factored). Cluster analysis was used on sections of the questionnaire with yes/no responses. Several sections of the survey consisted of dichotomous or categorical variables, which did not lend themselves to factor analysis; in most of these instances, we reduced the data by summing scores within sections to create additive indices of the "yes" responses.
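The two data-reduction steps described above can be sketched in code. This is an illustrative outline only, with made-up loading values; the study's actual loadings and item assignments appear in the methodology report cited above.

```python
import numpy as np

# Hypothetical loadings for six items on two factors after an oblique
# rotation (illustrative numbers only -- not the study's actual results).
loadings = np.array([
    [0.72, 0.10],
    [0.65, 0.21],
    [0.15, 0.58],
    [0.08, 0.49],
    [0.44, 0.41],   # loads on both factors; assigned to the stronger one
    [0.20, 0.25],   # below the .40 cutoff on both factors, so dropped
])

def assign_items(loadings, cutoff=0.40):
    """Assign each item to the factor it loads on most strongly,
    keeping only items whose peak loading exceeds the cutoff."""
    assignments = {}
    for item, row in enumerate(loadings):
        factor = int(np.argmax(np.abs(row)))
        if abs(row[factor]) > cutoff:
            assignments[item] = factor
    return assignments

def additive_index(responses):
    """Sum of 'yes' (coded 1) responses across dichotomous items,
    computed per respondent."""
    return np.asarray(responses).sum(axis=1)

print(assign_items(loadings))                   # {0: 0, 1: 0, 2: 1, 3: 1, 4: 0}
print(additive_index([[1, 0, 1], [1, 1, 1]]))   # [2 3]
```

In practice the conceptual-sense check in the third criterion is a human judgment and cannot be automated; the code covers only the two numeric criteria.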

Univariate and multivariate analyses were run using the indices to examine the relationships among the four domains (external influences, approach, support, and management policies and practices) and their influence on the uses and impacts of student assessment. Finally, regression analysis was used to identify dimensions that lead to greater use of student assessment and to positive institutional impacts.
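A regression of this form can be illustrated with ordinary least squares: a "uses and impacts" index regressed on the four domain indices. The data and coefficients below are synthetic, chosen only to show the mechanics, not the study's findings.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200  # hypothetical number of responding institutions

# Synthetic domain indices: external influences, approach, support,
# and management policies/practices (one column each).
X = rng.normal(size=(n, 4))

# Assumed "true" effects used to generate an outcome index for the demo.
true_coefs = np.array([0.1, 0.3, 0.5, 0.4])
y = X @ true_coefs + 0.2 + rng.normal(scale=0.1, size=n)  # uses/impacts index

# OLS with an intercept column, solved by least squares.
X1 = np.column_stack([np.ones(n), X])
beta, *_ = np.linalg.lstsq(X1, y, rcond=None)

# beta[0] is the intercept; beta[1:] estimate the four domain effects
# and should land close to the assumed values above.
print(np.round(beta, 2))
```

The relative sizes of the fitted coefficients are what identify which domains are most strongly associated with greater use and positive impacts.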

As part of this tool kit, this instrument can be used to identify the influences on, approaches to, patterns of support for, academic management policies and practices for, and uses and impacts of student assessment at the user's institution. After completing the instrument, users can compare their institution's results with those of similar institutions by referring to Student Assessment by Differing Institutional Types. This comparison can help identify areas that may need improvement.

 

Survey of Institutional Climate for Student Assessment (ICSA)

This instrument, used in Phase III of the national study, draws on the Institutional Support for Student Assessment (ISSA, see above) research project undertaken in Phase II. Based on that national survey, seven institutions were identified that differed by type, control, and accrediting region; used several approaches to student assessment; had a wide array of activities promoting assessment; and actively used the data for academic decision-making. These institutions were contacted and asked to participate in the study. The seven were Iowa State, Western Washington University, Santa Fe Community College, South Seattle Community College, Wake Forest University, Northwest Missouri State University, and Mercyhurst College. Within each institution, a sample of two hundred tenure-track faculty members and all academic and student affairs administrators involved in student assessment were surveyed; the samples were smaller at institutions with fewer than 200 faculty. The overall response rate for faculty across the seven institutions was approximately 30 percent. Despite the seemingly low response rate, respondents were representative of faculty by rank, gender, and race at their institutions, so weights to correct for non-response bias were not calculated.

The survey instrument, titled Institutional Climate for Student Assessment (ICSA), was designed to assess respondents' perceptions of their institution's student assessment patterns and their own satisfaction with and involvement in student assessment efforts. It was structured to parallel the original Conceptual Framework for Organizational and Administrative Support for Student Assessment, which reflected the findings of the Institutional Support for Student Assessment institutional survey. The domains from the Conceptual Framework for Influences of Faculty Satisfaction with and Involvement in Student Assessment were used as topic headings, and a series of subheadings was then developed with questions to measure the multi-dimensional constructs under each topic heading. The instrument was intended for faculty, for academic and student affairs administrators, and for institutional researchers and assessment officers. The questions used Likert-type response sets, except for the demographic information collected at the end of each survey.

The total number of respondents to the survey was approximately 255, with up to 182 faculty and 73 administrators responding, depending on the question. Because the responses of faculty and administrators differed significantly, the analysis was restricted to faculty only; this was accomplished by including only those respondents whose primary appointment was faculty.

As part of this tool kit, this instrument can be used to identify faculty and administrators' perceptions of the institution's student assessment patterns and their satisfaction with and involvement in the institution's student assessment efforts.

Table 1. Dimensions of Institutional Support for Student Assessment

Dimension of Institutional Support                          Survey Question

External Influences on Student Assessment
  National Efforts                                          IIIC1a-b
  State Level Initiatives                                   IIIA1-5, IIIC1c, IIIC2c
  Regional Accreditation Associations                       IIIB1-3, IIIC2b
  Private Sector Support                                    IIIC1d
  Professional Association Support                          IIIC2a,d

Institutional Approach to Student Assessment
  Content                                                   IA1-14
  Timing                                                    IA1-14
  Methods                                                   IB1-10, IC1-9, ID1-4
  Assessment Studies                                        IE1-10, IF1-6

Institution-wide Strategy, Support, and Leadership for Student Assessment
  Institutional Support Strategy                            IIA1-2, IIB1-7
  Leadership and Governance Patterns                        IIC1-7, IID1-6, IIE1-9
  Evaluation of Student Assessment Processes                IIF1-2

Assessment Management Policies and Practices
  Budget / Resource Allocation                              IVA1-4
  Information System                                        IVB1-4
  Access                                                    IVC1-5
  Reports                                                   IVD1-6
  Student Involvement                                       IVE1-4
  Professional Development                                  IVF1-7
  Faculty Evaluation                                        IVG1-7
  Planning and Review                                       IVH1-4

Uses and Impacts of Student Assessment
  Education-related Decision-making                         VA1-5, 8-12
  Faculty Decision-making                                   VA6-7
  Internal Impacts                                          VB1-8
  External Impacts                                          VB9-15

Integration with Academic Management and
Educational Improvement                                     Not included in this survey

Institutional Culture for Student Assessment                Not included in this survey

Institutional Context                                       Data obtained through IPEDS


© 2003, National Center for Postsecondary Improvement, headquartered at the Stanford Institute for Higher Education Research.