Planning & Assessment South of the Border - 2003 Cancun Conference Presentation Abstracts

Developing Assessment Plans and Building Rubrics to Measure Student Performance in General Education and the Major. Presenter: Larry Kelley, Kelley Planning and Educational Services.
The presenter will give a brief but comprehensive overview of an innovative assessment process and describe examples of course-embedded assessment techniques and procedures, along with the development of assessment plans and rubrics. Participants will spend most of the session working in small groups, applying the model presented in the workshop to develop program assessment plans and related rubrics.

An Easy to Use Model for Effective Departmental Assessment and Planning. Presenter: Larry Kelley, Kelley Planning and Educational Services.
The presenter of this session will describe a comprehensive but easy-to-use model for the assessment and improvement of programs and services at the department level. Using this model, academic departments are required to develop annual assessment plans addressing two primary areas for each of their major programs: student learning outcomes (knowledge and skills in the major and general education knowledge and skills) and faculty outcomes (grant writing, scholarly productivity, and service). Departments are also required to create reports of the results of assessment work and to develop short- and long-range plans for improvement. Participants will develop related plans and reports.

Assessing the Impact of Address Clean-up and Incentives on Response Rates. Presenters: Danny Olsen, Director, Institutional Assessment and Analysis; Joseph Curtin, Assessment Consultant, Institutional Assessment and Analysis; and Steve Wygant, Assessment Consultant, Institutional Assessment and Analysis, Brigham Young University.
A dynamic culture of assessment relies upon data regarding student admissions, course enrollments, curricular pathways, and time to degree. Decision-making is further enhanced when data regarding faculty (scholarly productivity, workload statistics), costs (cost/SCH, cost/FTE student), and resource demands (e.g., space, administrative overhead, capital equipment) are available. This session will overview: (a) the steps in identifying critical information needed by decision-makers, from the president to department chairs; (b) best practices in selecting an IT platform; (c) procedures to engage the academic community; and (d) a phased implementation to ensure success. The session will demonstrate our decision support system (DSS), and attendees will receive copies of development tools (e.g., process maps and guidelines drawn from interviews with 20 IR directors).

But How Well Do These Findings Generalize? Investigating Non-Response Bias in a Mail-out Alumni Survey. Presenters: Danny Olsen, Director, Institutional Assessment and Analysis and Steve Wygant, Assessment Consultant, Institutional Assessment and Analysis, Brigham Young University.
Non-response bias poses a potentially serious problem for the valid interpretation and presentation of self-report survey data. In the current study, attitudes held by alumni who responded to a mail-out survey are compared with those of initial non-responders, who were reached via a follow-up telephone survey. Comparative analyses suggest that alumni who initially completed and returned the mail-out survey hold more favorable attitudes toward the institution than those who did not.
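By way of illustration, the following is a minimal sketch of this kind of responder/non-responder comparison, assuming attitude items are scored on a numeric (e.g., 1-5 Likert) scale; the data are hypothetical, and the test choice (Welch's t-test) is an assumption for the sketch, not necessarily the analysis used in the study.

```python
# Hypothetical non-response bias check: compare mean attitude ratings of
# mail-out responders with those of initial non-responders reached by
# telephone follow-up. All values are illustrative.
from scipy import stats

mail_responders = [4, 5, 4, 3, 5, 4, 4, 5]   # initial mail-out responders
phone_followup = [3, 3, 4, 2, 3, 4, 3, 2]    # initial non-responders

# Welch's two-sample t-test: a significant positive difference suggests
# mail responders hold more favorable attitudes (non-response bias).
t, p = stats.ttest_ind(mail_responders, phone_followup, equal_var=False)
print(f"mail mean={sum(mail_responders)/len(mail_responders):.2f}, "
      f"phone mean={sum(phone_followup)/len(phone_followup):.2f}, "
      f"t={t:.2f}, p={p:.3f}")
```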

Using Self-Efficacy Tools to Enhance the Freshman Experience in General Education Courses. Presenters: Patricia Esplin, Director, Freshman Academy and Joseph Curtin, Assessment Consultant, Institutional Assessment and Analysis, Brigham Young University.
Making the transition from high school to college is a significant challenge for students enrolling in higher education. Understanding the demands of a college course, coupled with students' assessment of their capabilities in meeting those demands, provides useful information to help students make this important transition. One key area, for example, is adjusting to the differences between high school classes and the large general education lecture sections encountered on university campuses. This session will discuss a self-efficacy scale used to help identify students who may struggle in this new environment. The scale is currently being administered to students participating in a Freshman Academy and taking general education courses in American Heritage, Psychology, and Biology.
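As a rough illustration of how such an instrument might be scored, the sketch below totals Likert-type items and flags low-scoring students; the items, scoring scheme, and cutoff are invented assumptions, not the instrument actually used at BYU.

```python
# Hypothetical self-efficacy scale scoring: sum each student's item
# responses and flag those below a cutoff for possible outreach.
import pandas as pd

# Each row: one student's responses (1 = low, 5 = high confidence in
# meeting course demands such as large lecture sections).
responses = pd.DataFrame(
    {"item1": [2, 5, 3], "item2": [1, 4, 3], "item3": [2, 5, 2]},
    index=["s001", "s002", "s003"],
)

scores = responses.sum(axis=1)   # total self-efficacy score per student
CUTOFF = 8                       # hypothetical flagging threshold
at_risk = scores[scores < CUTOFF]
print(at_risk)
```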

Fulfilling the Mission. Presenter: Linda Heiland, Director of Curriculum, Learning and Assessment Support Services, Central Arizona College.
In this age of accountability, assessment plays an increasingly important role in the development and successful implementation of programs that improve the academic offerings vital to institutional revitalization efforts. Steps toward the true integration of mission and functioning are achieved when the data collected from those departmental efforts form the basis for decision-making in the departmental and, ultimately, the institutional budgetary process. This presentation will demonstrate a successful integration of planning, assessment, and budgeting.

Linking Institutional Program Review to North Central Association's AQIP (Academic Quality Improvement Project) Criteria. Presenters: Roberta Bell, Director of Institutional Research and Linda Heiland, Director of Curriculum, Learning and Assessment, Central Arizona College.
This session provides an overview of Central Arizona College's (CAC) participation in the North Central Association's AQIP process and of how we have begun to integrate the AQIP criteria into academic and institutional program reviews. The presenters will review the AQIP criteria, CAC's three action projects, and the process used to collect baseline data for each project, relating that information to the academic program review process. The AQIP criteria, the basis for the three action projects at CAC, and samples of the academic program review packet will be shared.

From Dancing to Rainmaking: An Expanded Role for the IR Professional in the Strategic Planning Process. Presenter: John Kelley, Director, Office of Planning and Institutional Research, Villanova University.
Over the past decade, strategic planning has given way to a newer notion of strategic management. Focus has shifted from planning, as a conceptual process, to a more comprehensive system that places equal emphasis upon action, deployment, and implementation. In this session, the presenter will describe two best practices from Villanova University: the formation, purpose, and execution of (1) Goal Attainment Teams and (2) Self-Study Implementation Teams. Each has drawn attention from accrediting bodies and other universities. Both best practices underscore moving from plan to action. They also illustrate what many believe to be a wise approach to planning; namely, simultaneous top-down and bottom-up efforts. The presentation will include specific descriptions of each practice so that attendees can replicate these at their institutions if they so choose. In particular, the presentation will describe roles that institutional researchers can play, not only in supporting the planning process, but in promoting implementation as well.

Benchmarking Visits: A Good Investment? Presenter: John Kelley, Director, Office of Planning and Institutional Research, Villanova University.
Since the advent of TQM (Total Quality Management) and its euphemistic in-law, CQI (Continuous Quality Improvement), benchmarking has become a household word in many organizations. Indeed, the Encyclopedia of Quality offers a somewhat elliptical definition of the "Benchmarking Partner" as the institution "you work with to conduct a benchmarking study." This source quickly adds that the term also refers to two or more institutions "working together to compare similar functions to learn how both can improve their operations." Now we're getting to the substance of this session; namely, a description of a recent multi-day Benchmarking Visit conducted by Villanova's Office of Planning, Training and Institutional Research to the Office of Planning and Institutional Research of Georgetown University. The session will cover many bases: selecting a partner institution, preparing for the visit, distilling discussion questions, scheduling meetings, fostering the emergence of tangible outcomes (as well as intangible ones), and assessing such outcomes. In addition, both the real cash and opportunity costs and the benefits of a typical Benchmarking Visit will be shared. Finally, ample time will be devoted to dialogue, questions, and answers.

Assessing Alternative Pedagogies: Two Approaches to Comparing Problem-Based and Traditional Instruction. Presenters: M.B. Ulmer, Acting Dean, College of Arts and Sciences and Jonathan Trail, Director of Institutional Research, University of South Carolina Spartanburg.
For six years, the mathematics faculty at the University of South Carolina Spartanburg offered their College Mathematics course in both traditional and modified problem-based formats. Performance of students who progressed from each format into statistics was tracked by looking at success rates. Performance of students who progressed from each format into early childhood/elementary education mathematics courses was tracked by looking at grade point averages. The two approaches offer complementary views of the performance advantages of the modified format, called targeted problem-based learning.
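As a rough illustration, the sketch below runs the two complementary comparisons, success rates and grade point averages, on hypothetical data; the counts, grades, and test choices (Fisher's exact test and Welch's t-test) are assumptions for the sketch, not the analyses actually used in the study.

```python
# Hypothetical comparison of two course formats via two follow-on views.
from scipy import stats

# View 1: success rates in the follow-on statistics course,
# as (successes, enrollments) per College Mathematics format.
trad_success, trad_n = 52, 90    # traditional format
pbl_success, pbl_n = 61, 88      # targeted problem-based format
_, p_rate = stats.fisher_exact(
    [[pbl_success, pbl_n - pbl_success],
     [trad_success, trad_n - trad_success]]
)
print(f"success rates: PBL {pbl_success/pbl_n:.2f} vs "
      f"traditional {trad_success/trad_n:.2f} (p={p_rate:.3f})")

# View 2: mean grades in follow-on early childhood/elementary
# education mathematics courses (4-point scale).
trad_gpa = [2.0, 2.7, 3.0, 2.3, 3.3]
pbl_gpa = [2.7, 3.0, 3.3, 3.0, 3.7]
t, p_gpa = stats.ttest_ind(pbl_gpa, trad_gpa, equal_var=False)
print(f"GPA comparison: t={t:.2f}, p={p_gpa:.3f}")
```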

Expanding Institutional Research by Utilizing National Student Clearinghouse Data. Presenters: Danny Olsen, Director, Institutional Assessment and Analysis and Joseph Curtin, Assessment Consultant, Institutional Assessment and Analysis, Brigham Young University; and Jeff Tanner, Vice President, Higher Education Development, National Student Clearinghouse.
A dynamic culture of assessment relies upon data that are as complete and accurate as possible. This session will overview BYU's recent use of National Student Clearinghouse data to explore previously unobtainable information as well as enhance existing data specific to: (a) graduation rates, (b) degrees completed, and (c) concurrent enrollments. Illustrations will be shared of specialized analyses conducted using Clearinghouse data for varying groups of students (e.g., admission status (denied/admitted), athletes, gender).
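As a rough illustration of the group-level analyses mentioned, the sketch below computes graduation rates by admission status from a merged institutional/Clearinghouse extract; the column names and data are hypothetical assumptions, not the actual BYU or Clearinghouse schema.

```python
# Hypothetical group-wise analysis after a Clearinghouse record match.
import pandas as pd

students = pd.DataFrame({
    "student_id": [1, 2, 3, 4, 5, 6],
    "admission_status": ["admitted", "denied", "admitted",
                         "denied", "admitted", "denied"],
    # True if a degree record (here or elsewhere) was found for the
    # student in the match results.
    "degree_found": [True, False, True, True, False, True],
})

# Graduation rate by admission status -- e.g., did students denied
# admission complete degrees at other institutions?
rates = students.groupby("admission_status")["degree_found"].mean()
print(rates)
```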

Tracking Prospective, Current and Former Students for Institutional Effectiveness and Assessment. Presenter: Jeff Tanner, Vice President, Higher Education Development, National Student Clearinghouse.
Researchers will discover how to track their prospective, current, and former students for more accurate measurement and assessment of institutional performance using the EnrollmentSearch tool offered by the National Student Clearinghouse. Learn how to use this tool to track students such as those who decline offers of admission, transfer to other institutions, or graduate and continue their education at other institutions, and much more. The Clearinghouse database contains more than 53 million student records from more than 2,700 colleges and universities, representing about 91 percent of the U.S. student population.

Assessment of Knowledge Acquisition in Learning Through Substantive Experimentation or Functional Design Simulation. Presenter: Christopher Druzgalski, Professor of Biomedical/Electrical Engineering, California State University, Long Beach.
Current learning goals and assessment practices within various disciplines involve seminar-type learning and laboratory-based experiments, incorporating either actual components or simulated, computer-based design. Students in various fields of study are frequently exposed to powerful computer-based tools without sufficient training and guidance, owing to the complexity of these tools, which are suited primarily to industrial or other professional environments rather than academic ones. The overall aim of this project was to validate assessment strategies for verifying knowledge and skill acquisition in substantive experimentation using actual system components versus computer-based simulated design. Measures of content learning, examples of specific skills-learning tools, and knowledge acquisition in both learning environments will be discussed.

Failure Analysis of Medical Devices as an Instructional and Assessment Tool. Presenter: Christopher Druzgalski, Professor of Biomedical/Electrical Engineering, California State University, Long Beach.
The proliferation of technology across academic disciplines and the constantly increasing need for technological competency necessitate an expansion of students' learning experiences through technology-related tasks specific to a given academic discipline. Further, health care delivery is a multidisciplinary activity that progressively encompasses new technological solutions, including the placement of critical-care devices such as defibrillators in academic and other public-access facilities. Multiple aspects of medical technology are therefore being embedded in students' instructional experiences. This embedding also illustrates crucial concepts ranging from critical thinking, project planning, and documentation across different disciplines to the relevance of materials and of engineering/science-related design and analysis. Specifically, failure analysis of particular diagnostic and therapeutic devices for critical and noncritical applications provides a powerful informative and instructional tool. These concepts are considered in light of the Safe Medical Devices Act and subsequent amendments. Analysis of simulated failures, together with exposure to actual failures of medical devices and their causes, provides a formative assessment of knowledge-oriented outcomes.

Survival of the Fittest: The Retention Model. Presenter: Theresa Waterbury, Research Analyst Specialist/Assistant Assessment Coordinator, Winona State University.
Winona State University developed an Assessment Database to assist in the collection, management, and interpretation of student assessment data. This session will discuss how the Title III Assessment Project aided in the development of a logistic regression model to predict which students are at risk of leaving school. Specifically, one of the web-based student surveys proved particularly useful in developing the model by providing data on student background and family demographic variables.
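For readers unfamiliar with the technique, a minimal sketch of a logistic regression retention model of this general kind follows; the predictors, data, and use of scikit-learn are illustrative assumptions, not WSU's actual model.

```python
# Hypothetical retention model: predict probability of leaving school
# from survey-derived background and demographic variables.
import pandas as pd
from sklearn.linear_model import LogisticRegression

survey = pd.DataFrame({
    "first_generation": [1, 0, 1, 0, 1, 0, 1, 0],
    "hs_gpa":           [2.5, 3.8, 2.9, 3.6, 2.4, 3.9, 3.0, 3.5],
    "hours_employed":   [30, 5, 25, 10, 35, 0, 20, 8],
    "left_school":      [1, 0, 1, 0, 1, 0, 0, 0],  # 1 = did not return
})

X = survey[["first_generation", "hs_gpa", "hours_employed"]]
y = survey["left_school"]

model = LogisticRegression().fit(X, y)

# Predicted probability of leaving; high-probability students can be
# flagged for early intervention.
survey["risk"] = model.predict_proba(X)[:, 1]
print(survey[["risk"]])
```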

Using a Technological Infrastructure to Disseminate Assessment Information. Presenter: Theresa Waterbury, Research Analyst Specialist/Assistant Assessment Coordinator, Winona State University.
Creating a culture of assessment requires (among other things) widespread access to assessment data. This presentation will demonstrate Winona State University's integrated database and reporting tools, which allow faculty to access data in real time from their offices or from home. Funded by a US Department of Education Title III Grant, this project has had a tangible impact on WSU's assessment and accreditation efforts.

Assessment of Collaborative Work in Mexican Teacher Education. Presenters: Graciela Cortes-Camarillo and Gisela Leo-Peraza, Escuela Normal de Educacion Primaria "Rodolfo Menendez de la Pena".
Collaborative work between an elementary school and the college of education is a crucial factor in teacher education. However, few studies are available regarding assessment of the collaborative work between the cooperating teachers and the college professors. The aim of this paper is two-fold: to present how the collaborative work is carried out and to identify the problems that arise in assessing it. A case study design was used: data were collected by observing group sessions and by conducting individual and group interviews. Results reveal four problem areas: individual factors, culture shock, communication, and administrative conditions.

Making Sense of the Institutionalization of Assessment in a College of Education. Presenters: Larry McNeal, Professor, Educational Administration and Supervision; SueAnn Strom, Associate Professor and Program Coordinator, Higher Education; and Linda Hemminger, Associate Professor, Department of Teacher Education, University of Arkansas at Little Rock.
A study is underway at the University of Arkansas at Little Rock to analyze the institutionalization of the assessment process within the College of Education. The study draws on administrator- and faculty-generated data from individual and group interviews. The presentation is a critical analysis of their responses about where they stand in relation to the process of institutionalizing assessment. The focus is making sense of the institutionalization of assessment within a college of education as ascertained through the voices of those who must participate in and incorporate the change in their daily practices.

Using the Levels of Implementation Tool to Understand the Institutionalization of Assessment in an Organization. Presenters: Larry McNeal, Professor, Educational Administration and Supervision; SueAnn Strom, Associate Professor and Program Coordinator, Higher Education; and Linda Hemminger, Associate Professor, Department of Teacher Education, University of Arkansas at Little Rock.
This presentation is derived from a multifaceted study of the development and implementation of assessment in the College of Education at the University of Arkansas at Little Rock. The study focuses on the implementation aspect of assessment. The data for the study are generated from the Levels of Implementation tool, administered to administrators and faculty in the College of Education. The tool measures participants' responses to the implementation aspect of the change process. The importance of the study lies in the data it can provide about the assessment implementation process.

CMSU Report Card System. Presenter: Mike Grelle, Director of Institutional Effectiveness, Central Missouri State University.
Our Report Card is an electronically based audit of institutional quality. Each unit of the university, from the President down to a department or unit head, has defined indicators of quality in prescribed areas that are used to evaluate the effectiveness of the unit (or Vice Presidential area) in question. The prescribed areas are Academic Quality, Assessment, Budget/Resources, Productivity, Support Services, Students, Campus Climate and Culture, and Strategic Directions. Not all divisions or units provide data in each of the areas, as the categories may or may not be relevant to what they do; a unit may have six or seven quality indicators for a given category, or only one, or none at all. The unit is given a grade of Below Expectations, Meets Expectations, or Exceeds Expectations on the various measures it has defined, against goals (targets) that have been approved by the person to whom the unit reports. Most of the data for the Report Card are supplied by the Office of Institutional Effectiveness and are automatically updated as the data are collected. This process has not only improved the coordination of our assessment and planning efforts; it has also greatly increased our ability to communicate, both internally and externally, just what it is that we do well and what needs to be improved.
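As a hypothetical illustration of grading indicators against approved targets, the sketch below assigns the three Report Card grades using a simple tolerance band; the indicators, values, targets, and banding rule are invented for the example, not CMSU's actual grading logic.

```python
# Hypothetical Report Card grading: compare each indicator's measured
# value against its approved target with a tolerance band.
def grade(value, target, tolerance=0.05):
    """Within +/- tolerance of target -> Meets Expectations;
    above that band -> Exceeds; below it -> Below."""
    if value >= target * (1 + tolerance):
        return "Exceeds Expectations"
    if value >= target * (1 - tolerance):
        return "Meets Expectations"
    return "Below Expectations"

# One unit's indicators: (measured value, approved target).
indicators = {
    "fall-to-fall retention rate": (0.78, 0.75),
    "graduates employed in field": (0.60, 0.70),
    "student satisfaction (1-5)":  (4.1, 4.0),
}

for name, (value, target) in indicators.items():
    print(f"{name}: {grade(value, target)}")
```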

An Application of Alverno College's Assessment-As-Learning Approach at a Comprehensive Public Regional Institution. Presenter: Mike Grelle, Director of Institutional Effectiveness, Central Missouri State University.
In 1991, Central Missouri State University began implementation of a comprehensive performance-based assessment system founded on the principles of Alverno College's assessment-as-learning approach. To help educate the faculty and coordinate our assessment initiatives, we divided our assessment model into ten components linked to annual department planning, reporting, and resource allocation. The model has proven very effective at initiating and maintaining department assessment activity and at improving the coordination of program curricula. This session will involve participants in a discussion of how the assessment model and implementation strategy utilized at Central Missouri State University could be adapted at other institutions of higher learning.

A Course Charter Approach to Successful Curriculum Synthesis and Integration. Presenter: Robert Noyd, Professor of Biology, U.S. Air Force Academy.
The charter method is a way to create an integrated curriculum with a top-down, subject-centered design that builds from a set of broad performance-based educational goals in areas of knowledge, skills, and attitudes. Under each educational goal is a set of specific learning outcomes, which are distributed to one or more required courses. The set of outcomes assigned to a particular course, along with a level of emphasis for each, constitutes its charter. Thus, each required course has a specific role in the curriculum, which permits analysis of course content for making curricular decisions, facilitates communication among faculty, and allows instructor innovation.
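As a rough illustration of the bookkeeping the charter method implies, the sketch below distributes learning outcomes (each with a level of emphasis) to required courses and inverts the mapping to produce each course's charter; the outcomes, courses, and emphasis labels are invented assumptions.

```python
# Hypothetical charter bookkeeping: outcomes are distributed to required
# courses with a level of emphasis; inverting the mapping yields each
# course's charter. All names here are illustrative.
from collections import defaultdict

# Outcome -> list of (course, emphasis) assignments.
outcome_assignments = {
    "apply scientific reasoning": [("Bio 101", "major"), ("Chem 101", "minor")],
    "communicate technical results": [("Eng 211", "major")],
    "work effectively in teams": [("Bio 101", "minor"), ("Eng 211", "minor")],
}

# Invert to obtain each course's charter: the outcomes, with emphasis,
# for which that course is responsible.
charters = defaultdict(list)
for outcome, assignments in outcome_assignments.items():
    for course, emphasis in assignments:
        charters[course].append((outcome, emphasis))

for course, outcomes in sorted(charters.items()):
    print(course, "->", outcomes)
```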
