Title

Liberating Insight by Walking in Other People's Shoes

Document Type

Presentation

Presentation Date

Fall 11-7-2013

Conference Name

2013 Professional and Organizational Development

Conference Location

Pittsburgh, PA

Peer Review

Yes

Abstract

The researchers framed this program evaluation project as an investigation of the influences on the teaching practices of teaching center program participants and non-participants. The changes in teaching practices made by fifteen randomly chosen faculty, and the motivations for these changes, were studied. Session participants will develop and analyze brief case studies using abbreviated data sets and three of the methods used in the study. Through hands-on analysis of data, session participants will enhance their ability to evaluate the conclusions drawn by the researchers and become familiar with useful analytical frameworks that they can use in their own research.

The predominant activity in the program assessment of teaching centers remains the measurement of satisfaction (Hines, 2009; Chism & Szabo, 1997). Hines' 2009 study of the assessment practices of 20 faculty development programs in public and private universities confirmed that although the level of activity around assessment had increased since the previous comprehensive study in 1997, impact on teaching and on student learning remained, in effect, unmeasured. Given the close scrutiny public university budgets are currently receiving, teaching centers cannot continue to neglect rigorous evaluation of their contribution to the core academic mission. To measure the impact of a teaching center's grants, events, and consultation services, the researchers framed this evaluation project as a study of the influences on the teaching practices of both teaching center program participants and non-participants. Seeking to move beyond the comfort zone of lower levels of impact, such as satisfaction and knowledge acquisition (Chism, Holley & Harris, 2012), the researchers studied the changes in teaching made by fifteen randomly chosen faculty, and the motivations for those changes. An additional goal of the study was to contrast program non-participants with program participants in order to discover potential ideas for program improvement. Syllabus checklists, comparisons of grade distributions, course evaluation scores, program activity reports over a five-year period, curricula vitae, the results of an online survey, and interview data were examined. Student feedback data, input from colleagues, and departmental requirements motivated most of the changes that program participants made. An unusual finding was that program participants often did not mention the teaching center as a resource for or an influence on changes in their teaching practices or increases in scholarly teaching activity--changes that were clearly evident in interviews, in vitae, and on activity reports.

In this interactive session, we will share our methods and conclusions and discuss implications for future changes in our program. Working in teams of three, session participants will have the opportunity to develop and analyze brief case studies of some of our faculty participants using abbreviated data sets and three of the methods that we used in our study. Session teams will share their findings and questions, providing a catalyst for discussion of our methods, interpretations, and conclusions.

Through hands-on analysis of data, session participants will enhance their ability to critique the conclusions drawn by the researchers. They will also become familiar with useful analytical frameworks that they can use in their own research. By giving session participants the freedom to examine our data from their own perspectives, we move out of our comfort zone to create an opportunity for all to learn something new from our data and from our colleagues.

Chism, N., Holley, M., & Harris, C. J. (2012). Researching the impact of educational development: Basis for informed practice. In J. E. Groccia & L. Cruz (Eds.), To Improve the Academy, Vol. 31 (pp. 129-145). San Francisco: Jossey-Bass.

Chism, N., & Szabo, B. (1997). How faculty development programs evaluate their services. Journal of Staff, Program & Organizational Development, 15(2), 55-62.

Hines, S. (2009). Investigating faculty development program assessment practices: What’s being done and how can it be improved? Journal of Faculty Development, 23(3), 5-18.

Keywords

Scholarship of Teaching and Learning, assessment, higher education, faculty development

Disciplines

Educational Assessment, Evaluation, and Research | Higher Education | Higher Education and Teaching | Nursing | Other Education

Link to Original Published Item

PowerPoint of Presentation