
Institute for Faculty Development

Student Ratings of Teaching

 Quick links:


If you have questions about student evaluations at Stockton, please direct them to the Student Evaluation liaison for 2013-2014: Russ Manson.

1) NEW MOA that simplifies the old rules for IDEA and the small class form. One big change is that everyone can now choose to be evaluated online. Another big change is that if faculty do not choose paper forms or opt out of ratings by the deadline, their students will use online forms. Graphical data on response rates compares response rates at Stockton. Overall, online response rates for small classes have been low, but online response rates otherwise should be reasonably good. In addition, national research so far indicates that even where response rates are lower with online forms, the patterns of response tend not to change.

2) Choosing IDEA objectives, which provides links to IDEA resources that can help faculty select objectives.

3) Choosing CIP codes, which provides guidance and links to IDEA resources.

4) Using additional questions, which provides guidance and examples of additional questions that faculty might use to gather data about specific items, like progress in quantitative reasoning or aspects of lab, team-taught, or performance-based courses.

5) Interpreting IDEA results, which provides guidance and links to IDEA resources.

6) Understanding the small class form at Stockton

1) Student Evaluations of Teaching at Stockton

Stockton is currently using two different student evaluation forms as part of the evaluation of teaching:

1) The commercial IDEA diagnostic form, which allows faculty to receive formative and summative feedback about their pedagogy, links evaluation to course objectives, and provides comparative data to other courses at Stockton, other courses in the IDEA database, and other courses in the same discipline in the IDEA database.

2) The small class form, which provides qualitative feedback on similar issues as the IDEA form. The diagnostic form is used for classes with 15 or more enrolled students, and the small class form is used for classes with fewer than 15 enrolled students, in order to prevent faculty from receiving statistically unreliable quantitative data from such small samples. In addition, the qualitative form encourages students to write more comments. See sample faculty analysis of these forms in self-reflective statements for files.

2) Choosing IDEA objectives

IDEA recommends that faculty normally select 3-5 objectives as important or essential and rate the rest as of minor or no importance in a given class. In the IDEA Progress on Relevant Objectives scores on page one of the report, items of minor importance do not count at all, and items rated "essential" count twice as much as items rated "important" (a sketch of this weighting appears after the notes below). Keep the following in mind when choosing objectives:

Your program may have some suggestions or requests (e.g., the writing program and first-year seminars suggest objectives for W and first-year seminar classes, and science lab courses often have a supervisor who suggests objectives). Programs cannot force selections upon you, but program guidance can make selecting objectives easier.

It is harder for students to make progress if the class has many objectives. Also, research indicates that student ratings tend to decrease when larger numbers of  objectives are selected.
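
To make the weighting concrete, here is a sketch of the implied computation, assuming a simple weighted average (an illustration based on the description above, not IDEA's published formula; consult IDEA's Interpretive Guide for the exact method). If p_i is the class's average reported progress on objective i and w_i is that objective's weight, then

\[
\mathrm{PRO} \;=\; \frac{\sum_i w_i\, p_i}{\sum_i w_i},
\qquad
w_i =
\begin{cases}
2 & \text{objective rated essential} \\
1 & \text{objective rated important} \\
0 & \text{objective rated minor or of no importance}
\end{cases}
\]

For example, a class with one essential objective averaging 4.0 and two important objectives averaging 3.5 and 3.0 would score (2 × 4.0 + 1 × 3.5 + 1 × 3.0) / (2 + 1 + 1) = 3.625.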

IDEA provides multiple resources for assisting faculty in selecting objectives. See their excellent reports on Selecting Objectives and the Number of Objectives Chosen.  If you miss the deadline for submitting objectives, your objectives will all default to "important" and you might use this guide to help you interpret results.

3) Choosing CIP codes

Ideally, your CIP code matches your class as closely as possible. In most cases, a match has been selected for you. For new or recently reviewed classes, you or another faculty member selected a code. If your comparison code is reasonably good, do nothing. If you think it could be better, contact the Director of the IFD.

CIP codes have less impact on student evaluations than most faculty realize. They are used for the comparative data in the first row of the small, two-row chart at the bottom of page one of the summary report and in a column in the tables of comparative data on pages 2 and 4 of the summary report. Changing a bad CIP match will not change mean scores or the graph on page one of the summary report. Nonetheless, particularly in cases where a discipline or course is atypical, a good CIP comparison can provide a faculty member with more apt comparative data.

IDEA provides two lists of codes from which to select. One is comprehensive. The other, much shorter list includes the codes for which the IDEA company can provide comparative data at this time. If you choose a code from the longer list that is not on the shorter list, your summary report will state "No disciplinary comparisons available," and an "NA" will appear in all relevant places for disciplinary comparisons in the tables. You have to decide whether you'd prefer a comparison to a more general discipline (say, Psychology) with comparative data or a comparison to a more precise discipline (say, Experimental Psychology) with no comparative data.

4) Using additional questions

IDEA allows instructors to use additional questions. Some programs, like QUAD, want faculty to use additional questions to gather data for their program assessment. In other cases, faculty may use them to gather data about aspects not well represented in the general IDEA questions, such as a lab, fieldwork, team-taught, or performance course; quantitative learning; or textbooks and course materials. Faculty can develop their own questions but may also find the IDEA resource with suggested questions useful.

5) Interpreting IDEA results

Everything said below should be understood in the context that excellence in teaching at Stockton is measured in multiple ways, including student ratings, peer observations, review of syllabi and other course materials, and faculty teaching portfolios with discussion of teaching philosophy. In addition, many of the factors defined as excellent teaching in Stockton's policies are not rated by students.

Interpreting IDEA results well is a complicated matter. Guidance provided by an article in the Chronicle of Higher Education is practical, a quick read, and appropriate for interpreting any student ratings. However, here is the shortcut advice: to see if you or another faculty member is doing a reasonably good job, look at the data in the tables and graph on page 1 of the summary report. Use whichever number (raw or adjusted) is higher for the fairest comparison. Faculty who fall into the Similar, Higher, or Much Higher range are doing (in the perception of their students) a fine or excellent job. The data about the course and students on page 2 (the motivation of the enrolled students and their effort level, the level of challenge of the course and its learning objectives) can help put the course into more context. Interpreters should, of course, also be aware of other factors that might affect student ratings, like the race, ethnicity, or apparent religious affiliation of a faculty member, his or her age, apparent disability, gender or gender identity, or sexual orientation. On many of these, research is inconclusive, but at least some research (and in some cases, like gender, also data collected locally) indicates there may be bias.

If you want to be able to make or recommend changes to a course, then look at the data on pages 2 and 3, especially page 3. Areas that say "strength to retain" are areas of strength. Areas that say "consider increasing" are areas in which to prioritize implementing change. Areas that say "retain current use or consider increasing" are areas in which to implement change when those changes are easy, align strongly with a faculty member's teaching philosophy, or (when no areas say "consider increasing") are the weakest areas in the course. IDEA provides a handy report on pedagogy that can help guide faculty in making pedagogical revisions based on IDEA results. In addition, IDEA has a rich library of white papers on pedagogy that can help faculty think through changes they might make. The IFD also has a library of paper books, and links are available on this website. In addition, the Director of the IFD is available for confidential individual or group consultations.

 

More complex and valid interpretations of the data can be done when more factors are weighed, but the cognitive challenge and time required increase. For assistance, see the Director of the IFD, attend a workshop on interpreting IDEA, check out a DVD on interpreting IDEA from the IFD, or use some of the many rich resources on the IDEA website, including an overall Interpretive Guide with internal links to more resources, a report on extraneous variables and IDEA, a report on teaching general education courses, a report on teaching quantitative courses, a report on online evaluations and online courses, a report on disciplinary differences in IDEA ratings, and a report on the value and limitations of student ratings.

6) Understanding the small class form at Stockton

The Small Class Form is both harder and easier to interpret than the IDEA form. It is not quantitative, so interpreters cannot rely on statistical comparisons. Instead, they must read comments and identify themes, performing a small qualitative analysis. The small class forms are accompanied by a printed report that indicates which objectives the faculty member chose, the time and days of class meetings, and so on, which can aid in interpretation.

 

Evaluators should note that Sonia Gonsalves' analysis (Fall 2011) of Stockton IDEA data demonstrates statistically significant (with small effect sizes) correlations of student rating scores with level of course, race of the instructor (more effect in some schools than others), and School. These data align with published research indicating that instructors teaching lower-level courses may receive lower student ratings and that student ratings are higher in education and in arts and humanities and lower in the social sciences and in natural sciences and math. The analysis also indicates that being white can advantage faculty at Stockton, particularly in some Schools (e.g., SOBL). Being male gives less advantage than in local data analysis a few years ago, but it still has some effect in particular schools, where males are more likely to receive higher excellent-teacher and excellent-course ratings while men and women are likely to receive similar Progress on Relevant Objectives ratings.