Here the authors briefly outline several suggested methods for conducting program assessment.  (Ah, the nitty-gritty!)

I think the information I summarize below is helpful in and of itself.  In addition, though, I hope to use this basic outline of methods as a starting point for investigating published models (I mentioned finding two published articles on program assessment from WSU; I’ve also seen their assessment referred to in this chapter).

Some terms

  • direct measure = students being assessed produce writing/text, which is then evaluated
  • indirect measure = students are assessed without producing writing (multiple choice, fill-in-the-blank, etc.)
  • qualitative vs. quantitative methods = something the authors mention (“depending upon the degree to which they acknowledge contextual influences” [117]) but do not explain very clearly.  There is an interesting but short discussion of some of the tensions that can arise around assessment methods: higher-ups and others in the university often look for quantitative, cause-and-effect kinds of measures, while a WPA is better positioned to make qualitative kinds of assessments (it’s difficult to show definitively that any program produces clear improvements in student writing).  This is part of the theory behind assessment that I think I definitely need to investigate further.

Some advice from the authors

  • More in-depth information on the assessment methods they outline is available in research guides such as MacNealy’s Strategies for Empirical Research (1999).
  • It can be helpful to contact others on campus who have experience with empirical research, not only other faculty but also administrators in research-oriented offices.

Method 1: Surveys

  • Sometimes surveys already in use (e.g., teacher evals, first-year student surveys) can be employed; other times, it is helpful to design surveys specifically for the assessment
  • Consider modifying for audience (slightly different questions for students than for teachers)
  • Keeping surveys brief (10 questions max) helps improve the response rate
  • Anonymity also improves the response rate

Method 2: Interviews

  • Interviews are more flexible and can allow for more in-depth responses, since they present opportunities for dialogue and follow-up questions
  • Easing anxiety: The authors feel that a third party can be best for conducting interviews, as “[a] formal interview can often be intimidating–especially for contingent faculty” (120).  Aggregating results is also recommended.
  • Focus-group interviews can help save time (see the example in the appendices, p. 191)

Method 3: Teaching Materials

  • Administrators might collect one kind of document or several (combined into a course portfolio): syllabi, classroom activities, assignment sheets, miscellaneous handouts, course readings.  These can be analyzed on their own or together.
  • Portfolios: instructors should assemble all documents used during the period being assessed (the quarter, year, etc.), collect any observations and possibly some student samples, and also provide a reflection.  As the authors note about the reflection, “if it is to be useful, it should address questions or issues important to the assessment.  If the connection between course content and program learning outcomes is a concern, for example, then instructors can be asked to reflect on the ways that their course supported these outcomes” (121).
  • If examining student samples, administrators might consider systematizing how samples are selected (e.g., asking all instructors to provide samples from the students numbered 2, 4, 6, and 8 on their rosters).
  • Reducing the method’s obvious time intensiveness: random sampling can be used, or, even better, administrators might “identify one key issue to focus on during the analysis” (122).  For instance, to keep evaluators focused when reading course portfolios, the authors created a reading guide sheet that outlined the portfolio reading process to follow for each sample, along with a questionnaire of focused questions (see appendix, pp. 184-185).
  • Easing the portfolio process for instructors: informing instructors about course portfolios at the beginning of the term and encouraging them to compile materials as they go eases the time burden.  Anonymity can also ease fears that individual faculty will be evaluated.

Method 4: Student Writing Samples

Method 5: Teaching Observations