Recently in Assessment and Measurement Category
Recently, I attended an excellent NASPA webinar entitled "Powerful Data: The Benefits of Direct Assessment in Student Affairs." Presenters Nathan Lindsay, Aimee Hourigan, and Jenn Smist did a great job of providing concrete examples that take some of the intimidation factor out of direct assessment.
Direct assessment is based on analysis of student behavior or artifacts (tests, papers, etc.) that demonstrate students' skills and abilities. Indirect assessment is based on reported perceptions of students' skills and abilities. So, for example, a Pulse survey that asks students to rate their ability to communicate effectively is an indirect assessment, and an evaluator's score of a student's presentation is a direct assessment.
But only faculty can do direct assessment, right? Wrong! Admittedly, it is easier to do direct assessment when you can require students to do something in order to pass a class, but it isn't impossible outside of the classroom. Let's start with the low-hanging fruit - those Student Affairs programs in which students must already meet certain requirements in order to participate. Resident assistants, peer educators, student employees, and student organization leaders, for example, all have to fulfill certain obligations to hold their positions, and direct assessment can be incorporated into those obligations. Peer educators, for instance, need certain knowledge to do what they do effectively. A simple knowledge-based quiz administered before and after their training can provide direct evidence of knowledge gained. You can even go the extra mile and give them the quiz again three months later to provide direct evidence of knowledge retention over time.
If you want to do direct assessment of an educational program or some other activity where you want to minimize student "work" in order to maximize student participation, you have other options. For example, students are frequently asked to provide feedback about events via a short survey. It is possible to insert one or two questions (closed- or open-ended) that ask students to demonstrate what they learned or how a program affected their perspective. I believe it was Aimee who talked about this, and when I inquired, she indicated that in her experience students were willing to complete these questions as long as they could be answered quickly and concisely.
Okay, but what if I want to assess higher-level skills, such as leadership ability among student organization leaders? Well, first you need to define the critical components of leadership ability. That's no small job, but you have some options. You might develop a rubric that could be used for a 360-degree evaluation of a student leader: members of the student organization, the organization's adviser, and other key people who interact with the leader might be asked to provide feedback using the rubric. Another option would be to require your student leaders to participate in reflective writing - probably not possible for every club president at Penn State, but perhaps with a few key leaders or with a group of RAs or peer educators you could make this work. For students who maintain an online portfolio, reflective writing about their experiences and learning can provide artifacts.
These are just a few ideas to get you thinking. What I took away from this session is that the key to direct assessment in Student Affairs is to focus in very tightly on critical or essential knowledge. This will help keep the task manageable.
I recently read an article in the Journal of Higher Education published in 2009 titled "Engaging with Difference Matters: Longitudinal College Outcomes of 25 Co-Curricular Service-Learning Programs." Among the findings that Keen and Hall reported was evidence that reflective writing guided by staff input added value to student learning during a formal service learning experience.
The program studied is one sponsored by the Bonner Foundation, and the study measured the impact of a service learning program in a cocurricular setting. At Penn State, a service learning task force is likely to soon define 'service learning' as only possible within the context of the curriculum in a formal, classroom setting, but this study shows that a well-structured co-curricular setting can be effective as well. The article notes that service learning is often defined as having a curricular setting:
The National Service-Learning Clearinghouse (2005) defined service-learning as a "...teaching and learning strategy that integrates meaningful community service with instruction and reflection to enrich the learning experience, teach civic responsibility, and strengthen communities."
But the authors go on to point out the proven value of constructing service learning experiences in out-of-class settings as well:
A frequent tendency in the field is to use the phrase service-learning and assume the reference is to academic service-learning based in coursework. Giles and Eyler's (1999) seminal study of programs that linked academic study with service acknowledged the value of co-curricular learning and, in defining service-learning, also mentioned "non-course-based programs that include a reflective component and learning goals" (p. 5).
In case you were one of the few who missed it, or who couldn't squeeze into the packed room, Tuesday's Assessment Brown Bag "Writing Learning Outcomes" was well-attended by a very enthusiastic audience. Dr. Charles Brua, of the Schreyer Institute for Teaching Excellence, provided some very useful tips for writing learning outcomes and worked with participants on an activity designed to put those tips to good use.
Thirty-three people attended the session, including representatives from almost every unit in Student Affairs and CSA students. Of the 24 people who completed the session evaluation, 96% indicated that the session provided information that they would use in their work! One person wrote, "Thank you for this! We were able to think critically about our unit and some developmental opportunities."
Dr. Brua's PowerPoint presentation and handouts are available for download on SARA's Assessment Resources page at http://studentaffairs.psu.edu/assessment/resources.shtml. Upcoming brown bags in this assessment series are:
- To Survey or Not to Survey?
Betty Harper, Student Affairs Research and Assessment
Thursday, February 10th, Noon-1pm, 106 HUB
Glenn Johnson, John A. Dutton e-Education Institute
Thursday, March 3rd, Noon-1pm, 106 HUB
- Reflective Writing
Barry Bram and Darcy Rameker, Union and Student Activities
Tuesday, April 5th, Noon-1pm, 106 HUB
Please email email@example.com to register for these events.
When I came across an Inside Higher Ed article entitled "Turning Surveys into Reforms," I was instantly intrigued. I thought the article would be a case study that demonstrated at least one way in which this might be done. But alas, this article simply focuses on the need to take action based on survey findings - particularly NSSE findings. Despite this disappointment, it is still a timely article as PSU gears up for another NSSE administration in 2011. Just what have we done with the 2008 data?
To sign up for a Twitter Account, go to http://twitter.com.
Last week I was asked to present to the Cocurricular Learning Group on "assessment in student affairs" at Penn State. Certainly no small assignment for a newbie, but thinking about what I would like to say to this particular group on this topic gave me a chance to reflect on what I've observed in my five short months as the director of Student Affairs Research and Assessment.
So, let's start with the data. It is clear that Student Affairs staff members are providing a diverse and growing number of programs for Penn State students, and we are spending more time assessing these programs. In the first half of 2009-10, Student Affairs offered approximately the same number of programs and conducted almost twice as many assessments as it did in the entire 2005-06 academic year (data from the Educational Programming Record). Assessment data being collected include participation numbers, survey data, formal and informal feedback, and document analysis (student journals, reflections, etc.).
And yet, despite this flurry of assessment activity, we still have a very hard time "telling our story" to the university community. It appears to me that part of the problem is that the data doesn't make it up the pipeline. While staff members may use the data to improve their programs, the data is not being aggregated or disseminated at a level that allows us to promote our positive impacts on student learning and development.
Many units are making great strides in developing their own unit-level outcomes and assessing their activities based on these outcomes. My hope is that as SARA moves forward in the coming years, we can increase the educational opportunities we provide to Student Affairs staff. In doing so, we can build our assessment capacity in the units, allowing SARA to play a greater role in integrating data from across the Division. Then we can shout our successes from the rooftops and back them up with data.