Melanie Green & Yukiko Watanabe (10/18/2018)
An Iterative Evolution of Practices through Evaluation
In a recently published book chapter (Envisioning Scholar-Practitioner Collaborations: Communities of Practice in Education and Sport, 2018), Tony Mirabelli (Assistant Director, ASC) and Kirsten Hextrum (GSE PhD student and Graduate Assistant in ASC at the time of publication) describe the Athletic Study Center's iterative evolution of its program evaluation and improvement efforts, from a rudimentary paper feedback survey in 2001 to a more outcomes-based program practice and evaluation in recent years. Both have sought out participation in CTL's assessment learning community programs and collaborations. We interviewed Tony about the book chapter and what motivated him to initiate and integrate program evaluation in the ASC.
What motivates you to lead and engage in assessment work?
When I first started working here, I just wanted to make sure the students were happy, and I inherited a satisfaction survey that didn’t show much. I was collecting data, but the assessment process wasn’t giving me what I wanted to know, and what I wanted to know was whether our students were achieving what we wanted them to achieve. I collected a large data set of close to a thousand student athletes, and what I discovered was that any student who spent more than 50 hours in tutoring earned no less than a 2.4 GPA. What’s compelling is that the high-achieving students aren’t using tutoring, but for the students at the bottom who do use it, there’s a dramatic increase in GPA. Students who engage with the academic community are more likely to pass. The more they are engaged, the more likely they are to achieve. That was a turning point with the data. I simply wanted to know if our tutorial services were making a difference in student achievement. It made me want to do better in assessment and evaluation efforts.
In the book chapter you recently published on proactive program improvement, you describe your multi-year assessment engagement since you joined the ASC in 2001. Can you briefly highlight the evolution of your assessment work (and what informed each stage), and what has come out of assessment results in recent years?
Kirsten Hextrum and I started collecting more and more data, and we began moving past just the scatter charts to collect smaller, more focused bits of data. Around that time, Kirsten published a paper about academically vulnerable student athletes, who were graduating at lower rates than the general student population. Everyone wanted to point fingers at student services and began asking what the Athletic Study Center was doing. I knew I needed to up my game in terms of assessment and evaluation, and things finally changed when I got involved with the Advising Council. Although I was already applying pedagogic theory to teaching and training, I began investigating theory and then looking at how theory applies to practice, making theory operational. Now I’m taking aspects of that work and refining them further by looking at critical thinking and how we measure it, in a discipline-specific way. Assessment and evaluation are an iterative process; they require establishing a sound mission, goals, and objectives, and using those as a system of checks and balances to make sure we have data points connected to each one.
What are some conditions to successfully engage in assessment?
Start with questions that are basic and simple: What do you do? What do you hope to achieve? How do you want to improve it? People underestimate the importance and value of a mission statement. Once you have a well-developed mission, everything after that becomes much easier; until you are clear about what you want to do, it is very difficult to take those next steps. Develop a sound mission statement, then decide how you want to fulfill it and what goals you want to establish for yourself to achieve it. The next step is creating objectives that break the goals down, and the final step is deciding how to measure all of that: putting together surveys and determining what kind of data you are going to collect.
What strategies help you to sustain assessment work?
What sustains me is my interest in theory. Most everything we do as humans is predicated on the theoretical assumptions we hold about the world around us, and this is very much the case in teaching and learning: the actions of an educator are predicated on an idea of how learning happens. I am constantly wanting to understand my theoretical biases and how they’re impacting student achievement.
What do you hope for going forward with your assessment efforts?
The big move forward is continuing to look at how critical thinking impacts achievement. And of course, I think my report could be useful in many ways, especially for fundraising, if I can get it into the hands of external stakeholders.
A few takeaways from the book chapter:
Mirabelli, T., & Hextrum, K. (2018). Proactive program improvement: Incorporating assessment into student-athlete academic support services. In D. Van Rheenen & J. M. DeOrnellas (Eds.), Envisioning scholar-practitioner collaborations: Communities of practice in education and sport (pp. 73-94). Charlotte, NC: Information Age Publishing.
The book chapter describes collaboration with other campus members to deepen assessment practices and discusses "why incorporating theoretical models of learning and a practice of regular program assessment are central to improving student experience in higher education" (p. 75).

They connected with campus resources to support their evaluation effort. [Tony and Kirsten] "pursued committee work that connected them with a learning consultant for the UC Berkeley Center for Teaching and Learning...[Their participation] in various professional development programs directly influenced the collaborative and practical work of designing and refining the assessment tools and improved learning outcomes for both student-athletes and peer tutors at this institution...Collectively, these collaborations offered opportunities for critical reflection between scholars, researchers, and practitioners" (p. 75).

The "assessments improved the unit's services and [offer] a template for how student support service units could incorporate such assessment tools into their programs" (p. 75).

The ASC's "outreach is further aided by a firm mission statement [and] clearly articulated program goals and objectives, all of [which are] front and center in our communication materials. In doing so, we have observed [increased] staff and student satisfaction with the ASC's tutorial program...By aligning our mission, goals, and objectives, we have been able to inform internal and external stakeholders about the benefits of our services" (p. 86).