We’ve covered a wide range of perspectives on #SAcompetencies this month, but there’s one category left: Assessment, Evaluation, and Research (AER). Today, I want to show you how you might evolve your AER competency by sharing how I developed mine.
TERMS and CONCEPTS
Graduate programs give us fantastic foundational knowledge around assessment vocabulary and evaluation strategies, and of course an opportunity to practice primary and secondary research. But there’s no doubt in my mind that gaining advanced competency in AER takes good old-fashioned experience. This is why I intentionally sought to improve my understanding of AER through mentorship. When I was a mid-level manager, I was fortunate to learn a great deal from a trifecta of experts – an exceptional dean of institutional research, a reputable vice president of academic affairs, and of course my always steady supervisor, the dean of student development. All three women had a passion for assessment. Our many conversations helped me become fluent in the language and concepts of AER.
VALUES, ETHICS, and POLITICS
An institution that deeply values AER creates a systematic culture of evidence across the organization. A student affairs professional who likewise embodies those values embeds assessment into student services and strives for continuous quality improvement. I learned to value AER more deeply when I first became responsible for departmental budgeting and strategic planning.
I quickly learned that the politics of fiscal allocations would require an ability to demonstrate that measurable outcomes were being met with the resources already in play. Student learning outcomes were no longer applicable solely to the classroom. I learned and adapted a lot from my colleagues on the academic side. Additionally, I gained a greater sensitivity to the personal nature of data. I learned the imperative of communicating respect for this sensitivity to potential participants up front, whether through a human subjects clause or a transparent explanation of why the data was being collected.
With the invaluable guidance of the institutional research folks, I learned to design AER efforts by starting at the end. In other words, I was challenged to first answer “What is it you want to know?” before I could effectively create a means of measurement. Collecting the right data, I learned, was just as important as what you do with the data after it’s collected. And for AER to be sustainable, I learned that a systematic approach is critical. My involvement with strategic planning efforts, both as a mid-level manager and now as a chief student affairs officer, has helped me gain intermediate and advanced skills in AER design. The most important lesson learned is that AER cannot be successfully accomplished in a vacuum.
METHODOLOGY, DATA COLLECTION, and DATA ANALYSIS
Breaking down silos and working across college departments ensures data collected on one side of the house has the opportunity to inform decision making on the other side. A collaborative team approach builds consensus when it comes time to put AER results and analysis into action at the institutional level. One of the biggest tipping points for my AER competency development was to stop thinking of data collection simply as isolated surveys. Instead, I began to track behaviors and connect dots across data points. Through later analysis, this data could provide insight into student learning and engagement across the institution. After all, tracking is trending in the digital age. It’s all about big data! So I rediscovered the benefits of technology for data collection, and I accepted that it will undoubtedly continue to integrate into our society and higher ed.
INTERPRETING, REPORTING, and USING RESULTS
What good is collecting data points if you don’t make meaningful strategic plans based on them? My mentors hammered the phrase “closing the loop” into my head. This premise is at the core of institutional effectiveness. Interpreting AER results, it also turns out, requires time and energy up front to collect a baseline from which to measure and later compare trends. In other words, assessment itself requires strategy. I am still fine-tuning my ability to evaluate multifaceted data. I’ve become systematic at documenting tactics and strategies that match success criteria for measurable outcomes.
Ultimately, we all need to learn to make data-driven decisions to improve our impact on student learning and retention. Quick throwback – I recommend you take a look at our #SAassess series archive from November 2015 for great advice on how to do assessment in student affairs.
This post is part of our #SAcompetencies series for February. Ever wish you knew then what you know now? #SApros pay it forward to #SAgrads looking for advice on soft skills and professional competencies before they job search this spring! For more info, please see Kim Irland’s intro post. Be sure to check out the other posts in this series too!