At a recent conference, I was approached by a campus colleague about how we seem to focus our research on the same issues time and time again. He wondered why the issues we end up addressing on campus each year, like homesickness and social connections, don’t seem to change that often. After mulling over the topic further, and hearing similar comments from others, I decided to take some time to study our Skyfactor data to see what I could find on our student issues and interventions.
To explore why we keep addressing the same topics in both research and daily practice on campus, we calculated the mean score for each survey factor across all first-year students from every Mapworks Fall Transition survey dating back to 2010. The result is a remarkable level of stability in factor scores across multiple years and multiple first-year cohorts. Sure, there are some spikes and dips here and there. But all things considered, first-year students' self-evaluations of their skills, interactions, behaviors, and commitment are strikingly consistent over time, especially considering the sheer number of students surveyed year over year (in the hundreds of thousands, if you were wondering).
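If you want to run the same kind of trend check on your own campus data, here is a minimal sketch in pandas. It assumes a hypothetical flat export of survey responses; the file name, column names, and scale are illustrative stand-ins, not the actual Mapworks schema.

```python
import pandas as pd

# Hypothetical export: one row per student per factor, with columns
# cohort_year, factor, and score (a 1-7 Likert-style composite).
responses = pd.read_csv("fall_transition_responses.csv")

# Mean factor score for each first-year cohort
trend = (
    responses
    .groupby(["factor", "cohort_year"])["score"]
    .mean()
    .unstack("cohort_year")  # one row per factor, one column per year
    .round(2)
)

# Stability check: a small spread across years means the factor barely moves
trend["sd_across_years"] = trend.std(axis=1).round(3)
print(trend.sort_values("sd_across_years"))
```

Sorting by the across-year standard deviation puts the most stable factors at the top, which is a quick way to see the "same issues every year" pattern in your own numbers.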
So, to my colleagues who have commented that it seems like we are all addressing the same issues each and every year: that's because you likely are. And that's not a bad thing. The data we have on first-year students points to a logical explanation for this pattern: our first-year students are walking in the door with the same issues each and every year. Each year, we are going to have students who are homesick. We are going to have students who struggle with basic academic behaviors like showing up to class. And we are going to have students who come to college and struggle to make connections.
Given this reality, it's easy to fall into a repeatable pattern: focusing on the same topics at the same time of year. There is a benefit to this, though: predictability. As you amass longitudinal assessment data on your students and campus programs, you can come into each academic year with a game plan that has evolved from a history of addressing certain issues at certain times. For instance, your campus may do a big push to get students involved at the beginning of the semester. It could be an outreach program to students who will have midterm deficiencies. Or it could be an early-spring outreach to students who will most likely have high unmet financial need by that time. Regardless of the trigger or the outreach itself, the tendency to fall into a repeatable pattern is only natural. While these patterns likely became patterns for good reasons, it is imperative to step back periodically and reconsider our approach. Specifically, does all of the data we have on our students support adjusting the timing of our interventions?
For example, a common time to address academic issues and course struggles is around midterms. For many students, a failing grade on a midterm exam or a first paper may be the first signal, to a professor or to a campus running an early-alert program, that something could be going wrong. That flag then prompts us to action: reaching out to the student and trying to coordinate interventions before it's too late to right the ship.
But did you know that first-year students can see issues much earlier than that? While professors, academic advisors, and success coaches may begin reaching out to struggling students around the time midterms roll around, students may already know there are problems. According to data from the 2014-2015 Mapworks Fall Transition survey (typically administered in the third or fourth week of the first term), 59% of first-year students report that they are already struggling in at least one course. So, while some student advocates may wait until a failed midterm to intervene, the students themselves see the problem well before their advocates do. And with only one in three first-year students saying they regularly communicate with instructors outside of class, the message may not be getting through early enough.
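To make the timing point concrete, here is a hedged sketch of how a campus might turn those early self-reports into an outreach list weeks before midterm grades exist. The file name, column names (student_id, courses_struggling, talks_to_instructors), and boolean encoding are assumptions for illustration, not Mapworks' actual data model.

```python
import pandas as pd

# Hypothetical week-3/4 survey export; column names are illustrative.
# courses_struggling: self-reported count of courses the student is
# struggling in. talks_to_instructors: True if the student reports
# regular out-of-class contact with instructors.
survey = pd.read_csv("fall_transition_week3.csv")

# Students who already see a problem, weeks before any midterm grade
early_flags = survey[survey["courses_struggling"] >= 1]

# Prioritize those unlikely to raise the issue with faculty on their own
priority = early_flags[~early_flags["talks_to_instructors"]]

print(f"{len(early_flags)} students self-report struggling in at least one course")
print(f"{len(priority)} of them rarely communicate with instructors outside class")

# Hand the list to advisors or an existing early-alert workflow
priority[["student_id"]].to_csv("week4_outreach_list.csv", index=False)
```

The design choice here is simply to trust the students' own early signal rather than waiting for a failed midterm to generate the flag.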
So what does all this mean? First, addressing the same issues every year does not mean something is broken; you cannot control the problems your student population walks in with. However, just because our students arrive with the same issues doesn't mean we can't evolve how and when we address those issues, or evaluate the effectiveness of our interventions. Think about it this way: if you're noticing a pattern on your campus, you're already doing the hard work of collecting and assessing your institution's data. Now, as the academic year starts, take that data and think about how you can use it to make targeted improvements to the efforts you implement each year to address recurring issues.
Interested in finding out how one campus used Mapworks data to prove the effectiveness of their student retention efforts? Meet Beth Stuart & Shariva White, Student Success Coordinators at Queens University of Charlotte.