Tomorrow's Teaching and Learning
The posting below looks at how “learning analytics” are improving students’ performance and graduation rates. It is by Mary Lord* and is from the May 2017 issue of Prism, the magazine of the American Society for Engineering Education. [www.asee.org] 1818 N Street, N.W., Suite 600, Washington, DC 20036-2479. © Copyright 2017. All rights reserved. Reprinted with permission.
Heads up – Academic Alert Systems Improve Student Performance
At East Carolina University, engineering undergraduates who do well in statics early in the semester often get emailed kudos from their professor. Skipping class or shirking online discussions at Purdue University can prompt a red or yellow stoplight—with recommended routes for getting back on track. And when University of Michigan freshmen show signs of wobbling, they receive a warning through the school’s ECoach system, along with a personalized “get things done” list of steps to improve their learning curve. Senior Jacob Anderson, a cell and molecular biology major heading to medical school in the fall, holds ECoach “directly responsible for my success.”
Big data has begun making an impact on campuses nationwide with academic early-alert systems intended to transform teaching, advising, and outcomes, particularly in engineering and science. Gone are convocations that exhort students to look at a future washout on their left or right. These days, with fewer than half of the nation’s undergraduates earning a degree in four years and tuition costs soaring, colleges are turning to “learning analytics,” mining digital measures of class participation, quiz results, or homework completion to intervene with struggling students before they sink. Administrators can also determine which courses are chokepoints. At West Virginia University, for example, Sociology 101 is the No. 1 predictor of whether someone will graduate. Knowing this, the school connects students with tutoring or other resources to raise their odds of passing.
Research suggests that early-alert programs can help boost student motivation, grades, and persistence. That may explain their rapid expansion from a handful of pioneers in the late 2000s to an estimated 90 percent of campuses today. Academic early warnings also are among the evidence-based strategies for boosting retention and diversity highlighted in a National Society of Black Engineers tool kit developed with ASEE. The process of making faculty expectations transparent already is pushing students to take greater charge of their academic fates. As these systems grow more sophisticated, their potential to tap such digital transactions as who hit the gym or watched an assigned video has implications for overhauling admission criteria, privacy rules, and even gradebooks.
“For 100 years and more, traditional transcripts have been the only data which all institutions of higher education systematically record,” writes Tim McKay, a professor of physics, astronomy, and education who developed Michigan’s ECoach in 2012 to assist the hundreds of undergraduates in his introductory physics course. “If we are serious about using research to improve education, we must do better.” A leader in learning analytics, he offers several examples of how they allow universities “to personalize education at scale” and advance equity. The data show, for instance, that women in STEM lecture classes are earning lower grades across the board—a clear imbalance, he noted in a recent speech. Similarly, the university’s time-honored chemistry placement test turns out to produce a puny 0.2 improvement in letter grade for those assigned to the introductory course rather than organic chemistry, indicating this well-intentioned process may further impede students who already start behind. “Over the last decade, the ground has shifted beneath our feet,” says McKay, who is also director of the university’s Digital Innovation Greenhouse.
Designed to work with learning management systems, early-alert programs flag poor attendance, sliding homework, or other predictors of distress, and provide templates that let faculty easily generate progress reports and personalize email with comments or such gentle hints as visiting during office hours—with a scheduling link. Newer versions include a kudos feature, to acknowledge when students do well.
Such personalized, real-time guidance enables students to adjust their habits as early as two weeks into the term while appealing to their penchant for playlists and apps. “They want regular feedback and reaffirmation—like Fitbit,” says John Campbell, vice provost at West Virginia University, who developed and rolled out Purdue’s early-alert program, Course Signals, in 2009. These systems also give professors and advisers an easy tool for sorting through and prioritizing students with the most warnings, and copy residential, athletic, and other counselors to ensure follow-up.
East Carolina Assistant Professor Ricky Castles, who teaches courses in electrical engineering as well as the first-year engineering core, is a fan. “We’re seeing a lot more students getting A’s and B’s and a lot fewer getting D’s and F’s” since the university introduced the Starfish academic early alert and retention tool in the fall of 2012, he says. Take statics, a notoriously difficult class full of vectors and other math concepts that stymie students. Over the past four years, the proportion earning D’s and F’s has fallen by half, from an average of 37.8 percent to 19.2 percent today. “Before Starfish, at least 15 percent of students got F’s. Since then, three semesters have been under 10 percent, and two were under 6 percent,” says Castles, who attributes that dramatic turnaround to the constant dialogue among faculty, students, and the advising team—and clear, customized remedies. West Virginia University’s engineering college has seen an equally dramatic reduction in washouts. Three years ago, about 20 percent of WVU’s first-year engineering students were on probation after the first semester, says WVU’s Campbell. Today, that figure has fallen to just 10 percent.
“One of the big keys is early feedback and personalized support. Students know someone is watching them,” says Castles, who introduces the early-alert system when going over the syllabus at the start of the course to boost buy-in. Such systems are “particularly important to keep [students] in engineering,” he adds, noting that the Starfish program has pushed him and his math and physics colleagues to align their courses. The “massive exodus” from engineering might not occur, concludes Castles, “if we can keep more of these students interested and engaged and make them feel like someone cares about them.”
Early-alert systems also represent “a huge advance” in student support services, contends academic adviser John Trifilo, East Carolina’s Starfish project manager. The alert system not only provides faculty and counselors “a nice, historic way to keep information” on notifications, attendance, and interactions, he says, but also “helps us prioritize” the neediest students—the typical professional adviser’s caseload is 350 to 400—and tailor guidance to address specific weaknesses. “I see you got a flag in Psych; did you see the professor?” Trifilo might ask. “We have a much more honest conversation because we have that information in front of you. The student sees everything.”
So do key people who are connected with that undergraduate, from athletic advisers to living-learning community coordinators, even sorority advisers. “We don’t want to look at it as a bad thing,” cautions Trifilo, noting the benefits of having residential advisers check if excess absences are due to illness or other crises. Besides, he adds, “people need to hear [advice] from more than one person before they do something about it.” Castles concurs: Early alerts offer “a way to circle the wagons and make it a community-based approach to improving student success.”
A common student reaction, Trifilo finds, is one of “I never knew my professor actually was paying attention.” Even that small connection can make large introductory courses feel less impersonal for both students and faculty. “It sort of shrunk the classroom,” he says. Michigan’s ECoach, which has helped 15,000 students in introductory STEM classes and was expanded this past fall to serve all freshmen, asks participants to write down their goals at the outset of a course. When the aspiring engineers or scientists founder, they receive an alert with a reminder of those dreams, plus motivational emails from past students who also had blown the first exam but managed to turn things around in time to pass that crucial required course.
Michigan senior Jacob Anderson, who in 2012 was among the freshmen piloting ECoach, is “very pro data-driven interventions for better student outcomes, and more specifically, better learning outcomes.” He said the early-alert system saved him in general chemistry—a foundational pre-med lecture course with 1,200 students—and helped him succeed later on in Statistics 250 and elementary programming. “It didn’t ever get invasive,” he reflects, because the system was “very transparent about how it got your data and grades.”
That said, Anderson is troubled by the Big Brother potential of ever more sophisticated learning analytics—and who gets to see the data. “Skipping chemistry class—is it available to employers?” he asks. What about when applying for scholarships or graduate schools? “If students are not concerned, they should be, about what’s happening with the data,” says Anderson, “especially in first semester, first-year classes.”
Overall, his experience with ECoach was positive in large part because the alerts were both helpful and actionable. Little things, like a reminder that an exam is coming up, can have a big impact on forgetful freshmen still figuring out how to organize their studies. The same is true of emails from previous students saying, “Hey, I flunked the first exam, but here’s what I did to get better,” or data letting students know that, say, 35 percent of those who joined study groups earned a higher grade.
“Looking back,” Anderson says, “I wouldn’t have done as well without it.”
Kudos have proven a surprising hit. “Everybody likes getting a little pat on the back,” observes East Carolina’s Castles, who believes such encouragement motivates students to press ahead in tough courses like statics. Trifilo concurs. “We never even realized how big kudos would be until we launched it,” he says. In the first year, 2011-2012, faculty were giving more warning flags than kudos. Of the school’s 28,000 students, 20,000 received kudos last year, and some instructors only use the system to send encouragements.
Too Much Coddling?
Despite their proliferation—and industrial-scale growth—early-alert programs remain a tough sell to some faculty members. Only about half of East Carolina’s faculty use Starfish, for example, and they tend to be junior faculty who teach undergraduates. “There’s a great debate among faculty” about how far to go in identifying students at risk, says WVU’s Campbell. At Purdue, where he was head of academic technologies until joining WVU as CIO in 2013, “the room split in half.” One side insisted that these are college students; they just need to grow up and do the work. The other half felt it was part of their jobs as educators to help at-risk students succeed.
As early-alert systems have gained in popularity (and generated revenue for vendors), some scholars have questioned claims of success. In 2013, Michael Caulfield, director of blended and networked learning at Washington State University Vancouver, challenged Purdue’s claims that its Signals-enabled courses increased six-year graduation rates by 21.48 percent. After the pilot stage in 2008, the retention effect “disappeared completely,” Caulfield asserted. Students were taking more Signals courses because they persisted, not persisting because they were taking more courses with early alerts. (The increase in grades was not disputed.)
Students also have expressed skepticism. “In all honesty, if you don’t know you’re failing until you get that email, you’re probably not gonna make it in engineering anyway,” was how one responded to a Prism query on Reddit.
Pointers Derived from Practice
Several key strategies for boosting faculty buy-in and success rates have emerged from research as well as classroom experience:
Start small—and know your needs.
Purdue’s Signals began on an Excel spreadsheet. “If you know you have problems in mechanics,” says Campbell, “look at some of the data to try to identify what trigger points are for the class.” East Carolina’s pilot pulled 20 faculty and advisers together to come up with the notifications, starting with one or two, then adding acknowledgments for outstanding or improved work.
Make it user-friendly.
Western Michigan University shifted gears after finding that its confusing, cumbersome early-warning system was being used by only 10 percent of faculty and—worse—had “no demonstrable impact on student success or student retention.” Systems like Starfish and Course Signals streamline the process by providing data dashboards and letting instructors generate alerts and kudos by clicking on a box.
Keep it short—and personal.
An email from the professor is more likely to provoke a response than a generic note from the counseling office. But don’t ramble. “Be very direct—less than 100 words,” advises WVU’s Campbell. Another takeaway: “Use the ‘I’ form,” says Campbell, as in “I know you can do better, I expect you to” attend office hours.
Give precise guidance.
Dismal test results? Don’t send a scolding email saying you expect to see everyone at office hours that week, as one Purdue physics professor did—only to find a line snaking down the hall awaiting him. Keep in mind that students have busy schedules. Along with posting office hours, let students know they can email you to set a time if they can’t make it.
Follow up early and often.
While engineering typically admits some of the university’s strongest freshmen, many may be used to pulling off high school assignments at the last minute or lack preparation in core science and math. Those first exams can hit like a hammer. Early-alert systems can examine dozens of data points—most track about 20—and spot signs of trouble two weeks into the class. Did a student log in to a class discussion? Are the practice quizzes spotty? “If a student is half a standard deviation behind peers in a course, there’s a high probability they are at risk,” says Campbell.
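Campbell’s half-standard-deviation rule of thumb is simple enough to sketch in code. The Python below, with hypothetical student names and scores (a real system would draw these from the learning management system), flags anyone whose current course score falls more than half a standard deviation below the class mean.

```python
from statistics import mean, stdev

def flag_at_risk(scores, threshold=0.5):
    """Flag students scoring more than `threshold` standard
    deviations below the class mean on current coursework."""
    mu = mean(scores.values())
    sigma = stdev(scores.values())
    if sigma == 0:
        return []  # everyone identical: no one stands out
    return [name for name, s in scores.items()
            if (s - mu) / sigma < -threshold]

# Hypothetical two-weeks-in scores for a small class.
scores = {"Ana": 88, "Ben": 91, "Cara": 62, "Dev": 85, "Eli": 79}
print(flag_at_risk(scores))  # -> ['Cara']
```

In practice the flag would trigger a templated note from the instructor, not just a line in a report, and the threshold would be tuned against how the course’s past students actually fared.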
Close the loop.
Iowa State’s early-alert researchers found that “a common denominator in struggling students, especially the first semester, is an unwillingness to meet with instructors or advisors to ask for assistance.” Therefore, advisors were copied on all alerts so they could offer assistance. Developing a positive early relationship with an instructor or counselor helps reduce a student’s anxiety and isolation, the researchers surmised. “And if done right, it also teaches the student that it is safe to ask for help.”
Ultimately, however, early-alert systems are effective only if students act on the feedback. Engineering educators may wish to look to their left and right for inspiration, because everyone has a role in promoting success.
*Mary Lord is deputy editor of Prism.