Created and implemented at Purdue University in 2007, Course Signals integrates with the school's Blackboard-based learning management system (LMS) and displays a traffic light icon whenever students log in. Green indicates that the course is going well; yellow signals caution; red indicates serious risk. Instructors set the thresholds and follow up as appropriate.
According to the video, Course Signals bases its prediction on the number of points earned to date, the number of hours spent on-task (which it tracks via Blackboard), and past performance, which includes all other coursework and perhaps high school transcripts as well.
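To make that concrete, here is a minimal sketch of how those three inputs might be combined into a traffic-light indicator. The inputs (points earned, hours on task, past performance) come from the article, but the weights, normalization, and thresholds below are invented for illustration; Purdue's actual algorithm is not described here.

```python
def risk_signal(points_pct, hours_on_task, expected_hours, prior_gpa,
                weights=(0.5, 0.2, 0.3)):
    """Combine three normalized signals into a traffic-light label.

    points_pct:     percent of available course points earned so far
    hours_on_task:  LMS-tracked hours of activity
    expected_hours: instructor's expectation for hours to date
    prior_gpa:      past performance on a 4.0 scale

    Weights and thresholds are hypothetical, not Purdue's.
    """
    w_points, w_hours, w_prior = weights

    # Normalize each input to [0, 1], where 1 means "healthy".
    points_score = max(0.0, min(1.0, points_pct / 100.0))
    hours_score = max(0.0, min(1.0, hours_on_task / expected_hours))
    prior_score = max(0.0, min(1.0, prior_gpa / 4.0))

    health = (w_points * points_score
              + w_hours * hours_score
              + w_prior * prior_score)

    # Map overall health to a traffic light.
    if health >= 0.7:
        return "green"
    if health >= 0.4:
        return "yellow"
    return "red"


# A strong student on pace: green. A struggling one: red.
print(risk_signal(85, 10, 10, 3.5))  # green
print(risk_signal(30, 2, 10, 2.0))   # red
```

The instructor-set "levels" mentioned above would correspond to the 0.7 and 0.4 cutoffs here; tuning those per course is where the human judgment comes in.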
Back in Giving Analytics the College Try, we talked about the potential for learning analytics to help steer students toward majors where they're more likely to succeed, identify students whose courseloads are too heavy or light, deliver remedial instruction automatically for students falling behind, and even provide a suggestion engine to surface courses that match a student's background and performance.
Course Signals is a good real-world example of learning analytics in action, and early results point to its success. A recent feature on NPR reports that some 24,000 students have used Course Signals at various schools, including 20% of recent Purdue undergrads. "It has been shown to increase the number of students earning A's and B's and lower the number of D's and F's, and it significantly raises the chances that students will stick with college for an additional year, from 83% to 97%," according to the article.
More questions than answers
Gathering and using big data is fraught with ethical considerations, and, as with most technologies, capabilities are advancing far more quickly than legal or organizational safeguards. In fields like marketing, data is gathered and exploited by any means necessary; in academia, there seems to be a stronger push for introspection.
Research scientist Matt Pistilli, who helped create Course Signals for Purdue, wrote a lengthy paper about the ethics of gathering and using student data.
Among the questions raised in the NPR report are:
- Does the school inform students about the data gathering?
- Who owns the data: the school, the LMS provider, the student, or some combination?
- Should the data be used to advance the school's interests, or those of the student?
- How will the predictions influence student or instructor perceptions?
I agree that some of these are thorny issues, but the first one -- informed consent -- doesn't bother me much. Today's young people live online and have no expectation of privacy. They know their teachers are tracking their coursework via the LMS, so why would additional research bother them, especially if it helps them earn a degree more smoothly?
Pistilli concludes his paper with the assertion:
Institutions will find themselves in the awkward position of trying to balance faculty expectations, various federal privacy laws, and the institution's own philosophy of student development. It is therefore critical that institutions understand the dynamic nature of academic success and retention, provide an environment for open dialogue, and develop practices and policies to address these issues.
In other words, figure it out yourselves.
To that end, a group of academics, lawyers, and scientists convened last month to create a framework for appropriate use of data and technology in learning research. I think their conclusions, if you can call them that, reflect the uncertainty and newness of the field. In a nutshell, the group recommends respect, beneficence, justice, openness, and humanity in considering where and how to use learning data. Why not throw in cleanliness and civic awareness while we're listing vague, positive traits?
So members, I now pose the questions to you: Should learning be more organic or more data-driven? Would a tool like Course Signals have helped when you were in school? What about current students? Share your ideas below.
— Michael Steinhart, Executive Editor, AllAnalytics.com