Moneyball

I saw this tweet by Brian Lamb yesterday.

And I followed the link to this Guardian article on learning analytics by Rebecca Ferguson. It covers the idea of how big data can help us track and intervene in students’ progress before they’re beyond the pale. All this is managed through various data-driven systems that students work in, which, by the way, quickly pushes us right back into a centralized-system mindset in higher ed. We’ll obviously need a vendor-sponsored product for that! The work is modeled on the tried-and-true data-mining practices used extensively in commercial spaces like Gmail, Amazon, Facebook, etc. The basic premise being, and I’ll paraphrase here, that “learning analytics is the Moneyball of education”: performance predicted by data and value premised on numbers. I’ve read a nightmare scenario along these lines recently.

What struck me in the article was that one of the examples used to showcase this technology links to a 2010 EDUCAUSE Review article about Purdue’s Course Signals. If Course Signals is the forerunner in the States of the promise of this technology, we’ve got a problem. Why? Well, because a lot has happened since 2010. As Mike Caulfield pointed out six months ago, and Michael Feldstein reiterated, the research claims about the effectiveness of Course Signals in increasing retention are deeply problematic. What’s more, there has been no response from Purdue about any of this. As Feldstein notes, their “credibility is on the line.” I’m not saying learning analytics might not hold some water, but if Course Signals is the city upon a hill in terms of examples, and the research is shoddy, and a prestigious academic institution is close-lipped about research that is ultimately a product driving an approach to edtech, we should be very concerned.

What’s more, shouldn’t The Guardian article have some sense of the other side of all this when it comes to learning analytics? Data is always framed by the people who are collecting it and, in Purdue’s case with Course Signals, packaging it as a product. I couldn’t have been prouder of Caulfield (as a representative of us lowly edtech types) for actually doing the math, picking up on this issue, and, in the academic tradition, making his concerns a part of the public discourse. At the same time, I’m still flabbergasted that the learning analytics discussion has gone on seemingly unconcerned with the fact that the shining example being waved like a banner for the cause is making claims for retention that can’t be supported.


8 Responses to Moneyball

  1. dkernohan says:

    Nicely picked up, Jim. I think I appear in one of the other articles in this series and, yes, I stifled a giggle at “Extreme Learning” too. (I imagined fast video cuts and pounding, bass-heavy nu metal…)

    But you are right that the series did not pick up on alternative voices in the debate, and presented a very one-sided “this is the future” position which did not convince.

    The series was sponsored by the UK Open University, which owns FutureLearn (but, in its defence, also employs Martin Weller and Tony Hirst), but I am far more inclined to blame (once again) poor journalism and inadequate research.

    • Reverend says:

      I actually agree with Brian that there is a ton of good stuff in this series, but the Course Signals bit chafes me because one of the shining examples in the States has been called out on intellectual grounds by that no-good EDUPUNK Caulfield, and the crickets responded. Dead silence. Everything is game changing until it’s not, and when the cupboards are bare and the money has been spent, we all lose. I was hoping Caulfield’s work would start a dialogue around this point, but it seems the train just keeps on moving.

      What’s more, I’m not trying to pick on this piece too much; I’ve been in this position all too often on the bava. I’m all about shoddy reportage 🙂 It’s what this particular example represents more generally: everyone is on the conveyor belt of what’s next, and learning analytics is certainly a fascinating field, but the broader conversations are often muted when it comes to some basic questions around theory, assumptions, and a cultural history of the data, not to mention questionable assumptions based on the mesmerizing magic of numbers. The questions and conversations around data at the MRI13 conference in December were some of the most interesting for just this reason; it just seems there is little nuance to any of this when it comes to packaging up the next revolution.

  2. Mike Caulfield says:

    As you know, I think Learning Analytics could have some good impact on what we do (in the near term on the Small Data front, but eventually on the Big Data front too). I don’t think long-term enthusiasm about this is necessarily misplaced. And a lot of the people I know in Learning Analytics are deeply aware of the Purdue issue (both the mathematical and the institutional one) and I think are working to create a more self-critical culture.

    The point of all this should not be “Learning Analytics will save us/destroy us,” but that learning analytics has the potential to do both. What path we take in utilizing student data is far more important than our level of LA-boosterism or doomsaying. Still, it’s completely depressing to see the Purdue example out there in an article written by someone who is an expert in social analytics at the OU. If that is the example *experts* are still reaching for, something is very wrong. Success stories are being broadcast to the ends of the earth, but serious critiques are being ignored. That’s a great PR recipe, but a lousy recipe for impact.

    • Reverend says:

      Mike,
      It just blows my mind your very balanced and even-handed discussions of some of the basic issues with analytics as framed by Course Signals has been ignored by Purdue. That’s why I wrote it. Course Signals is the one I’ve heard of, and I think I heard something about another by Canvas, but I’m not sure. What are some other good examples? I ask because I don’t want to write this off entirely, after our discussion in Dallas back in December I am increasignly interested in this stuff, but at the same time deeply skeptical when it comes to assessment and tracking students success. It reminds me a bit of John Fritz’s “Check my activity” work with BlackBoard where the number of times you logged in was an indicator of your success:
      http://www.educause.edu/ero/article/video-demo-umbcs-check-my-activity-tool-students
      Really? Really?! Course Signals reminds me of this school of thought, I want a new school of thought for this data, one that uses the data for good, dammit! 😉

      • Mike Caulfield says:

        Well, I don’t really know what the great Big Data breakthroughs will be. I do think they are further out, and I also think the best uses will be guided by Small Data findings. So Small Data is where I focus my thoughts right now.

        I’ll give you a really simple example. My students in stats would always say they READ the material before class; they just didn’t UNDERSTAND it.

        Now the dystopic way to deal with this is to go Brian’s scanner route: “gamify” it and make it so the students have to hit 20 pages in a pattern we know simulates reading. We will make them read!

        I’m after something a lot simpler. I’d like to know, in aggregate, if what the students are telling me is true. Because if they *are* reading and really not getting it, I’m going to spend class time one way, but if they are not, I’ll spend it another.

        Over time, I’d like to know another thing: does the level of reading the students do correspond at all to performance? Again, because it changes what I might do. If the reading helps, then more reading. If it doesn’t, then less.
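
        For what it’s worth, the first pass at both of those questions doesn’t need a vendor product. Here is a minimal sketch of what such a Small Data check might look like, assuming a hypothetical LMS export of per-student reading time and exam scores (the file name and column names here are invented for illustration):

        ```python
        # Hypothetical Small Data check: are students reading in aggregate,
        # and does reading time track with performance?
        # Assumes a CSV export like: student_id,minutes_reading,exam_score
        # (file and column names are invented for illustration).
        import csv
        import math

        def pearson(xs, ys):
            """Plain Pearson correlation, no external libraries needed."""
            n = len(xs)
            mx, my = sum(xs) / n, sum(ys) / n
            cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
            sy = math.sqrt(sum((y - my) ** 2 for y in ys))
            if sx == 0 or sy == 0:
                return float("nan")  # no variation, correlation undefined
            return cov / (sx * sy)

        minutes, scores = [], []
        with open("lms_reading_export.csv", newline="") as f:
            for row in csv.DictReader(f):
                minutes.append(float(row["minutes_reading"]))
                scores.append(float(row["exam_score"]))

        # First question: are they reading at all, in aggregate?
        median_minutes = sorted(minutes)[len(minutes) // 2]
        print(f"median minutes of reading: {median_minutes:.1f}")

        # Second question: does reading correspond to performance?
        print(f"reading/score correlation: {pearson(minutes, scores):+.2f}")
        ```

        A median and a single correlation are crude, but they answer the classroom-level question in aggregate, without flagging or tracking any individual student.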

        Over time, Big Data might take those insights and recognize the behavioral signatures of students who are reading effectively and those who aren’t. It might help to identify students who are actually fine with the material but struggling with undiagnosed or unsupported reading disabilities.

        I think my worry is, however, that we jump straight to the Big Data point and then try to commodify its insights as a product immediately. I’d rather build a culture of data hackers (profs who have access and permission to play with student data) and then see the things they discover bubble up into broader products.

  3. Pat says:

    I’d wonder what’s in it for the OU.
    Poor education journalism. Who knew that was possible.

  4. Tom says:

    It seems like people latch onto Moneyball* but miss what I believe is the point (if you buy into it at all): in a game where fans and players are totally obsessed with stats of all kinds, baseball apparently spent decades measuring the wrong things, in what is a dead simple game with really obvious stuff to measure, in a sport with just a handful of players and a really, really simple goal. All their games and practices are recorded, and they have endless amounts of people and money to throw at it.

    Sounds just like education . . .

    *Haven’t seen the movie, read some articles, didn’t watch the linked videos because I like to be calm at night.
