Wednesday, October 24, 2007

Data mining as an emerging method of assessing student learning

Chris Dede and Jody Clarke (the latter is a doctoral student and this is the subject of her thesis).
Maybe I should have known the session was going to be a bit iffy when I typed in the URL given on the session's title PowerPoint slide:
muve.gse.harvard.edu/therivercityproject
And got nothing.
The correct URL is
http://muve.gse.harvard.edu/rivercityproject/
I was really looking forward to this, but I had seen much of it at ELI last time out. It felt like an ALT paper (never thought I would say that about the great Dede) in that it was spinning out as much as possible from a single project. Anyway, there were some nuggets.
Their main point seemed to be that yes, yes, we all know that everything is now distributed, that cognition is distributed across human minds, that people are dispersed physically, socially and so on, but the difference is that all this activity now generates data.

We then watched the Microsoft video that he showed at ELI:

The point seemed to be that an awful lot of stuff is being generated and captured, just like us with our Ning thing and this blog. If we could create tools to analyse all this data, we might find out more about how people learn.
Data mining is for uncovering patterns in data, from which you might do some predictive modelling; it is going on in the background whenever you try to arrange your car or house insurance. The problem in HE is that we use a different grain size, one that is not very detailed, e.g. which courses are going to be popular, where do most of our students come from, and so on.
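To make that grain-size point concrete, here's a toy sketch of my own (not anything shown in the session, and all the names and data are invented): the same raw event log can be aggregated at the coarse, institution-level grain HE usually works at, or mined at the fine, per-action grain Dede was after.

```python
from collections import Counter

# Hypothetical raw event log: (student, course, action) tuples at the
# finest grain -- the kind of thing a VLE quietly records anyway.
events = [
    ("s1", "bio101", "view_page"),
    ("s1", "bio101", "post_forum"),
    ("s2", "bio101", "view_page"),
    ("s2", "bio101", "view_page"),
    ("s2", "chem200", "view_page"),
    ("s3", "chem200", "post_forum"),
]

# Coarse grain: the usual HE question, "which courses are popular?"
course_popularity = Counter(course for _, course, _ in events)

# Fine grain: per-student behaviour patterns, the sort of raw material
# a data-mining tool could actually work with.
student_actions = Counter((student, action) for student, _, action in events)

print(course_popularity.most_common(1))      # the busiest course
print(student_actions[("s2", "view_page")])  # one student's page views
```

Same log, two grain sizes; the coarse one throws away exactly the per-student detail you would need to say anything about learning.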
Dede wanted to data mine to discover things about teamworking, what difference an authentic summative assessment makes to learning, and whether a particular student is developing an increasingly sophisticated inquiry process, and then to be able to factor in other variables such as gender and age.
So he was collecting everything he could in the MUVE, and I mean everything. Think Blackboard course stats with knobs on.
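If everything is time-stamped, "is this student's inquiry getting more sophisticated?" becomes a question you can at least approximate from the log. Here's a minimal sketch under invented assumptions: the event schema and action names below are mine, not River City's actual format, and counting inquiry-type actions per session is only a crude proxy.

```python
from datetime import datetime

# Hypothetical MUVE event log: every chat line and tool use time-stamped.
# Field names and actions are invented for illustration.
log = [
    {"student": "jc", "time": datetime(2007, 10, 1, 9, 0), "action": "chat"},
    {"student": "jc", "time": datetime(2007, 10, 1, 9, 5), "action": "use_water_sampler"},
    {"student": "jc", "time": datetime(2007, 10, 8, 9, 2), "action": "use_water_sampler"},
    {"student": "jc", "time": datetime(2007, 10, 8, 9, 4), "action": "consult_chart"},
    {"student": "jc", "time": datetime(2007, 10, 8, 9, 9), "action": "use_water_sampler"},
]

INQUIRY = {"use_water_sampler", "consult_chart"}

def inquiry_actions_per_session(log, student):
    """Group one student's events by day and count inquiry-type actions,
    a rough proxy for a deepening inquiry process over time."""
    per_day = {}
    for event in log:
        if event["student"] != student:
            continue
        day = event["time"].date()
        per_day.setdefault(day, 0)
        if event["action"] in INQUIRY:
            per_day[day] += 1
    return sorted(per_day.items())

trend = inquiry_actions_per_session(log, "jc")
print(trend)  # rising counts would suggest more inquiry activity per visit
```

Whether a rising count actually means more sophisticated inquiry, rather than just more button-pressing, is exactly the kind of question the sceptics in the comments raise.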
I was getting a bit lost as he took us around his very Second Life-like River City thing, when he changed tack and went to:
www.edtags.org
which apparently is not, and I repeat, not about tag clouds, but nearer to concept maps. Got that?

2 comments:

gs said...

abbi and i were musing over the merits of data mining in our office (aka: kitchen) last week. i'm quite sceptical [i know, you're shocked, aren't you?] about how much you can learn about learning with this sort of data. yes, it might show some interesting patterns of interaction and indicate things to investigate in more detail, but as with a lot of things, surely there are too many different variables to learn anything very meaningful? sounds like the sort of thing where the answer is 'it depends'...

i realise the above doesn't add much, but what i really wanted to say was how much i liked the error message you get from clicking on the edtags link:

"is changing servers and will be down for a few days. Sorry for the inconvenience, but the tradeoff is less downtime in the future ; )"

honesty is the best policy and all...

Abbi said...

This is the same impression that I got from participating in an ELI web seminar a couple of weeks ago on Academic Analytics.

The idea, to create "actionable intelligence" by using the data we have on students to inform and improve the student experience, is very noble, but I wasn't really sure they'd achieved it. It seems to me that this kind of thing is really useful for directing our attention toward areas for further investigation, but we shouldn't rely on it for any deeper analysis. It makes sense to try to use the data we have as effectively as possible, but there comes a point where that data just isn't detailed enough, or doesn't actually answer the question.

I'll post a summary of the webinar on Ning if anyone's interested and we can continue the discussion there.