The current use of analytics leaves unanswered questions about how online learner behavior is attributed to outcomes, because data collection is systems-based – polling the LMS, not the user. For example, “clickometrics” cannot tell us how students cognitively engage with instructional activities to make sense of subject matter – only that there is a correlation between the duration/frequency of engagement and grades. It cannot tell us whether a student’s failure is related to navigational disorientation, or whether his or her conceptualization of the subject matter is misaligned with the instructional design or content.

This presentation calls for a user-based system of learner analytics data collection. Dervin’s Sense-Making Methodology proposes that user needs and information use can be reliably predicted using a problem-centered (or objective-centered) approach. To benefit from this approach, learners’ experiences must be elicited: how instructional content and activities empowered them to make sense of learning objectives and how to achieve them – a focus on “how,” rather than “what,” students used to learn.

These data may help course evaluation efforts draw direct (rather than inferential) conclusions about how the use of information and instruction correlates with the achievement of learning outcomes.

We propose a user-based model for data collection within the LMS learning environment based on Dervin’s and others’ research in the area of User-based Design.
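To make the contrast with “clickometrics” concrete, here is a minimal sketch of the kind of record a user-based feedback channel might collect. Everything in it is a hypothetical illustration rather than part of the proposal itself: the field names, the category list, and the embedded-prompt scenario are assumptions, loosely adapted from Dervin’s situation–gap–help framing.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical categories of "helps," loosely adapted from Dervin's
# Sense-Making Methodology; an actual taxonomy would come from the research.
HELP_CATEGORIES = [
    "got oriented / found direction",
    "acquired skills or ideas",
    "got motivated or reassured",
    "got connected to people or resources",
    "did not get what I needed",
]

@dataclass
class SenseMakingResponse:
    """One learner's self-report, captured at a point in the course timeline."""
    learner_id: str
    activity_id: str      # the LMS activity the prompt was attached to
    objective: str        # the learning objective the learner was working toward
    gap: str              # what the learner was trying to figure out ("how")
    helps: list[str] = field(default_factory=list)  # selections from HELP_CATEGORIES
    comment: str = ""     # open-ended account of how the material helped or hindered
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

# Example of the kind of record the feedback channel would gather,
# in contrast to a system log entry of clicks and durations.
response = SenseMakingResponse(
    learner_id="anon-042",
    activity_id="module-3-reading",
    objective="Distinguish systems-based from user-based analytics",
    gap="I couldn't see how the reading connected to this week's assignment",
    helps=["got oriented / found direction"],
    comment="The worked example finally showed me what the assignment was asking for.",
)
print(response)
```

However the prompt is delivered (a form, a survey tool, or a widget embedded in the LMS), the point is that the record captures the learner’s own account of sense-making rather than an inference drawn from click data.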


2 Comments

Martyn Cooper · March 22, 2012 at 6:45 pm

Hi,

I would like to know more about your plans. Your ideas fit well with some of my ideas around using Learning Analytics to identify intervention points for accessibility improvement.

I have outlined some of my thoughts in this blog post: http://martyncooper.wordpress.com/2012/03/21/learner-analytics-for-accessibility/

I hope we have some onward discussion.

Cheers,

Martyn Cooper

    Steve Covello · March 28, 2012 at 9:14 am

    Thank you for your interest. I suppose there are a number of ways this proposal could be used strategically, so I will only venture as far as stating what it is intended to offer compared to systems-based methods.

    The inspiration for this proposal was born of two essential experiences. The first was being asked to assist in developing some online support content for our online learners. I asked the following questions: Why are students coming to this Web page? How did they get here? Were they sent here or did they discover it? Do they know what to look for? There were no answers.

    This led to a discussion about whether students were able to self-identify what kind of support they needed. Do they need help with orientation, technical issues, learning, or coping? As it was, students seeking support could not “see themselves” in the support information, in either its content or organization. The question became how to elicit user needs and uses of information and instruction, on a broad basis, to determine what kind of support we should develop and how to organize it.

    The second experience had to do with developing an online course and recognizing that I was making a number of assumptions about my students’ information needs and uses at certain points along the course timeline. How would I know if this information or instruction is what they needed at this point, given multiple entry points in each student’s prior knowledge? Do students need certain kinds of “helps” at certain critical points in the instructional design?

    I felt that the culture of learning analytics was seeking to predict when these stopping points would occur, but that they were asking the system to explain it, not the students. When Dervin published a graphic describing all of the different kinds of “helps” available for persons in an information needs/use situation, it inspired the idea of embedding a feedback channel within the LMS environment.

    This is a user-based model, in contrast to a systems-based method of predicting user needs and information use.

    Whether this could be applied to improving accessibility is certainly plausible, though the feedback channel itself would need to be accessible.

