The current use of analytics leaves unanswered questions about how online learner behavior is attributed to outcomes because data collection is system-based: it polls the LMS, not the user. For example, “clickometrics” cannot tell us how students cognitively engage with instructional activities to make sense of subject matter – only that there is a correlation between the duration and frequency of engagement and grades. It cannot tell us whether a student’s failure stems from navigational disorientation or from a conceptualization of the subject matter that is misaligned with the instructional design or content.
This presentation calls for a user-based system of learner analytics data collection. Dervin’s Sense-Making Methodology proposes that user needs and information use can be reliably predicted using a problem-centered (or objective-centered) approach. To benefit from this approach, learners must articulate how instructional content and activities empowered them to make sense of learning objectives and of how to achieve them – a focus on “how,” rather than “what,” students used to learn.
These data may support course evaluation efforts in drawing direct (rather than inferential) conclusions about how the use of information and instruction relates to the achievement of learning outcomes.
We propose a user-based model for data collection within the LMS learning environment, grounded in Dervin’s and related research in the area of user-based design.